Introduction to Computer Network Speed

Understanding the factors that determine performance of a computer network

Together with basic functionality and reliability, the performance of a computer network determines its overall usefulness. Network speed involves a combination of interrelated factors.

What Is Network Speed?

You want your networks to run fast in all situations. In some cases, a network delay may last only a few milliseconds and have a negligible impact on what you're doing. In other cases, network delays can cause severe slowdowns. Typical scenarios that are especially sensitive to network speed include:

  • time to establish a new connection
  • time to load a web page
  • time to download an app, operating system patch, or other files
  • ability to stream video content for long periods without glitches

The Role of Bandwidth in Network Performance

Bandwidth is a key factor in determining the speed of a computer network. Providers prominently feature the bandwidth ratings of their internet service in product advertisements, so you probably know how much you have and what your network router can handle.

Bandwidth in computer networking refers to the data rate supported by a network connection or interface. It represents the connection's overall capacity: the greater the capacity, the better the performance you're likely to see.

Bandwidth refers to both theoretical ratings and actual throughput, and it's important to distinguish between the two. For example, a standard 802.11g Wi-Fi connection offers 54 Mbps of rated bandwidth, but in practice, it achieves only 50% or less of this number.

Traditional Ethernet networks theoretically support 100 Mbps or 1000 Mbps of maximum bandwidth, but these maximums can't reasonably be achieved in practice. Cellular (mobile) networks generally don't claim a specific bandwidth rating, but the same principle applies. Communication overhead in the computer hardware, network protocols, and operating systems accounts for the difference between theoretical bandwidth and actual throughput.
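The gap between rated and real-world bandwidth can be made concrete with a little arithmetic. This is a minimal sketch, not a real benchmark: the 50% efficiency figure comes from the 802.11g example above, and the 100 MB file size is a hypothetical illustration.

```python
# Sketch: comparing rated bandwidth to realistic throughput.
# The 50% efficiency factor and 100 MB file size are illustrative
# assumptions, not measured values.

def transfer_time_seconds(file_megabytes: float, throughput_mbps: float) -> float:
    """Time to move a file at a given throughput (megabits per second)."""
    megabits = file_megabytes * 8  # 1 byte = 8 bits
    return megabits / throughput_mbps

RATED_MBPS = 54    # 802.11g rated bandwidth
EFFICIENCY = 0.5   # rough real-world fraction of the rated number
effective_mbps = RATED_MBPS * EFFICIENCY  # 27 Mbps

print(f"Rated:     {transfer_time_seconds(100, RATED_MBPS):.1f} s for 100 MB")
print(f"Realistic: {transfer_time_seconds(100, effective_mbps):.1f} s for 100 MB")
```

At the rated 54 Mbps, a 100 MB file would take about 15 seconds; at a more realistic 27 Mbps, about 30 seconds, twice as long.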

Measuring Network Bandwidth

Bandwidth is the amount of data that passes through a network connection over time as measured in bits per second (bps). Numerous tools exist for administrators to measure the bandwidth of network connections. On LANs (local area networks), these tools include Netperf and Test TCP. On the Internet, numerous bandwidth and speed test programs exist, and most are free for you to use.
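Under the hood, these tools all do the same basic conversion: time a transfer, then turn bytes per second into bits per second. A minimal sketch of that calculation, using hypothetical sample numbers rather than a live download:

```python
# Sketch: how a speed test derives a bandwidth figure from a timed
# transfer. The byte count and elapsed time below are hypothetical
# sample values, not real measurements.

def throughput_mbps(num_bytes: int, elapsed_seconds: float) -> float:
    """Convert a timed transfer into megabits per second."""
    bits = num_bytes * 8
    return bits / elapsed_seconds / 1_000_000

# A real tool would wrap an actual transfer, e.g.:
#   start = time.perf_counter()
#   data = download_some_file()          # hypothetical helper
#   elapsed = time.perf_counter() - start
# Here we plug in sample numbers: 12.5 MB moved in 1 second.
print(throughput_mbps(12_500_000, 1.0))  # 100.0 Mbps
```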

Even with these tools at your disposal, bandwidth utilization is difficult to measure precisely as it varies over time depending on the configuration of hardware plus characteristics of software applications, including how they are being used.

About Broadband Speeds

The term "high bandwidth" usually distinguishes faster broadband Internet connections from traditional dial-up or cellular network speeds. Definitions of "high" versus "low" bandwidth vary, and they've changed over the years as network technology improved.

In 2015, the U.S. Federal Communications Commission (FCC) updated its definition of broadband to connections rated at least 25 Mbps for downloads and at least 3 Mbps for uploads. These numbers reflected a sharp increase from the FCC's previous minimums of 4 Mbps down and 1 Mbps up.
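Checking a connection against that definition is a simple pair of comparisons. A minimal sketch using the 2015 thresholds quoted above (the sample speeds are hypothetical):

```python
# Sketch: testing a connection against the FCC's 2015 broadband
# definition (at least 25 Mbps down and 3 Mbps up).

FCC_2015_DOWN_MBPS = 25
FCC_2015_UP_MBPS = 3

def meets_2015_broadband(down_mbps: float, up_mbps: float) -> bool:
    """Both the download AND upload minimums must be met."""
    return down_mbps >= FCC_2015_DOWN_MBPS and up_mbps >= FCC_2015_UP_MBPS

print(meets_2015_broadband(50, 5))   # True
print(meets_2015_broadband(20, 5))   # False: download side too slow
```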

Bandwidth is not the only factor that contributes to the perceived speed of a network. A lesser-known element of network performance, latency, also plays an important role.

Latency in Broadband Speeds

Latency, which shows up in some speed tests as "ping," is the time it takes data to travel from your computer to a server and back. It's measured in milliseconds. A good ping is under 10 ms; one over 100 ms can cause issues, especially when you're streaming a movie or playing a game online. High latency causes buffering, stuttering, and slowdown (or "lag") that hurts performance.
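You can get a rough latency reading yourself without a dedicated ping tool. This is a hedged sketch, not a full speed test: it approximates round-trip time by timing a TCP connection handshake, the host name is an assumption, and the thresholds mirror the ones in the text (under 10 ms good, over 100 ms problematic).

```python
# Sketch: approximating round-trip latency by timing a TCP connect.
# The host/port in the example call are assumptions for illustration.
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate round-trip time as the TCP connect duration, in ms."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

def describe_latency(ms: float) -> str:
    """Classify a ping using the thresholds discussed in the text."""
    if ms < 10:
        return "good"
    if ms > 100:
        return "may cause issues"
    return "acceptable"

# Example (requires network access):
# print(describe_latency(tcp_rtt_ms("example.com")))
```

Note that a TCP handshake includes a little more than pure round-trip time, so this tends to read slightly higher than an ICMP ping to the same host.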
