What is the relationship between bandwidth and latency?

So what is the relationship between bandwidth and latency? If your connection has adequate bandwidth, why does latency slow it down? Or does it? How exactly does latency affect your experience of the Internet? These are some of the most common questions asked; what follows are some answers, in both technical and straightforward terms.

Latency is the time it takes for your data (packets) to get from point A (your home/modem) to point B (the destination). Latency builds up at each of the “stops” your data makes on the way to point B. These stops, called hops, are the routers and, in some cases, servers across the Internet that handle and forward traffic. The more hops in your data’s path, the higher your latency. The farther you are from point B, the higher the latency you typically experience, simply because there is more distance to cover and there are more hops along the way. Each hop can also be busy, so to speak: the busier it is, the longer it takes to respond to your traffic, and hence the higher the latency.

Most file transfers over the Internet use TCP/IP. The receiver constantly sends messages (ACKs) back to the sender, letting it know that everything has arrived or, if not, which packets need to be retransmitted. If the channel has high latency, this reverse communication takes a long time, causing the sender to stop transmitting until the ACKs are received.
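This stall-until-ACKed behavior puts a hard ceiling on throughput: the sender can have at most one window of unacknowledged data in flight per round trip. A minimal sketch of that arithmetic, using assumed example numbers (a 64 KB window, which is a common default, and round-trip times of 100 ms and 10 ms):

```python
# Sketch of how ACK round trips cap TCP throughput. No matter how fat the
# pipe is, the sender stalls once a full window of data is unacknowledged,
# so throughput can never exceed window_size / round_trip_time.

def max_throughput_bytes_per_sec(window_bytes, rtt_seconds):
    """Upper bound on TCP throughput for a given window and round-trip time."""
    return window_bytes / rtt_seconds

# A 64 KB window over a 100 ms round trip:
print(max_throughput_bytes_per_sec(64 * 1024, 0.100))  # 655360.0 B/s (~5 Mbit/s)

# The same window over a 10 ms round trip moves ten times as fast:
print(max_throughput_bytes_per_sec(64 * 1024, 0.010))  # 6553600.0 B/s
```

Note that bandwidth does not appear in the formula at all: once latency is the bottleneck, buying a faster connection changes nothing.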

TCP also has a slow start mechanism. The sender has no idea of the end-to-end channel capacity, so slow start is designed to avoid overwhelming slower intermediate links: the sending rate begins small and ramps up over successive round trips.
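A toy sketch of why slow start hurts more on high-latency links (this is a simplification: in the slow-start phase the window roughly doubles each round trip, and real TCP stacks add congestion avoidance and other refinements on top):

```python
# Toy model of TCP slow start: the congestion window starts at one segment
# and doubles with each full round trip of ACKs until it reaches the target.

def slow_start_rounds(target_segments, initial_window=1):
    """Count the round trips needed for the window to grow to the target."""
    window, rounds = initial_window, 0
    while window < target_segments:
        window *= 2          # each round of ACKs doubles the window
        rounds += 1
    return rounds

print(slow_start_rounds(64))  # 6 round trips to grow from 1 to 64 segments
```

Those 6 round trips cost 60 ms on a 10 ms link but 600 ms on a 100 ms link, which is why short transfers feel so much slower on high-latency connections even when the bandwidth is identical.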

Basically, your advertised bandwidth is the speed between you and your ISP; over anything beyond that, your ISP has no control.

In practice, latency may or may not be an issue. Because latency is the delay in getting information from point A to point B, it is a much bigger problem for interactive applications than for large transfers.

With large transfers, if your bandwidth is sufficient, reliable, and configured correctly, you won’t notice much trouble even on high-latency connections. Once the “pipeline is primed”, the data flows at full speed. As long as the ACK packets return regularly enough that retransmissions do not occur, the flow stays constant, and the only real delay is during the initial start of the transfer.

However, with interactive apps, that initial delay is what can really kill you. As an exaggerated example, say you have a latency of 1 second and sending a packet takes 1 second. If you send a 10-packet file, the total connection time is 11 seconds. If instead you send a single packet, wait for a single-packet response, and do this twice, your total connection time is 8 seconds, even though you sent only 40% of the traffic.
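The back-of-the-envelope numbers above can be reproduced directly:

```python
# Worked example: 1 s one-way latency, 1 s to transmit each packet.

LATENCY = 1.0      # one-way delay, in seconds
SEND_TIME = 1.0    # transmission time per packet, in seconds

# Bulk transfer: 10 packets streamed back to back, plus one final
# latency for the last packet to arrive.
bulk = 10 * SEND_TIME + LATENCY
print(bulk)  # 11.0 seconds for 10 packets

# Interactive: send 1 packet, wait for the 1-packet reply, done twice.
# Each exchange costs send + latency + reply send + latency = 4 s.
interactive = 2 * (SEND_TIME + LATENCY + SEND_TIME + LATENCY)
print(interactive)  # 8.0 seconds for only 4 packets (40% of the traffic)
```

The interactive pattern pays the latency cost on every exchange, while the bulk transfer pays it essentially once.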

Web traffic falls between the two. It is usually not a large transfer, but it is not as interactive as an online game either. Typical page traffic is a short burst of requests (where latency bites) followed by a longer period of inactivity while you view the page. There are a few tricks that help reduce this problem: proxy servers and prefetching utilities can “preload” pages for you. While you are looking at a page and your connection is idle, the prefetcher downloads the pages the current one links to. When you click one, with luck the page is already cached and displays much faster; otherwise, it is no worse than having to wait for it to load normally. This works well for mostly static pages, but for dynamic pages (e.g. Google Maps) a prefetcher does not work well, or at all. Also, checking that your browser is using the right number of parallel connections can improve things.
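The prefetching idea is simple enough to sketch in a few lines. Everything here is illustrative: `fetch_page()` stands in for a real, slow network fetch, and none of these names come from an actual browser or proxy API.

```python
# Minimal sketch of prefetching: while the user reads the current page,
# fetch the pages it links to into a cache so the latency is paid during
# idle time instead of at click time. fetch_page() is a hypothetical
# stand-in for a real (high-latency) network request.

cache = {}

def fetch_page(url):
    """Stand-in for a slow network fetch of a page."""
    return f"<html>contents of {url}</html>"

def prefetch(linked_urls):
    """Warm the cache during idle time, before the user clicks anything."""
    for url in linked_urls:
        if url not in cache:
            cache[url] = fetch_page(url)

def load(url):
    """Serve from the cache when possible; fall back to a live fetch."""
    if url in cache:
        return cache[url]     # instant: the latency was already paid
    return fetch_page(url)    # no worse than an ordinary page load

prefetch(["/about", "/contact"])
print(load("/about"))  # served from the cache, no round trip needed
```

This also shows why prefetching breaks down for dynamic pages: a cached copy of something like a map tile view may already be stale by the time the user asks for it.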

The bottom line is that there is a relationship between bandwidth and latency. But it may or may not be a problem.
