What determines packet latency?


EngIntoHW

When a radio wave is sent from one point to another, it travels at the speed of light, and the time it takes to get from one point to the other doesn't depend on the carrier frequency.

So, when it's said that new technologies offer lower latency, they only mean that the delay added by devices along the way is reduced, right?
 
EngIntoHW,

So, when it's said that new technologies offer lower latency, they only mean that the delay added by devices along the way is reduced, right?

No, propagation at the speed of light is not the only thing that delays a transmission. The link below is from Wikipedia. Although there are those on this forum and elsewhere who disparage Wikipedia, it does give a concise starting point for a lot of subjects.

Ratch

**broken link removed**
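To put numbers on just the propagation part, here's a minimal Python sketch; the 1000 km path and the velocity factors are assumed for illustration (actual values vary by fiber and cable type). This is the speed-of-light floor for the path, before any device delays are added.

```python
# Pure propagation delay over an assumed 1000 km path, for a few media.
# Velocity factors are typical illustrative values, not exact figures.
C = 3e8                      # speed of light in vacuum, m/s
distance = 1_000_000         # assumed 1000 km path, in metres

media = {
    "radio (free space)": 1.00,  # propagates at ~c
    "optical fiber":      0.67,  # typical velocity factor
    "coaxial cable":      0.66,
}
for name, vf in media.items():
    delay_ms = distance / (C * vf) * 1e3
    print(f"{name:20s}: {delay_ms:.2f} ms one way")
```

Everything above that floor (serialization, queuing, and processing in the devices along the way) is the part that newer technologies can actually shrink.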
 
Packet latency is largely determined by the comm protocol. Some protocols guarantee error-free, in-order delivery at the expense of indeterminate latency (TCP, for example), while others offer low latency at the expense of possible packet loss (UDP, for example). In some applications, like streaming video, bounded latency is important but there is high tolerance for errors, as a few lost packets aren't normally noticeable to the end user. For web pages it's the reverse: error-free data is important, while latency isn't. It all depends on the type of service and the protocol chosen to match.
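To make that tradeoff concrete, here's a back-of-envelope Python sketch; the loss rate, retransmission timeout, and delay figures are all assumed purely for illustration.

```python
# The latency cost of reliability: a reliable protocol like TCP stalls by
# roughly one retransmission timeout (RTO) for every lost packet, while
# UDP just drops the loss and moves on. All numbers below are assumptions.
loss  = 0.01    # assumed 1% packet loss rate
rto   = 0.200   # assumed 200 ms retransmission timeout
delay = 0.030   # assumed 30 ms nominal delivery delay

tcp_mean = delay + loss * rto  # each loss stalls delivery by ~one RTO
udp_mean = delay               # survivors arrive on time, losses vanish

print(f"TCP mean delivery time: {tcp_mean * 1e3:.1f} ms (tail is unbounded)")
print(f"UDP mean delivery time: {udp_mean * 1e3:.1f} ms ({loss:.0%} never arrive)")
```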
 
A router cannot route a packet until it has received at least the header, and usually it doesn't retransmit the packet before it has received it completely and checked the CRC. Each hop therefore adds some latency. Of course, faster link throughput makes the packet shorter in time, so it can be stored and forwarded sooner.
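Here's a small Python sketch of that per-hop, store-and-forward cost, with link rates and lengths assumed for illustration: each hop adds one serialization delay (packet bits divided by link rate) plus the propagation delay of its link.

```python
# Store-and-forward latency across a few hops. Link speeds and lengths
# below are assumed example values, not measurements of any real path.
C_FIBER = 2e8                # ~ speed of light in fiber, m/s
packet_bits = 1500 * 8       # one full-size Ethernet frame

hops = [                     # (link rate in bit/s, link length in m)
    (100e6, 2_000),          # 100 Mb/s access link, 2 km
    (10e9, 500_000),         # 10 Gb/s backbone link, 500 km
    (1e9, 20_000),           # 1 Gb/s metro link, 20 km
]

total = 0.0
for rate, length in hops:
    serialization = packet_bits / rate   # time to clock the packet in/out
    propagation = length / C_FIBER       # time spent on the wire itself
    total += serialization + propagation
    print(f"{rate / 1e9:5.2f} Gb/s, {length / 1e3:6.1f} km: "
          f"serialize {serialization * 1e6:7.2f} us, "
          f"propagate {propagation * 1e6:7.2f} us")

print(f"total one-way latency: {total * 1e6:.1f} us")
```

Note how, with these assumed numbers, the slow access link dominates the serialization delay while the long backbone link dominates the propagation delay.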
 