Difference between speed and latency


I think latency refers to execution "speed" when bounded by some time constant (this function cannot take more than X milliseconds to finish execution), but I don't really understand the difference between the two. Doesn't a faster function have lower latency? Doesn't lowering the latency increase its speed? Don't those concepts imply each other?

I have tried reading definitions of both concepts but haven't really grasped the distinction yet, so, in order to understand the difference better, could you provide a real-world problem where (and why):

  • Trying to increase the speed of a solution increases its latency?
  • Trying to reduce the latency of a solution decreases its speed?

Also, I have the feeling that both concepts are used with slightly different meanings in the world of networking and in that of traditional "execution speed" (in high-frequency trading, for example). Is that right?

CodePudding user response:

I understand "latency" to mean "how long before a system starts delivering", whereas I understand "speed" to mean throughput: how many items are delivered per second. Sometimes you can't improve latency: it takes an elephant 18 months to produce a baby elephant. Adding more mother elephants will let you produce more baby elephants every 18 months, but the first one will still take 18 months.
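To put the same idea in code, here is a minimal sketch (my own illustration, not from the original answer, with made-up names): each item takes a fixed amount of time to produce, so adding workers multiplies throughput while the latency of any single item stays the same.

```python
# Minimal sketch: parallelism raises throughput but not per-item latency.
import time
from concurrent.futures import ThreadPoolExecutor

TASK_SECONDS = 0.5  # assumed fixed cost of producing one item (the "18 months")

def produce_item(i):
    time.sleep(TASK_SECONDS)
    return i

def run(n_items, n_workers):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        list(pool.map(produce_item, range(n_items)))
    elapsed = time.perf_counter() - start
    print(f"{n_workers} worker(s): latency per item ~{TASK_SECONDS:.1f} s, "
          f"throughput {n_items / elapsed:.1f} items/s")

run(8, 1)   # ~0.5 s per item, ~2 items/s
run(8, 4)   # still ~0.5 s per item, ~8 items/s
```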

Real world example where trying to increase the speed of a solution increases its latency

  • You have a racing car and want to make it faster. So you raise the gearbox ratio so that the car goes faster at its maximum permissible revs/sec. Unfortunately, that means the car struggles to accelerate and takes longer to get up to speed (worse latency).

Real world example where trying to reduce the latency decreases the speed

  • You want to reduce the latency in responding to requests by adding more parallel workers and sending requests round-robin to different workers. But in doing so, a worker that previously had a hot cache no longer does, because it didn't handle the previous request that touched nearby memory, so each request ends up taking longer overall (less throughput/speed); the sketch below illustrates the effect.
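Here is a rough sketch of that cache effect (all names and numbers are invented for illustration): nearby keys share a cache block, so sticky routing keeps each worker's cache hot, while round-robin dispatch spreads neighbouring requests across workers and multiplies the misses.

```python
# Toy model: a cache miss stands in for the extra slow work a cold worker does.
class Worker:
    def __init__(self):
        self.cache = set()
        self.misses = 0

    def handle(self, key):
        block = key // 10            # pretend nearby keys share a cache block
        if block not in self.cache:
            self.misses += 1         # cold cache: pay the slow path
            self.cache.add(block)

def simulate(route, n_workers=4, n_requests=1000):
    workers = [Worker() for _ in range(n_workers)]
    for key in range(n_requests):    # mostly sequential keys, i.e. nearby in memory
        workers[route(key, n_workers)].handle(key)
    return sum(w.misses for w in workers)

round_robin = lambda key, n: key % n            # spread requests evenly
sticky      = lambda key, n: (key // 10) % n    # nearby keys go to the same worker

print("misses with round-robin dispatch:", simulate(round_robin))  # ~400
print("misses with sticky dispatch:     ", simulate(sticky))       # ~100
```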

I guess another way of thinking about it is in terms of units, i.e. "dimensional analysis". I would expect latency to be measured in seconds or milliseconds, whereas I would expect speed to be measured in items per second.
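A tiny sketch of the units point (the function name is made up): time a batch of identical requests and the two metrics fall out with different dimensions. In a purely serial run like this one they are simply reciprocals of each other, which is exactly why they only come apart once parallelism, pipelining, or batching enters the picture, as in the sketches above.

```python
import time

def handle_request():
    time.sleep(0.01)              # stand-in for real work

N = 50
t0 = time.perf_counter()
for _ in range(N):
    handle_request()
total = time.perf_counter() - t0

latency = total / N               # seconds per request
throughput = N / total            # requests per second
print(f"mean latency: {latency * 1000:.1f} ms, throughput: {throughput:.1f} req/s")
```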
