The first test is conducted over the unmodified network link and yields no surprises. The average times for the payloads are shown below:
| Payload size | Response time (ms) |
| ------------ | ------------------ |
| 996 B        | 1                  |
| 25.1 KB      | 3                  |
| 227.2 KB     | 22                 |
1% packet loss
| Payload size | Response time (ms) |
| ------------ | ------------------ |
| 996 B        | 31                 |
| 25.1 KB      | 41                 |
| 227.2 KB     | 170                |
2% packet loss
| Payload size | Response time (ms) |
| ------------ | ------------------ |
| 996 B        | 42                 |
| 25.1 KB      | 60                 |
| 227.2 KB     | 190                |
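The measurements themselves could be scripted along the following lines: fetch each payload repeatedly and average the wall-clock time per request. This is a minimal sketch, assuming the payloads are served from a local test server at hypothetical URLs; it is not necessarily the harness used to produce the numbers above.

```python
import time
import urllib.request

# Hypothetical endpoints serving the three payload sizes used in the tests.
PAYLOAD_URLS = {
    "996 B":    "http://localhost:8080/payload-1k",
    "25.1 KB":  "http://localhost:8080/payload-25k",
    "227.2 KB": "http://localhost:8080/payload-227k",
}

RUNS = 50  # number of requests to average per payload

for label, url in PAYLOAD_URLS.items():
    total = 0.0
    for _ in range(RUNS):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body so the full transfer is timed
        total += time.perf_counter() - start
    print(f"{label}: {total / RUNS * 1000:.1f} ms average over {RUNS} runs")
```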
Discussion
Even a low, random packet loss greatly increases the time it takes for a browser to request and load a resource over the web. The charts displayed are capped at 100 ms, but the test data shows that as packet loss increases, both the frequency of requests with abnormally high durations and the maximum delay observed for those requests grow.
While the average time to download a resource increases only moderately, the range of delays introduced is wide, reaching well into the order of seconds. Since a typical web page consists of several stylesheets, JavaScript files, and images, all of which must finish loading before the page renders completely, the probability that at least one of these resources is delayed is high under these conditions.
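To make that last point concrete: if each of n independently fetched resources has probability p of hitting a long retransmission stall, the chance that at least one is delayed is 1 − (1 − p)^n. The short sketch below evaluates this for a few illustrative values; the specific figures (resource counts, a 5% per-request stall probability) are assumptions for illustration, not measurements from the tests above.

```python
# Probability that at least one of n independently fetched resources
# is delayed, given a per-request stall probability p.
def prob_any_delayed(n: int, p: float) -> float:
    return 1.0 - (1.0 - p) ** n

# Illustrative values only (not taken from the measurements above).
for n in (5, 10, 20, 40):
    print(f"{n:>2} resources: {prob_any_delayed(n, 0.05):.0%} chance of at least one stall")
```

Even a modest per-request stall probability compounds quickly as the number of sub-resources on a page grows.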
In line with expectations, the time to load large resources is impacted most by a moderate packet loss. On the other hand, a further increase in packet loss does not degrade the load time for large resources (an 11% increase when packet loss doubles) as sharply as for small resources (a 33% increase when packet loss doubles).
Conclusions
As we have seen, even moderate packet loss introduces unpredictable delays in the time required to load resources from the web. While it remains important to keep individual resources small to avoid fragmenting network payloads and incurring retransmission delays, combining multiple resources into fewer, larger ones (for example through image spriting) reduces the number of requests exposed to these delays and yields quantifiable benefits.
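As one illustration of the combining approach, the sketch below stitches several small icons into a single sprite sheet using the Pillow library. The file names and output path are placeholders, and this is only one way to merge resources; concatenating CSS or JavaScript bundles works analogously.

```python
from PIL import Image  # Pillow; assumed available for this illustration

# Hypothetical input icons; in practice these would be the page's small images.
icon_paths = ["icon-home.png", "icon-search.png", "icon-cart.png"]
icons = [Image.open(p) for p in icon_paths]

# Lay the icons out side by side in a single image, so the page issues
# one HTTP request instead of one per icon.
sheet_width = sum(img.width for img in icons)
sheet_height = max(img.height for img in icons)
sheet = Image.new("RGBA", (sheet_width, sheet_height))

x = 0
for img in icons:
    sheet.paste(img, (x, 0))
    x += img.width

sheet.save("sprites.png")
# Each icon is then shown via CSS background-position offsets into sprites.png.
```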