I received the following data by pinging berkeley.edu while varying the packet size:

For 100 bytes:
24 packets transmitted, 24 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 91.974/94.269/97.487/1.353 ms

For 200 bytes:
26 packets transmitted, 26 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 92.730/97.980/119.909/6.525 ms

For 300 bytes:
26 packets transmitted, 26 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 92.136/97.066/126.481/6.382 ms
I know that the formula for transmission delay is L/R, where L is the packet size in bits and R is the link bandwidth in bits per second. Given that, can I estimate the transmission delay from the average round-trip times above, since I varied the packet size between runs?
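One way to sketch the idea: since only the packet size L changes between runs, the increase in average RTT per extra byte approximates the per-byte transmission delay along the path (round trip). A least-squares line fit of average RTT against packet size gives that slope; its reciprocal yields a very rough bandwidth estimate. This is a hypothetical sketch using the three averages above; note that three noisy samples, ICMP/IP header overhead, and variable queuing delay make the result only an order-of-magnitude figure.

```python
# Rough estimate of per-byte delay from the ping averages above,
# via a least-squares line fit of average RTT against payload size.
# Assumptions (not from the original data): symmetric path, so each
# byte is transmitted in both directions, hence the factor of 2.

sizes = [100, 200, 300]             # ICMP payload size in bytes
avg_rtt = [94.269, 97.980, 97.066]  # average RTT in ms, from the pings above

n = len(sizes)
xm = sum(sizes) / n
ym = sum(avg_rtt) / n

# slope: extra round-trip time per extra byte (ms/byte)
slope = sum((x - xm) * (y - ym) for x, y in zip(sizes, avg_rtt)) \
        / sum((x - xm) ** 2 for x in sizes)
# intercept: size-independent baseline (propagation + queuing, ms)
intercept = ym - slope * xm

per_byte_one_way_s = (slope / 2) / 1000   # seconds per byte, one direction
bandwidth_bps = 8 / per_byte_one_way_s    # crude bottleneck-bandwidth estimate

print(f"slope     = {slope:.6f} ms/byte")
print(f"intercept = {intercept:.3f} ms")
print(f"estimated bandwidth = {bandwidth_bps / 1e6:.2f} Mbit/s")
```

With these three averages the fit is dominated by noise (the 200-byte average is higher than the 300-byte one), so many more samples per size, and a wider range of sizes, would be needed before trusting the slope.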