My book says that transmission delay = (length of the packet) / (transmission speed). However, none of the study problems seem to follow this logic. For example, one asks for the transmission delay of a 1,000-byte packet over a 1 Mbps connection. I get 1 ms, but somehow they get 8 ms. Am I missing something?
Viewed 4,793 times
1 Answer
3
Because a byte is not a bit.
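To spell it out: link rates like "1 Mbps" are in bits per second, while packet sizes are usually given in bytes, so you have to multiply by 8 first. A minimal sketch of the calculation (the variable names are mine, not from the book):

```python
# Transmission delay = packet length (in bits) / link rate (in bits per second).
packet_bytes = 1_000
link_bps = 1_000_000   # 1 Mbps = 10^6 bits per second

packet_bits = packet_bytes * 8       # 1,000 bytes = 8,000 bits
delay_s = packet_bits / link_bps     # 8,000 / 1,000,000 = 0.008 s
print(delay_s * 1_000)               # 8.0 (milliseconds)
```

Dividing 1,000 directly by 10^6 gives 0.001 s (1 ms), which is where the factor-of-8 discrepancy comes from.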

Oliver Charlesworth
I feel so dumb. For some reason I was under the impression that Mbps stood for megabytes per second, not megabits per second. Makes perfect sense now. Of course the book never mentioned that. – user1205853 Feb 29 '12 at 00:16