In my book it says that transmission delay = (length of the packet) / (transmission speed). However, the study problems don't seem to follow this logic. For example, one asks for the transmission delay of a 1,000-byte packet over a 1 Mbps connection. I get 1 millisecond, but somehow they get 8. Am I missing something?

user1205853

1 Answer

Because a byte is not a bit.
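A quick sanity check of the arithmetic (a sketch in Python; the variable names are illustrative): Mbps means mega*bits* per second, so the packet length must be converted from bytes to bits before dividing.

```python
# Transmission delay = (packet length in bits) / (link rate in bits per second).
packet_bytes = 1_000
link_rate_bps = 1_000_000  # 1 Mbps = 10^6 bits per second

packet_bits = packet_bytes * 8          # 1,000 bytes = 8,000 bits
delay_s = packet_bits / link_rate_bps   # 8,000 / 1,000,000 = 0.008 s

print(delay_s * 1_000)  # 8.0 (milliseconds)
```

Dividing the byte count directly by the bit rate is what produces the answer that is off by a factor of 8.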

Oliver Charlesworth
  • I feel so dumb. For some reason I was under the impression that Mbps stood for megabytes per second, not megabits per second. Makes perfect sense now. Of course, they failed to mention that in the book. – user1205853 Feb 29 '12 at 00:16