I am preparing for an exam in networking.
In one of the previous exams, the following question was given:
Assume you are sending a packet of length 4000 bits
through a cable of length 1000 km.
The signal in the cable propagates at 200,000 km/s.
The signal bandwidth is 10 Mbit/s.
Calculate how much time it would take for the packet to arrive.
If I were doing this with a car, considering road length and car speed, I would simply divide the road length by the car's speed to get the travel time. However, I am not sure how to apply the 10 Mbit/s and the 4000 bits in the calculation.
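Here is the car part written out as a quick Python sanity check, using only the numbers from the exam question (the variable names are my own):

```python
# "Car" analogy: time to cover the cable length at the signal's propagation speed.
cable_length_km = 1000            # length of the cable (the "road")
signal_speed_km_per_s = 200_000   # speed of the signal (the "car")

travel_time_s = cable_length_km / signal_speed_km_per_s
print(travel_time_s)              # 0.005 -> 5 ms to cross the cable
```

This obviously ignores the 4000 bits and the 10 Mbit/s, which is exactly the part I don't know how to bring in.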
Is this a correct way of doing it?
(10 Mbit/s / 4000 bit) * (200,000 km/s / 1000 km) = time in seconds for the packet to arrive
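To see what that expression actually gives, I plugged it into Python exactly as I wrote it above (just my own sanity check, the variable names are mine):

```python
bandwidth_bit_per_s = 10_000_000   # 10 Mbit/s
packet_length_bit = 4000           # packet length
cable_length_km = 1000
signal_speed_km_per_s = 200_000

# My attempted formula, typed in as written above.
result = (bandwidth_bit_per_s / packet_length_bit) * (signal_speed_km_per_s / cable_length_km)
print(result)  # 500000.0

# Units: (bit/s / bit) * (km/s / km) = (1/s) * (1/s) = 1/s^2,
# which does not look like seconds.
```

The result, 500,000 with units of 1/s², does not look like a time to me, which is why I am unsure whether I am combining these quantities the right way.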