I've got a timing card which emits a pulse per second (PPS). The specs for the card state there is a delay on this PPS of approximately 7 ns for each meter of cable, or about 200 ns for a 30-meter cable.
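Just to sanity-check the two figures against each other (my own arithmetic, not from the datasheet), they do line up as a per-meter delay times the cable length:

$$ 7\,\text{ns/m} \times 30\,\text{m} = 210\,\text{ns} \approx 200\,\text{ns} $$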
What I don't understand is why this delay depends on cable length. Shouldn't the speed of the signal through the cable be constant (assuming uniform cable properties throughout), and therefore there should be no length-dependent delay? What is the root cause of this delay?
If this is the wrong Stack Exchange site for this question, please let me know.