I'm trying to understand asynchronous serial data transmission. I know that the transmitting device sends a start bit (e.g. a 0 on a line that idles high) to the receiver to indicate that transmission has begun, then a stop bit (e.g. a 1, returning the line to idle) afterwards to indicate that the transmission has ended.
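To make sure I have the framing right, here is a small sketch of how I picture a transmitter building one frame, assuming the common 8N1 format (8 data bits, no parity, one stop bit) on a line that idles high; the function name and layout are just my own illustration:

```python
def frame_byte(byte):
    """Return the bit sequence a transmitter would send for one byte (8N1, as I understand it)."""
    bits = [0]                                   # start bit: line pulled low
    bits += [(byte >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
    bits += [1]                                  # stop bit: line back to idle (high)
    return bits

print(frame_byte(0x41))  # 'A' -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```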
What I don't understand is how the receiving device knows which bit is the stop bit. The stop bit is surely no different from any other data bit. The only mechanism I can think of is that if the transmitting device stops sending bits for a significant gap, the receiving device would know that no more bits are forthcoming and that the last bit must have been the stop bit. But if that is the case, why would a stop bit be required at all? The receiving device could simply wait, and consider the transmission ended when the transmitter sends nothing further, as in the sketch below.
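To make the alternative I'm imagining concrete, here is a toy sketch of a receiver that treats a long silence as the end of transmission. Everything here (the tick-based timing, the `GAP_THRESHOLD` value, the sample format) is hypothetical, purely to illustrate the idea; a real receiver would presumably use a timer rather than waiting for the next bit to arrive:

```python
GAP_THRESHOLD = 3  # ticks of silence taken to mean "transmission over" (assumed)

def receive(samples):
    """samples: list of (time, bit) pairs, one bit per tick.
    Returns the received bits, using a gap in time as the terminator."""
    bits = []
    last_time = None
    for time, bit in samples:
        if last_time is not None and time - last_time > GAP_THRESHOLD:
            break  # a long silence: assume the sender is done
        bits.append(bit)
        last_time = time
    return bits

# Bits arrive at ticks 0..7, then nothing until tick 20: the gap ends the frame.
stream = [(t, b) for t, b in zip(range(8), [0, 1, 1, 0, 1, 0, 0, 1])] + [(20, 0)]
print(receive(stream))  # [0, 1, 1, 0, 1, 0, 0, 1]
```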