
I'm trying to understand asynchronous serial data transmission. I know that the transmitting device sends a start bit (e.g. 0, since the line idles at 1) to the receiver to indicate that a transmission has begun, then a stop bit (e.g. 1) afterwards to indicate that the transmission has ended.

What I don't understand is how the receiving device knows which bit is the stop bit. The stop bit is surely no different from the other bits of data. The only way I can think of is that if the transmitting device stops sending bits for a significant gap, the receiving device would know that no more bits are forthcoming and that the last bit must have been a stop bit. But if that is the case, why would a stop bit be required at all, rather than the receiving device simply waiting for a bit and considering the transmission to have ended when the transmitting device doesn't send any more?

Lou

1 Answer


That becomes a question of protocol. Start and stop bits only have meaning if the communicating devices agree on that meaning (e.g. a frame consists of a start bit, 8 data bits, and a stop bit). The receiver doesn't recognize the stop bit by its value; it counts bit periods from the start bit's edge, so the stop bit is simply whatever arrives in the last slot of the frame. Similarly, how to denote when a particular communication is complete needs to be agreed between the participants (e.g. by defining one or more frames that denote message termination).

So for a particular communication, either a full frame is received and the listener keeps listening, a partial frame is received with no subsequent data transmission and the connection can be considered faulted after some duration, or a full frame is received and that frame denotes the end of the exchange.
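To make the framing concrete, here is a minimal Python sketch of such an idealized 8N1 receiver. The one-sample-per-bit input, the function names, and the ETX terminator byte are all assumptions for illustration, not part of any real UART API:

```python
IDLE = 1        # an async serial line rests high between frames
START_BIT = 0   # a drop to 0 marks the start of a frame
STOP_BIT = 1    # the line returns high for the final bit slot
ETX = 0x03      # hypothetical terminator byte the endpoints agree on

def decode_frames(samples):
    """Yield data bytes from a stream of line samples, one per bit time."""
    i = 0
    while i < len(samples):
        if samples[i] == IDLE:      # still idle: keep waiting
            i += 1
            continue
        # samples[i] is the start bit. The stop bit is recognized by its
        # position (the 10th bit slot), not by its value.
        if i + 9 >= len(samples):
            raise ValueError("framing error: truncated frame")
        if samples[i + 9] != STOP_BIT:
            raise ValueError("framing error: stop bit not high")
        byte = 0
        for n in range(8):          # 8 data bits, least significant first
            byte |= samples[i + 1 + n] << n
        yield byte
        i += 10                     # skip start + 8 data + stop

def receive_message(samples):
    """Collect bytes until the agreed terminator frame arrives."""
    message = bytearray()
    for byte in decode_frames(samples):
        if byte == ETX:             # in-band end-of-message marker
            break
        message.append(byte)
    return bytes(message)
```

Feeding it two idle samples followed by the frames for "H", "i", and ETX returns b"Hi": each byte is recovered purely by counting ten bit slots from each start bit, and the message ends because a frame the endpoints agreed on as a terminator arrived, not because the line went quiet.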

cmsjr