
I want a simple, lightweight way for two basic 8-bit MCUs to talk to each other over an 8-bit UART connection, sending both ASCII characters and binary data as 8-bit values.

I would rather not re-invent the wheel, so I'm wondering whether an ASCII-based scheme would work, using the ASCII control characters in some standard way.

The problem: either I'm not understanding it correctly, or it's not capable of doing what I want.

The Wikipedia page on control characters says a packet could be sent like this:

  1. <DLE> <SOH> - data link escape and start of heading
  2. Heading data
  3. <DLE> <STX> - data link escape and start of text
  4. <payload>
  5. <DLE> <ETX> - data link escape and end of text

But what if the payload is binary data containing two consecutive bytes equivalent to DLE and ETX? How should those bytes be escaped?

The link may be broken and re-established, so a receiving MCU should be able to start receiving mid-packet, and have a simple way of telling when the next packet has begun, so it can ignore data until the end of that partial packet.

Error checking will happen at a higher level to ensure that a received packet is valid - unless ASCII standards can solve this too.

Jodes
  • Encoding the length of the data as the first element could take care of that. – 0xFF Sep 21 '16 at 16:56
  • If you have a convention that every control character needs to be escaped by `DLE`, you can simply add the literal `DLE` to the data in either payload or header by *duplicating* it. Other control characters don't need to be escaped by anything, because they will only be interpreted as controls if preceded by `DLE` and as literal payload if not (see the sketch after these comments). – tofro Sep 21 '16 at 17:10
  • If you want to be able to synchronise to the framing, simply wait for `DLE`, then `ETX`. – tofro Sep 21 '16 at 17:12
  • Look at `uuencode` and `uudecode` for transmission of 8-bit binary data over a 7-bit ASCII medium. It's the traditional UNIX way of sending binary data in text-only e-mail and USENET postings. Another method of sending binary or "special" chars is to use an "escape" character to prefix the "special" char, normalizing the byte to a valid ASCII alphanumeric char. – sawdust Sep 21 '16 at 21:39
  • @tofro *"If you want to be able to synchronise to the framing, simply wait for `DLE`, then `ETX`"* -- There's no need to wait for the end of the message frame. Simply begin hunting for the start of a message frame. Otherwise you might needlessly toss that first (valid) message. Of course every message has to be validated in case of false message alignment. E.g. if the channel has only 50% utilization, then when you begin reading, you're just as likely to see an idle channel as to be in the middle of a message. So you always want to hunt for a start rather than first wait for an end. – sawdust Sep 21 '16 at 22:05
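
To make tofro's DLE-doubling suggestion concrete, here is a minimal send-side sketch in C, combined with the framing from the question. `uart_putc` is a hypothetical stand-in for whatever byte-transmit routine the MCU provides:

```c
#include <stdint.h>

#define DLE 0x10  /* ASCII data link escape */
#define SOH 0x01  /* start of heading */
#define STX 0x02  /* start of text */
#define ETX 0x03  /* end of text */

/* Hypothetical MCU-specific routine that transmits one byte. */
extern void uart_putc(uint8_t b);

/* Send one payload byte, doubling DLE so the receiver never
 * mistakes payload for a control sequence. */
static void send_stuffed(uint8_t b)
{
    uart_putc(b);
    if (b == DLE)
        uart_putc(DLE);  /* a literal DLE goes out as DLE DLE */
}

/* Frame and send: DLE SOH <heading> DLE STX <payload> DLE ETX */
void send_packet(const uint8_t *head, uint8_t hlen,
                 const uint8_t *data, uint8_t dlen)
{
    uint8_t i;

    uart_putc(DLE); uart_putc(SOH);
    for (i = 0; i < hlen; i++)
        send_stuffed(head[i]);

    uart_putc(DLE); uart_putc(STX);
    for (i = 0; i < dlen; i++)
        send_stuffed(data[i]);

    uart_putc(DLE); uart_putc(ETX);
}
```

On the receive side, `DLE DLE` decodes to a single literal 0x10, while `DLE` followed by `SOH`, `STX` or `ETX` is framing; a receiver that joins mid-stream can discard bytes until it next sees `DLE` `SOH` and, per sawdust's comment, validate whatever frame it then collects.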

2 Answers


Since you are going to transfer binary data along with text messages, you will indeed have to make sure the receiver cannot confuse control bytes with payload contents. One way to do that is to encode the payload so that none of the special characters appear in the output. If the overhead is not a problem, then a simple encoding like Base16 should be enough. Otherwise, you may want to take a look at escapeless encodings, which are specifically designed to keep certain characters out of the encoded data.
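
A minimal sketch of the Base16 idea, assuming the hex-encoded payload is then framed with control characters that can no longer collide with it:

```c
#include <stdint.h>

/* Encode len bytes into 2*len ASCII hex characters.
 * The output contains only '0'-'9' and 'A'-'F', so any control
 * characters used for framing can never appear in it.
 * The caller must provide an out buffer of at least 2*len bytes. */
void base16_encode(const uint8_t *in, uint8_t len, char *out)
{
    static const char hex[] = "0123456789ABCDEF";
    uint8_t i;

    for (i = 0; i < len; i++) {
        out[2 * i]     = hex[in[i] >> 4];
        out[2 * i + 1] = hex[in[i] & 0x0F];
    }
}
```

The price is that every payload byte becomes two bytes on the wire, which is why denser escapeless encodings are worth a look when overhead matters.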

Ivan Kosarev

I understand this is an old question, but I thought I should suggest the Serial Line Internet Protocol (SLIP), defined in RFC 1055. It is a very simple protocol.
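
The entire escaping scheme fits in a few lines; here is a minimal send-side sketch of SLIP framing per RFC 1055 (`uart_putc` is again a hypothetical byte-transmit routine):

```c
#include <stdint.h>
#include <stddef.h>

#define SLIP_END     0xC0  /* frame delimiter */
#define SLIP_ESC     0xDB  /* escape character */
#define SLIP_ESC_END 0xDC  /* escaped END */
#define SLIP_ESC_ESC 0xDD  /* escaped ESC */

/* Hypothetical MCU-specific routine that transmits one byte. */
extern void uart_putc(uint8_t b);

/* Send one SLIP frame. RFC 1055 also suggests sending an END
 * first to flush any line noise the receiver may have collected. */
void slip_send(const uint8_t *data, size_t len)
{
    size_t i;

    uart_putc(SLIP_END);
    for (i = 0; i < len; i++) {
        switch (data[i]) {
        case SLIP_END:
            uart_putc(SLIP_ESC);
            uart_putc(SLIP_ESC_END);
            break;
        case SLIP_ESC:
            uart_putc(SLIP_ESC);
            uart_putc(SLIP_ESC_ESC);
            break;
        default:
            uart_putc(data[i]);
        }
    }
    uart_putc(SLIP_END);
}
```

Because `END` (0xC0) appears on the wire only at frame boundaries, a receiver that starts listening mid-packet simply discards bytes until the next `END`, which directly addresses the resynchronisation requirement in the question.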

JMercer