
It's my first time in this forum and I wanted to start with this question. From what I know, in C, a char data type is 8 bits long, but when you are using this data type to transmit ASCII information, is it still 8 bits long with 1 bit being a parity bit? Is that right?

And if that's correct, my question is: can you transmit an ASCII char to a receiver including the parity bit? Because if my code is `char x = 0b01111000;`, then 'x' is received, but if my code is `char x = 0b11111000;`, then 'x' isn't received, even though the parity bit is 1 and there are four '1' bits in my 'x' data. So I don't get when to use the parity bit, or what I'm doing wrong. Thanks in advance for your answers!


Wyck
Cblue X
    ASCII is a character encoding, and has nothing to do with storage or transmission of bits. To be clear, no parity information is built into ASCII. That is up to a communication protocol which may or may not include any number of parity bits, start/stop bits, data bits, _etc_. – paddy Nov 01 '22 at 03:39
  • All ASCII codepoints have seven bits. In the default C locale, on a machine with 8-bit bytes, they're typically stored with a most significant bit of zero. The half of the code page with a most significant bit of one is implementation-defined and will often be some operating-system-specific code page. C has no built-in support for checksums or parity checking, and it has no reason to. `char` is a datatype. It's a chunk of bits strung together. If you want error checking, that's on you. – Silvio Mayolo Nov 01 '22 at 03:41

3 Answers

From what I know, in C, a char data type is 8 bits long,

On any modern, general-purpose computer yes, that's pretty certain. But it is not a requirement of C itself, and you can find machines in use today where char is larger than 8 bits.

but when you are using this data type to transmit ASCII information, is it still 8 bits long with 1 bit being a parity bit? Is that right?

No. The bits of a char are all part of the data of that char value. They do not get used for other purposes. Most modern transmission media and protocols are 8-bit-clean, meaning that they preserve all 8 bits of each 8-bit unit transmitted.

There used to be media and protocols that were not 8-bit-clean, but about the only way you're likely to run into one of those these days is if you explicitly configure a serial interface for it. And even then, it is not correct to think of it as a data bit being repurposed for something else. And in any case, you do not then manipulate parity bits manually. They are handled transparently to you in hardware.

What's more, your C implementation's default execution character set almost certainly is not ASCII. It is very likely one compatible with ASCII, such as UTF-8-encoded Unicode or one of the ISO-8859 family of encodings, but even that is not a safe assumption in general.

I don't get when to use the parity bit

You don't. And can't. In a context where there are parity bits in the first place, they are not accessible to you.

or what I'm doing wrong.

You're manually encoding parity into your data, and expecting someone other than you to recognize that and care. If you manually encode parity into your data on one end, then you need to manually decode it on the other end. And for the effort to have even the slightest value, you also need to manually check the parity-encoded data upon receipt to catch any parity errors that occur. I don't think I've ever heard of anybody doing that (on a character-by-character basis with character data).
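If you really did want a manual scheme like that, it would look roughly like this sketch (the function names `parity7`, `encode_parity`, and `decode_parity` are made up for illustration; both ends would have to agree on this exact convention, since nothing does it for you):

```c
#include <stdint.h>

/* Even parity of the low 7 bits: 1 if the count of 1-bits is odd. */
static uint8_t parity7(uint8_t c)
{
    uint8_t p = 0;
    for (int i = 0; i < 7; i++)
        p ^= (c >> i) & 1u;
    return p;
}

/* Sender side: pack the parity bit into bit 7 of a 7-bit ASCII value. */
static uint8_t encode_parity(uint8_t ascii7)
{
    return (uint8_t)((ascii7 & 0x7Fu) | (uint8_t)(parity7(ascii7) << 7));
}

/* Receiver side: verify bit 7 against recomputed parity, then strip it.
 * Returns the 7-bit character, or -1 on a parity error. */
static int decode_parity(uint8_t byte)
{
    if (((byte >> 7) & 1u) != parity7(byte))
        return -1;
    return byte & 0x7F;
}
```

Under this scheme the asker's `0b11111000` is exactly what a parity *error* looks like: 'x' (`0b01111000`) has an even number of 1-bits, so its even-parity bit is 0, and a set top bit fails the check.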

John Bollinger

Parity bits aren't normally visible to application code; they're handled invisibly at the link level, in the background. For example, RS-232, a common serial standard, can use parity bits, but these never show up in the data stream itself; they're only visible if you inspect the actual electrical signals.
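On a POSIX system, for instance, serial-port parity is something you *configure* (via termios flags) rather than something you write into the bytes yourself; the UART generates and checks it for you. A minimal sketch, assuming a POSIX platform (the helper name `set_7e1` is made up):

```c
#include <termios.h>

/* Configure a termios struct for "7E1": 7 data bits, even parity,
 * 1 stop bit. The parity bit is generated and checked by the
 * hardware/driver; application reads never see it. */
void set_7e1(struct termios *tio)
{
    tio->c_cflag &= ~(tcflag_t)CSIZE;  /* clear the data-size bits        */
    tio->c_cflag |= CS7;               /* 7 data bits                     */
    tio->c_cflag |= PARENB;            /* enable parity generate/check    */
    tio->c_cflag &= ~(tcflag_t)PARODD; /* even (not odd) parity           */
    tio->c_cflag &= ~(tcflag_t)CSTOPB; /* 1 stop bit                      */
    tio->c_iflag |= INPCK;             /* check parity on input           */
}
```

You would apply this to an open serial port with `tcgetattr`/`tcsetattr`; the bytes you then `write()` and `read()` are pure 7-bit data.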

ASCII is a 7-bit encoding standard that doesn't include any sort of parity bit. If the upper-most bit is set on a character you've received, it's probably garbled, or it's inadvertently a byte from a UTF-8 sequence.

If you're having transmission errors, you should use something at least as reliable as a CRC to detect them, if not something more robust like SHA2-256 if you're sending larger amounts of data.

tadman

when you are using this data type [char] to transmit ASCII information

The C standard does not cover data transmission. The details of how to transmit data fall under the specifications for the transmission protocol (whichever one you are using).

Common protocols are, well, common, so there are often libraries (or perhaps hardware/firmware) that can take care of the messy details. If you use such a library for your transmission protocol, you simply provide the data in whatever format the library requires, and the library will worry about the parity bits.

If there is no such library available, you'll have to consult the protocol specs for what is expected. Each character could be transmitted as 7 bits plus a parity bit (unlikely), or maybe as 8 bits plus a parity bit, or even as something else. For unreliable transmissions, a single parity bit might not be enough, so it comes down to the sender and receiver having the same expectations.

JaMiT