From what I know, in C, a char data type is 8 bits long,
On any modern, general-purpose computer, yes, that's pretty certain. But it is not a requirement of C itself, and you can find machines in use today where char is larger than 8 bits.
but when you are using this data type in order to transmit ASCII information, it is still 8 bits long, but 1 bit is a parity bit, is that right?
No. The bits of a char are all part of the data of that char value. They do not get used for other purposes. Most modern transmission media and protocols are 8-bit-clean, meaning that they preserve all 8 bits of each 8-bit unit transmitted.
There used to be media and protocols that were not 8-bit-clean, but about the only way you're likely to run into one of those these days is if you explicitly configure a serial interface for it. And even then, it is not correct to think of it as a data bit being repurposed for something else. And in any case, you do not then manipulate parity bits manually. They are handled transparently to you in hardware.
What's more, your C implementation's default execution character set almost certainly is not ASCII. It is very likely one compatible with ASCII, such as UTF-8-encoded Unicode or one of the ISO-8859 family of encodings, but even that is not a safe assumption in general.
I don't get when to use the parity bit
You don't. And can't. In a context where there are parity bits in the first place, they are not accessible to you.
or what I'm doing wrong.
You're manually encoding parity into your data and expecting someone other than you to recognize and care about it. If you manually encode parity into your data on one end, then you need to manually decode it on the other end. And for the effort to have even the slightest value, you also need to manually test the parity-encoded data upon receipt to catch any parity errors that occur. I don't think I've ever heard of anybody doing that (on a character-by-character basis with character data).