
(Edit: changed C/C++ to C)

Please help me find a clear explanation of the difference between `char` and `unsigned char` in C, especially when transferring data between embedded devices and general-purpose PCs (i.e. the difference between a buffer of `unsigned char` and a buffer of plain `char`).

  • There's no such thing as C/C++. You either use C or C++, but not both. Choose exactly one of the languages which you want to ask about. In C, the difference is in signedness. `char` may be signed or unsigned in an implementation-defined manner, whereas `unsigned char` is, obviously, always unsigned. – user3447428 Mar 25 '14 at 09:07
  • Can I know the difference and the effects of `char` and `unsigned char` types during an encoding or decoding operation? – Dig The Code Mar 25 '14 at 09:16
  • @user3447428 While they are indeed different languages and it does indeed not make sense to ask how something works in "C/C++", both languages happen to behave in exactly the same manner when it comes to `char`... – Lundin Mar 25 '14 at 09:40
  • @user3458841 What is an "encoding or decoding operation"? That can mean anything. – Lundin Mar 25 '14 at 09:41
  • @Lundin And it doesn't matter. – user3447428 Mar 25 '14 at 15:41

2 Answers


You're asking about two different languages but, in this respect, the answer is (more or less) the same for both. You really should decide which language you're using though.

Differences:

  • they are distinct types
  • it's implementation-defined whether char is signed or unsigned

Similarities:

  • they are both integer types
  • they are the same size (one byte, at least 8 bits)

If you're simply using them to transfer raw byte values, with no arithmetic, then there's no practical difference.

Mike Seymour
  • If I consider only the C language, can I know the difference and the effects of `char` and `unsigned char` types during encoding or decoding operations? – Dig The Code Mar 25 '14 at 09:18
  • The only real difference is that `unsigned char` is definitely unsigned, while `char` may or may not be signed. (It is _implementation defined_ whether `char` is signed or unsigned, which means that it can be either, completely at the whim of your compiler writer, but the compiler must document which one it is.) When working with raw data, you should explicitly use `unsigned char`. When working with meaningful data, you should choose a signed type or an unsigned type depending on the meaning of your data. Use plain `char` if the signedness of the data is completely irrelevant. – This isn't my real name Mar 26 '14 at 19:58
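
A minimal sketch of that distinction, using a made-up 4-byte buffer (the values are purely illustrative): copying or transmitting the bytes is unaffected by plain `char`'s signedness, but arithmetic and comparison are.

```c
#include <stdio.h>

int main(void)
{
    /* Raw bytes as received from a device; the values are arbitrary. */
    unsigned char raw[4] = { 0x12, 0xFF, 0x80, 0x7F };

    /* Reinterpreting one byte through both types. */
    char          c = (char)raw[1]; /* 0xFF: result is implementation-defined if char is signed */
    unsigned char u = raw[1];       /* 0xFF: always 255 */

    /* On a typical two's-complement platform where plain char is signed,
       this prints "as char: -1, as unsigned char: 255";
       where plain char is unsigned, both values print as 255. */
    printf("as char: %d, as unsigned char: %d\n", (int)c, (int)u);

    return 0;
}
```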

The type `char` is special. It is neither `unsigned char` nor `signed char`: these are three distinct types (whereas `int` and `signed int` are the same type). A plain `char` may have either a signed or an unsigned representation.

From 3.9.1 Fundamental types in the C++ standard (the C standard draws the same distinction in 6.2.5):

Plain char, signed char, and unsigned char are three distinct types. A char, a signed char, and an unsigned char occupy the same amount of storage and have the same alignment requirements (3.11); that is, they have the same object representation.
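
A minimal C11 sketch of the "three distinct types" point; the `TYPE_NAME` macro is just an illustrative name:

```c
#include <stdio.h>

/* _Generic selects a branch by type; listing char, signed char and
   unsigned char side by side is only legal because they are distinct types.
   Listing int and signed int together would not compile, since they are
   the same type. */
#define TYPE_NAME(x) _Generic((x),        \
    char:          "char",                \
    signed char:   "signed char",         \
    unsigned char: "unsigned char",       \
    default:       "other")

int main(void)
{
    char          c  = 0;
    signed char   sc = 0;
    unsigned char uc = 0;

    printf("%s, %s, %s\n", TYPE_NAME(c), TYPE_NAME(sc), TYPE_NAME(uc));
    return 0;
}
```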