20

Consider the following line:

char *p = malloc( sizeof(char) * ( len + 1 ) );

Why is sizeof(char) used? It's not necessary, is it? Or is it just a matter of style?

What advantages does it have?

Neuron
  • 5,141
  • 5
  • 38
  • 59
Nyan
  • 2,360
  • 3
  • 25
  • 38
  • 1
    Duplicate of http://stackoverflow.com/questions/1011806/is-it-necessary-to-multiply-by-sizeof-char-when-manipulating-memory though I disagree with the accepted answer there (I prefer to omit sizeof(char)) – user85509 Jun 25 '10 at 04:40

3 Answers

26

Yes, it's a matter of style, because you'd expect sizeof(char) to always be one.

On the other hand, it's very much an idiom to use sizeof(foo) when doing a malloc, and, most importantly, it makes the code self-documenting.

It's also better for maintenance, perhaps. If you were switching from char to wchar_t, you'd switch to

wchar_t *p = malloc( sizeof(wchar_t) * ( len + 1 ) );

without much thought, whereas converting the statement char *p = malloc( len + 1 ); would require more thought. It's all about reducing mental overhead.
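For example (a minimal sketch; the copy_wide helper is purely illustrative, and wcslen/wcscpy come from <wchar.h>), the sizeof form keeps the allocation correct after the type switch, while carrying over the bare byte count would silently under-allocate:

#include <stdlib.h>
#include <wchar.h>

/* Illustrative only: after switching the element type to wchar_t,
   the sizeof form still reserves len + 1 *elements*, whereas the
   bare form inherited from the char version would only reserve
   len + 1 bytes. */
wchar_t *copy_wide(const wchar_t *src)
{
    size_t len = wcslen(src);

    wchar_t *good = malloc(sizeof(wchar_t) * (len + 1));  /* correct size */
    /* wchar_t *bad = malloc(len + 1); */                 /* too small on most platforms */

    if (good != NULL)
        wcscpy(good, src);
    return good;
}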

And as @Nyan suggests in a comment, you could also do

type *p = malloc( sizeof(*p) * ( len + 1 ) );

for zero-terminated strings and

type *p = malloc( sizeof(*p) * len );

for ordinary buffers.
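
Put together (a minimal, self-contained sketch; string_buf and int_buf are just illustrative names):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *src = "hello";
    size_t len = strlen(src);

    /* Zero-terminated string: one extra element for the '\0'. */
    char *string_buf = malloc(sizeof(*string_buf) * (len + 1));

    /* Ordinary buffer of len ints: no terminator needed. */
    int *int_buf = malloc(sizeof(*int_buf) * len);

    if (string_buf != NULL && int_buf != NULL) {
        strcpy(string_buf, src);
        puts(string_buf);
    }

    free(string_buf);
    free(int_buf);
    return 0;
}

A side benefit of the sizeof(*p) spelling is that the allocation stays correct even if the pointer's element type changes later, since it never names the type twice.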

brainjam
  • 18,863
  • 8
  • 57
  • 82
  • Then why not use type *p = malloc( sizeof(*p) * (len+1) ); ;) – Nyan Jun 25 '10 at 04:52
  • @Nyan: Sure, that works too. I've never seen this before, but if I were still a working C programmer I would consider adopting your suggestion as an idiom .. although the `len +1` part would only be for string-like objects. – brainjam Jun 25 '10 at 04:59
  • 4
    Not only would you expect it to be 1, it's defined to be so. One of the very few items whose size is explicitly defined. – JaredPar Jun 25 '10 at 04:59
2

It serves to self-document the operation. The language defines a char to be exactly one byte. It doesn't specify how many bits are in that byte, since some machines have 8-, 12-, 16-, 19-, or 30-bit (or larger) minimum addressable units. But a char is always one byte.
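
To check this on a given platform (a small sketch; CHAR_BIT comes from <limits.h>):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* sizeof(char) is 1 by definition; CHAR_BIT reports how many bits
       that one byte contains (at least 8, possibly more). */
    printf("sizeof(char) = %zu\n", sizeof(char));
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    return 0;
}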

Amardeep AC9MF
  • 18,464
  • 5
  • 40
  • 50
2

The specification dictates that chars are 1 byte, so the sizeof is strictly optional. I personally always include it for consistency, but it doesn't matter.

Michael Mrozek
  • 169,610
  • 28
  • 168
  • 175