
I like a little luxury: I wanted to invent a way to obtain, automatically, the number of digits that the maximum of unsigned long long int possesses. Here it is:

#include <limits.h>
#include <string.h>

#define STRINGIFY(x) #x
#define ULLONG_MAX_STRING STRINGIFY(ULLONG_MAX)
#define NUMERAL_MAXIMUM strlen(ULLONG_MAX_STRING)

Does this work as described?


Now about the strange behavior, which pretty much answers the question above. If I declare a variable like so (this is -std=c99 specific):

char variable[NUMERAL_MAXIMUM];

then instead of getting an automatic-storage array of size 20, the program terminates before it even reaches that line. If I don't declare the variable like this, nothing terminates and the program continues to work.


What is going on?


Update: Even stranger, the program only does this if I use the obtained length as the size of an array.

    It won't work as the "stringification" will make a string of the symbol `ULLONG_MAX`, i.e. you get the string `"ULLONG_MAX"`. – Some programmer dude Mar 18 '15 at 15:29
  • Yes, that's right, it returns 10 as the length of `ULLONG_MAX`. The problematic behavior is the real question, though. Sorry that I don't have useful code to provide you. – Imobilis Mar 18 '15 at 15:32
  • you are asking us to debug a program without showing the source code or showing the error message(s). We are not wizards – pm100 Mar 18 '15 at 15:37
    I don't ask for debugging, not even indirectly, exactly because you are not wizards. If there is a real need for the entire code (which I doubt).. I am sorry but the code is private. However, I can show it only against private help, as in TeamViewer. Also.. there are no error message(s); I wouldn't fail to point them out. – Imobilis Mar 18 '15 at 15:40
    Can you pick a more descriptive title please? – Pascal Cuoq Mar 18 '15 at 16:37

4 Answers


When you work with defines, you have to be aware that they are processed before compilation begins.

This means it can become pretty hard to debug, since the code is simply pasted in and the preprocessor doesn't check semantics, type errors, etc.

Try printing ULLONG_MAX and its strlen. Check that neither has a weird value when you use your STRINGIFY.

  • Their values are appropriate. I can show you for like 1 minute in team viewer how it does that if I comment-out this declaration. – Imobilis Mar 18 '15 at 15:37
  • @Malina I tried your code on my PC, running the VS 2012 compiler. It did not allow me to allocate an array using your define with a strlen. On my compiler config it demands a constant, not a variable of any sort. Maybe that is undefined behavior. – gibertoni Mar 18 '15 at 16:02
  • I am on a MinGW compiler. Also, variable-sized arrays are `c99-specific`. – Imobilis Mar 18 '15 at 16:20
  • @Malina VLA are only allowed inside functions. Your question does not show `variable` as being declared inside any function. See http://sscce.org – Pascal Cuoq Mar 18 '15 at 16:39

IIUC, the OP wants the size needed to print the maximum values in decimal. The below works for sizes of 8 bits up to 64 bits, at least for CHAR_BIT == 8:

#include <stdio.h>
#include <limits.h>

#define ASCII_SIZE(s) ((3+(s) * CHAR_BIT) *4 / 13)

int main(void)
{
    printf("unsigned char : %zu\n", ASCII_SIZE(sizeof(unsigned char)) );
    printf("unsigned short int : %zu\n", ASCII_SIZE(sizeof(unsigned short int)) );
    printf("unsigned int : %zu\n", ASCII_SIZE(sizeof(unsigned int)) );
    printf("unsigned long : %zu\n", ASCII_SIZE(sizeof(unsigned long)) );
    printf("unsigned long long : %zu\n", ASCII_SIZE(sizeof(unsigned long long)) );

    printf("          Ding : %u\n", UINT_MAX );
    printf("     Lang Ding : %lu\n", ULONG_MAX );
    printf("Heel lang Ding : %llu\n", ULLONG_MAX );

    return 0;
}

Output:

unsigned char : 3
unsigned short int : 5
unsigned int : 10
unsigned long : 20
unsigned long long : 20
          Ding : 4294967295
     Lang Ding : 18446744073709551615
Heel lang Ding : 18446744073709551615
  • This is useful and related and helps and it is cool. I won't skip approving it as an answer. – Imobilis Mar 18 '15 at 20:20
  • Thank you. Maybe you should change the title of the question to something more descriptive ? – wildplasser Mar 18 '15 at 20:26
  • Ahh.. maybe it is too late. – Imobilis Mar 18 '15 at 21:26
  • Note: `((3+(s) * CHAR_BIT) *4 / 13)` works well enough for size 1,2,4,8 and `CHAR_BIT==8`. The key is to use a ratio above log10(2). Without using 3+ digit integers, found that `((s) * CHAR_BIT * 28 / 93 + 1)` to be optimal. http://stackoverflow.com/q/18708679/2410359 – chux - Reinstate Monica Apr 11 '15 at 16:31

I don't know what causes your crash. To investigate that, I'd first want to look at the preprocessor output (cc -E).

But more importantly, your whole approach is wrong. The length of the preprocessor expansion of ULLONG_MAX is not guaranteed to have any relationship to the length of that number formatted as a string of decimal digits. The first system I checked had this:

#define ULLONG_MAX (LLONG_MAX * 2ULL + 1)

which fully expanded becomes this:

(9223372036854775807LL * 2ULL + 1)

and if you managed to get the strlen of that, it would be quite a bit larger than the length of 18446744073709551615 which is the same number, represented in the way you were probably thinking of.

That means your buffer might be too large. But it also might be too small. Think about the possibility of #define ULLONG_MAX 0xffffffffffffffff.

You need to calculate your buffer size using numeric operations, not string operations. The number of digits in the decimal representation of a positive integer n is floor(log10(n))+1. You can't do that calculation exactly in the preprocessor because it doesn't do floating point. But you can do a reasonable approximation.

The number of digits in the base 2 representation of ULLONG_MAX is the number of bits in the type: sizeof(unsigned long long)*CHAR_BIT. The number of decimal digits is roughly the number of base 2 digits times log(2)/log(10). And now wildplasser has just posted an answer so I'll stop here and point out that log(2)/log(10) is slightly under 4/13 which is why wildplasser's answer works.


You have to expand the argument of STRINGIFY once more to get the value of the constant:

#define STRINGIFY_(x) #x
#define STRINGIFY(x) STRINGIFY_(x)

then here you should use the sizeof operator:

#define ULLONG_MAX_STRING STRINGIFY(ULLONG_MAX)
#define NUMERAL_MAXIMUM (sizeof(ULLONG_MAX_STRING)-1)

which does everything at compile time.

  • Thanks. This was the fix I was thinking of. But the real question was the strange behavior. I guess I'll drop it. – Imobilis Mar 18 '15 at 16:56