
If I remember correctly, on some machines int was 16-bit, and when we moved to 32-bit platforms, int became 32-bit.

Now that Snow Leopard and Lion are 64-bit, can a C or Objective-C program be compiled in Xcode so that int is 64-bit (and "%d" or "%i" will take a 64-bit integer as well)? Or is int kept at 32 bits for compatibility reasons?

(And if we use a 64-bit int, will it be faster than a 32-bit one because 64 bits is the native width?)

Update: I just found that sizeof(NSInteger), printed from a console app built with Xcode on Lion, is 8 (it is typedef'd as long), while on iOS 5.1.1 it is 4 (typedef'd as int). sizeof(int) is 4 on both platforms. So it looks like int moved from 16 bits to 32 bits in the past, but now we want to stop it at 32 bits.
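
A quick way to see this for yourself is a plain C program that prints the sizes of the standard integer types (a minimal sketch; on 64-bit OS X, NSInteger is typedef'd to long, so sizeof(long) tells the same story):

```c
#include <stdio.h>

int main(void) {
    /* On 64-bit OS X (LP64): int is 4 bytes, long and long long are 8. */
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long)      = %zu\n", sizeof(long));      /* NSInteger on 64-bit Apple targets */
    printf("sizeof(long long) = %zu\n", sizeof(long long));
    printf("sizeof(void *)    = %zu\n", sizeof(void *));
    return 0;
}
```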

nonopolarity
    1. you can use `long`; 2. read this for extra information: http://stackoverflow.com/a/199472/188331 – Raptor Jun 07 '12 at 09:00
  • You also might get some good insight [looking at this very related question](http://stackoverflow.com/questions/2985008/how-should-i-declare-a-long-in-objective-c-is-nsinteger-appropriate). – Michael Dautermann Jun 07 '12 at 09:01
  • What you're looking for is the int64_t/uint64_t types; if you want to force a certain number of bits for an integer, you must explicitly use the types which guarantee the size you require. – tbert Jun 07 '12 at 09:10
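
Following tbert's comment, the fixed-width types from `<stdint.h>` guarantee the bit count regardless of what int happens to be, and `<inttypes.h>` provides matching printf macros (a minimal sketch):

```c
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void) {
    int64_t  big  = INT64_C(9000000000);    /* guaranteed 64 bits on any platform */
    uint64_t ubig = UINT64_C(18000000000);

    /* PRId64/PRIu64 expand to the correct conversion specifier per platform. */
    printf("big  = %" PRId64 "\n", big);
    printf("ubig = %" PRIu64 "\n", ubig);
    return 0;
}
```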

3 Answers


Under the LP64 data model, as adopted by Apple, an int will always be 32 bits. Some compilers might let you use the ILP64 data model instead, as explained in the comments here, but you would likely break compatibility with precompiled libraries.
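
If you want to confirm which model your compiler targets, clang and gcc define the `__LP64__` macro on LP64 targets; a small sketch:

```c
#include <stdio.h>

int main(void) {
#ifdef __LP64__
    /* LP64: long and pointers are 64-bit, int stays 32-bit. */
    puts("LP64 target: int is 32-bit, long/pointers are 64-bit");
#else
    puts("Not an LP64 target");
#endif
    return 0;
}
```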

Gil

> can a C or Objective-C program be compiled on Xcode so that int is 64 bit?

I have been unable to find a clang option that makes int 64 bits wide. In fact, the platform headers seem to assume it can't be. You would also be unable to use any platform library functions or methods that take an int as a parameter (that includes things like printf(), whose format string specifies int-sized arguments).
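
To illustrate the printf() point: "%d" is defined to consume exactly an int, so a 64-bit value needs a different conversion specifier (a minimal sketch using the standard specifiers):

```c
#include <stdio.h>

int main(void) {
    long long big = 5000000000LL;   /* does not fit in a 32-bit int */

    /* printf("%d", big) would be undefined behaviour: %d consumes an int. */
    printf("%lld\n", big);          /* correct: %lld consumes a long long */
    printf("%ld\n",  (long)big);    /* %ld consumes a long (64-bit under LP64) */
    return 0;
}
```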

> and if using 64 bit int, will it be faster than 32 bit because 64 bit is native?

I see no reason why 32-bit ints would be slower than 64-bit ints. If anything they might be faster, since you can fit twice as many of them in the cache.
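
To make the cache argument concrete, here is a minimal sketch comparing the memory footprint of the same array in 32-bit and 64-bit elements (the array length is an arbitrary illustration):

```c
#include <stdio.h>
#include <stdint.h>

#define N 1024

int main(void) {
    int32_t narrow[N];
    int64_t wide[N];

    /* Twice the bytes per element means half as many elements per cache line. */
    printf("32-bit array: %zu bytes\n", sizeof narrow);  /* 4096 */
    printf("64-bit array: %zu bytes\n", sizeof wide);    /* 8192 */
    return 0;
}
```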

JeremyP
  • On 64 bit, 32 bit unsigned ints are definitely slower than 64 bit unsigned long. The reason is that for unsigned 32 bit, 0xffffffff + 1 == 0. As a consequence, it is not guaranteed that for a large array a[i + 1] is the array element after a[i] if i is an unsigned 32 bit integer, and that destroys many optimisations. – gnasher729 May 07 '14 at 14:58
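
gnasher729's point about wraparound can be demonstrated directly (a minimal sketch; the compiler must assume i + 1 can wrap to 0 for an unsigned 32-bit index):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t i = 0xFFFFFFFFu;

    /* Unsigned arithmetic wraps by definition, so i + 1 is 0, not 2^32. */
    printf("0xFFFFFFFF + 1 = %u\n", i + 1);   /* prints 0 */

    /* With a 64-bit index the same increment cannot wrap within the
       addressable range, so the optimiser may assume a[i + 1] follows a[i]. */
    uint64_t j = 0xFFFFFFFFu;
    printf("as 64-bit:       %llu\n", (unsigned long long)(j + 1));  /* 4294967296 */
    return 0;
}
```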

To get a compiler where int is 64 bits, you'd have to download the Clang source code and modify it by hand; with the existing gcc or Clang compilers, there's no way.

But there is a good reason for int being 32 bits: for compatibility reasons, lots of code needs the ability to have 8-bit, 16-bit, 32-bit, and 64-bit items. The integer types available in C are char, short, int, long, and long long, and you wouldn't want long or long long to be smaller than int. If int were 64 bits, you'd only have two types (char and short) for three sizes (8, 16, and 32 bits), so you'd have to give up one of them.
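
Under LP64 the five types cover all four required sizes (with long and long long sharing 64 bits); a sketch that checks this at compile time with C11's _Static_assert, which clang supports:

```c
/* Each required width is covered by one of C's standard integer types (LP64). */
_Static_assert(sizeof(char)      == 1, "8-bit");
_Static_assert(sizeof(short)     == 2, "16-bit");
_Static_assert(sizeof(int)       == 4, "32-bit");
_Static_assert(sizeof(long)      == 8, "64-bit under LP64");
_Static_assert(sizeof(long long) == 8, "64-bit");

int main(void) { return 0; }
```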

gnasher729