
C99 (and later standards) requires certain types to be available in the header <stdint.h>. The exact-width types, e.g. int8_t, int16_t, etc., are optional, and the standard explains why that is.

But the uintptr_t and intptr_t types are also optional, and I don't see a reason for them being optional rather than required.
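Because these typedefs are optional, portable code usually tests for them instead of assuming them. A minimal sketch of such a compile-time check, relying on the fact that C99 defines the corresponding limit macros only for the types the implementation actually provides:

```c
#include <limits.h>   /* CHAR_BIT */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* UINTPTR_MAX is defined by <stdint.h> only when uintptr_t itself
       is provided, so its presence can be tested at compile time. */
#if defined(UINTPTR_MAX)
    printf("uintptr_t is available: %zu bits\n",
           sizeof(uintptr_t) * CHAR_BIT);
#else
    puts("uintptr_t is not provided by this implementation");
#endif
    return 0;
}
```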

Bo R
    Hardware may exist that cannot support them (efficiently). So the standard cannot *mandate* them. For common platforms you can pretty much assume they exist - you'd know if they don't on your chosen platform. – Jesper Juhl Nov 19 '18 at 18:10
  • @JesperJuhl: In my opinion, that's a questionable design decision. Even if a `uintptr_t` would not be efficient, having it is a better solution than not having it. There are problems, which can only be solved with this type (in an implementation-defined manner). So a quality implementation should always give this type. – geza Nov 19 '18 at 18:24
  • @geza There's not that much of a point in having a *portable* type that is used in an *implementation-defined* manner, come to think about it… – Arne Vogel Nov 20 '18 at 06:39
  • @ArneVogel: it seems like a contradiction, but it isn't. If `uintptr_t` is always available, then programs which use it can be more portable, as you don't have to worry about whether it is available or not. There are problems where the exact conversion doesn't matter, for example an xor linked list (sketched below). And I think that a quality implementation should do the expected thing: for example, adding 1 to `TYPE *` should do the same as adding `sizeof(TYPE)` to the converted integer (so the undefined behaviour of expr.add can be circumvented). – geza Nov 20 '18 at 07:37
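To illustrate the xor-linked-list case mentioned in the comments, here is a minimal sketch in C. It assumes `uintptr_t` is available; the integer values produced by the conversions never need to be meaningful on their own, they only need to survive the xor round trip.

```c
#include <stdint.h>
#include <stdio.h>

/* A node of an XOR doubly linked list: instead of separate prev/next
   pointers it stores prev XOR next as a single integer. */
struct node {
    int       value;
    uintptr_t link;
};

/* Convert through void *, the conversion uintptr_t is specified for. */
static uintptr_t as_uint(struct node *p)
{
    return (uintptr_t)(void *)p;
}

/* Given the current node and the neighbour we came from,
   recover the neighbour on the other side. */
static struct node *xor_step(struct node *cur, struct node *prev)
{
    return (struct node *)(void *)(cur->link ^ as_uint(prev));
}

int main(void)
{
    struct node a = { 1, 0 }, b = { 2, 0 }, c = { 3, 0 };

    /* a <-> b <-> c; NULL marks both ends of the list. */
    a.link = as_uint(NULL) ^ as_uint(&b);
    b.link = as_uint(&a)   ^ as_uint(&c);
    c.link = as_uint(&b)   ^ as_uint(NULL);

    /* Traverse forward from a: prints 1, 2, 3. */
    for (struct node *prev = NULL, *cur = &a; cur != NULL; ) {
        struct node *next = xor_step(cur, prev);
        printf("%d\n", cur->value);
        prev = cur;
        cur  = next;
    }
    return 0;
}
```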

1 Answer


On some platforms, pointer types are much larger than any integral type. I believe an example of such a platform would be the IBM AS/400, whose virtual instruction set defines all pointers as 128-bit. A more recent example is Elbrus, which uses 128-bit pointers that are hardware descriptors rather than normal addresses.
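To connect this back to the question: if an implementation provides uintptr_t at all, it must honour the round trip shown below (the conversion is otherwise implementation-defined). On a platform whose pointers are 128-bit descriptors but whose widest integer type is narrower, no integer type can carry the full pointer, so the only conforming choice is to omit uintptr_t and intptr_t. A minimal illustration of the required round trip, assuming the type exists:

```c
#include <stdint.h>
#include <assert.h>

int main(void)
{
    /* The guarantee an implementation must honour if it provides
       uintptr_t (C99 7.18.1.4): a valid void * converted to
       uintptr_t and back compares equal to the original pointer. */
    int        object;
    void      *before = &object;
    uintptr_t  bits   = (uintptr_t)before;
    void      *after  = (void *)bits;

    assert(after == before);
    return 0;
}
```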

user7860670
    Are there any examples of such platforms? – Bo R Nov 19 '18 at 18:07
    @BoR Sure, how about x86 real mode, where addresses are 20 bits long but the machine word size is 16 bits? – Govind Parmar Nov 19 '18 at 18:12
    @GovindParmar, just because the machine word size of x86 real mode is only 16 bits does not mean that a C implementation for that platform does not have an integral type that can accommodate all pointer values. Indeed, `long int` and `unsigned long int` are non-optional, and are required to have at least 31 and 32 value bits, respectively. – John Bollinger Nov 19 '18 at 18:16
    A 128-bit pointer??? Why do such a thing? 2^128 is larger than the number of atoms in the observable universe! – Oliv Nov 19 '18 at 18:40
    @Oliv That instruction set was designed to be executed on future computational devices, so the pointer size was selected to "be enough for anyone". Also, there was some other platform with large pointers, keeping some security descriptors in them instead of plain addresses. Note that even plain C++ typically uses 96 / 192-bit member function pointers. – user7860670 Nov 19 '18 at 18:45
    @Oliv: There must be much more. 2^128 is ~10^38. Earth's mass is ~6*10^24 kg. If we assume very heavy atoms, for which a mole weighs 1 kg, then Earth has 6*10^24*6*10^23 = 3.6*10^48 atoms. At least, as this is a very pessimistic estimate. This is way higher than 2^128. :) – geza Nov 19 '18 at 18:52
  • @geza Indeed! I did not check the computation. – Oliv Nov 19 '18 at 18:54
    there are [about 10^80 estimated atoms in the universe](https://www.wolframalpha.com/input/?i=number+of+atoms+in+the+observable+universe) and that is [3*10^41 times more than 2^128](https://www.wolframalpha.com/input/?i=(+number+of+atoms+in+the+observable+universe)+%2F+2%5E128) :) – bolov Nov 19 '18 at 19:50
    This would also be true of an implementation with bounded pointers, or where pointers are some sort of tuple of object id, offset, and possibly other info enabling a memory-safe implementation. – R.. GitHub STOP HELPING ICE Nov 20 '18 at 02:31
  • What is the point of a 128-bit pointer (no pun intended)? A lowly 64-bit pointer could accommodate up to 18 **exa**bytes of RAM. I do not know of any computer which has even close to that much. – user16217248 Oct 01 '22 at 15:29
    @user16217248 It is similar to IPv6 - 128 bits is certainly considerably more than required to store an address of any realistic size. However, the additional bits can be utilized for auxiliary data, such as access flags, cache residency flags, a security token (I think Elbrus does exactly this), etc. – user7860670 Nov 13 '22 at 20:31