I am trying to create a 64-bit id in Django (3.1), somewhat like this stackoverflow answer, and store it in Postgres (13.1).
For reference, here is their proposed Django model:
```python
class MyClass(models.Model):
    id = models.BigIntegerField(default=fields.make_id, primary_key=True)
```
with their `make_id` modified to return a 64-bit integer with the highest bit set:
```python
def make_id():
    return 13528776667777991352
```
If I use this proposal, I get an error when calling `instance.save()`:

```
django.db.utils.DataError: bigint out of range
```
That seems correct to me. That number has the highest bit set:
```
$ python -c "print(f'{13528776667777991352:>64b}')"
1011101110111111110111101011001101101000100000000000001010111000
```
It will be too large for a signed 64-bit int, but not too large for unsigned:
```
$ python -c "print(13528776667777991352 > (2**63-1))"
True
$ python -c "print(13528776667777991352 < 2**64)"
True
```
The Postgres docs say the range of `bigint` is -2**63 to 2**63 - 1. (The numbers in the docs are -9223372036854775808 to +9223372036854775807.)
I am surprised that no one commented on this in that stackoverflow answer. Since the timestamp is shifted into the highest bit position, I would expect anyone using it to hit this right away. I may be missing something obvious.
I've thought of a few options.
1. Write a Django custom field (say "UnsignedBigIntegerField") that slides the number from [0, 2**64) down into [-2**63, 2**63 - 1] by subtracting 2**63 when it gets saved, and adding it back when it gets loaded. A hack, to be sure (see the sketch after this list).
2. Write a Django custom field that wraps some other Postgres type, like a byte string, a bit field, or a fixed-precision numeric.
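For the first option, here is a minimal sketch of what such a field could look like. The class name, the offset constant, and the split between `get_prep_value` and `from_db_value` are my own choices for illustration, not anything from the linked answer; those two methods are Django's standard conversion points between Python values and database values:

```python
from django.db import models

# Offset that maps the unsigned range [0, 2**64) onto the
# signed bigint range [-2**63, 2**63 - 1], preserving order.
_OFFSET = 2**63


class UnsignedBigIntegerField(models.BigIntegerField):
    """Hypothetical field: exposes an unsigned 64-bit int in Python,
    but stores it shifted down into a signed bigint in Postgres."""

    def get_prep_value(self, value):
        # Python -> database: shift down into the signed range.
        value = super().get_prep_value(value)
        if value is None:
            return None
        return value - _OFFSET

    def from_db_value(self, value, expression, connection):
        # Database -> Python: shift back up into the unsigned range.
        if value is None:
            return None
        return value + _OFFSET
```

Because the shift is a constant offset, ordering and equality lookups should still behave as expected. One caveat: `BigIntegerField` attaches signed-range validators, so unsigned Python-level values would be rejected by `full_clean()` unless those validators are overridden as well.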
How can I use a 64-bit id in Django 3.1 using Postgres 13.1 as a backend?
This will be used as an id, so Postgres lookups should be as fast and efficient as possible.
EDIT: My current solution is to leave the highest bit unset. So, I have a 63-bit id. That fits Postgres's constraint, and is fairly simple. It is annoying that I have to leave a bit on the table.
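For what it's worth, the 63-bit variant I settled on looks roughly like this. The bit layout here (41 bits of milliseconds since a custom epoch, 12 bits of shard id, 10 bits of sequence) is just one possible split that sums to 63 bits, not the layout from the linked answer:

```python
import time

# Arbitrary custom epoch (2019-01-01 UTC, in milliseconds).
CUSTOM_EPOCH_MS = 1_546_300_800_000


def make_id(shard_id: int = 0, seq: int = 0) -> int:
    """Build a 63-bit id: 41 bits of time, 12 bits of shard, 10 bits of sequence.

    The top (sign) bit is never set, so the value always fits in a
    signed 64-bit Postgres bigint.
    """
    ms = int(time.time() * 1000) - CUSTOM_EPOCH_MS
    return (
        ((ms & ((1 << 41) - 1)) << 22)
        | ((shard_id & 0xFFF) << 10)
        | (seq & 0x3FF)
    )


assert make_id() < 2**63  # fits BigIntegerField without any tricks
```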