I am using a permission model where I have a table user_permissions. This table holds one or more bigint columns, and I use the bits of each number for comparison against permission rules: the bit position identifies a rule, and the bit value indicates whether that rule is active or not.
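As a sketch of what I mean (the constant names here are just hypothetical examples, not my actual rule names), each bit position maps to one rule:

```java
public class PermissionBits {

    // Hypothetical rule positions: each bit index represents one permission rule
    public static final long CAN_CREATE = 1L << 0; // ...0001
    public static final long CAN_READ   = 1L << 1; // ...0010
    public static final long CAN_UPDATE = 1L << 2; // ...0100
    public static final long CAN_DELETE = 1L << 3; // ...1000

    // A rule is active when its bit is set in the stored value
    public static boolean isActive(long stored, long rule) {
        return (stored & rule) != 0;
    }

    public static void main(String[] args) {
        long stored = CAN_READ | CAN_DELETE; // decimal 10, binary 1010
        System.out.println(isActive(stored, CAN_READ));   // true
        System.out.println(isActive(stored, CAN_CREATE)); // false
    }
}
```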
The problem with this approach is that I have a limited number of bits to work with when using a type such as bigint.
What is the best column type I can use in this case that works in a cross-database environment?
The tags represent the technologies I am aiming for, so any other solution related to those technologies is appreciated.
I was thinking of using the @Lob annotation to store larger data; is that the best practice?
UPDATE: The user_permissions table extends the user with a 1:1 relationship and has bigint fields such as bin_create, bin_read, bin_update and bin_delete that hold the binary data as decimal numbers.
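A minimal sketch of what that entity looks like (the class and column names follow the description above; the mapping details and the User type are assumptions, not my actual code):

```java
import javax.persistence.*;

@Entity
@Table(name = "user_permissions")
public class UserPermissions {

    @Id
    private Long id;

    // 1:1 relationship back to the user this row extends (User is assumed)
    @OneToOne
    @JoinColumn(name = "user_id")
    private User user;

    // Each bigint column holds one bitmask of permission rules
    @Column(name = "bin_create")
    private long binCreate;

    @Column(name = "bin_read")
    private long binRead;

    @Column(name = "bin_update")
    private long binUpdate;

    @Column(name = "bin_delete")
    private long binDelete;

    // getters and setters omitted
}
```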
To clarify the question: I am considering comparing the permissions using bitwise operators. Let's assume a user has the permission value 10 (1010) and an action requires 13 (1101). Then 10 & 13 == 8 (1000), so the user has one permission matching the required permissions for the action, and I can allow or deny it (it is up to the application rules to define which).

But with this approach I have a limited number of bits to work with (say I increase the number of permissions to be considered, so the numbers increase too). The max bigint value is 9223372036854775807, which gives me the binary 111111111111111111111111111111111111111111111111111111111111111, i.e. 63 bits and permission possibilities per field. So what is the best column type I can use in this case that works in a cross-database environment, stores a huge quantity of binary blocks, and still lets me work with bitwise operators in Java?
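One direction I am considering (an assumption on my part, not a settled answer) is java.util.BitSet on the Java side, serialized to a byte[] for a binary column (e.g. VARBINARY or a @Lob), since BitSet has no 63-bit ceiling and supports and() for the kind of comparison above:

```java
import java.util.BitSet;

public class BitSetPermissions {

    // Convert a stored byte[] column value back into a BitSet
    public static BitSet fromBytes(byte[] stored) {
        return BitSet.valueOf(stored);
    }

    // Serialize a BitSet for storage in a binary column
    public static byte[] toBytes(BitSet bits) {
        return bits.toByteArray();
    }

    // True if the user has at least one of the required permission bits
    public static boolean hasAnyMatch(BitSet userBits, BitSet requiredBits) {
        BitSet intersection = (BitSet) userBits.clone(); // and() mutates, so work on a copy
        intersection.and(requiredBits);
        return !intersection.isEmpty();
    }

    public static void main(String[] args) {
        BitSet user = BitSet.valueOf(new long[] {10L});     // 1010
        BitSet required = BitSet.valueOf(new long[] {13L}); // 1101
        BitSet match = (BitSet) user.clone();
        match.and(required);
        System.out.println(match.toLongArray()[0]); // 8 -> only bit 3 (1000) matches
    }
}
```

The round trip through toByteArray()/valueOf() is what would let the column grow past 63 bits while keeping the same bitwise logic in Java.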