
I would like to save large arrays of integers (e.g. length = 1000) as bytea in Postgres.

Example:

Given a 32-bit integer array

x = [1]

I could insert the array into Postgres in several ways, but which form is best practice (faster / less space on disk)?

INSERT INTO table (column_bytea) VALUES (E'\\000\\000\\000\\001'::bytea); --escape (octal) format, explicit cast
INSERT INTO table (column_bytea) VALUES ('\000\000\000\001');             --escape (octal) format, implicit cast
INSERT INTO table (column_bytea) VALUES ('\x00000001');                   --hex format
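For context, all three forms should encode the same four bytes. A quick sanity check, assuming standard_conforming_strings is on (the default since PostgreSQL 9.1) and PostgreSQL 9.0+ for the hex format:

SELECT E'\\000\\000\\000\\001'::bytea = '\x00000001'::bytea AS same_value;  -- returns true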

Thank you in advance.

Hangon
  • The best practice would be to store an integer array in an integer array, not as a blob. – a_horse_with_no_name Nov 09 '16 at 22:28
  • For optimal performance you'd use the binary protocol with `libpq` to exchange an array of int4 efficiently. – Craig Ringer Nov 10 '16 at 14:12
  • @a_horse_with_no_name thank you. I used an integer array instead of a blob. But I wonder why the occupied space on disk is greater with an integer array than with a blob. – Hangon Nov 11 '16 at 20:30
  • @CraigRinger thank you for the help. I used an integer array, but I will still take a look at libpq. – Hangon Nov 11 '16 at 20:30
  • It takes more space because it stores length information for each array, and each entry can contain a NULL, not just a valid integer value, so there is extra metadata to distinguish that (see the sketch after these comments). An SQL NULL is logically different from a NULL in any sane language. – coladict Nov 11 '16 at 20:43
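The size difference discussed in the comments above can be measured directly with pg_column_size. A minimal sketch, assuming PostgreSQL 9.0+ and a throwaway temp table t (hypothetical name):

CREATE TEMP TABLE t (as_bytea bytea, as_array int4[]);
INSERT INTO t VALUES ('\x00000001', ARRAY[1]);    -- same four bytes, stored both ways
SELECT pg_column_size(as_bytea) AS bytea_bytes,   -- 4 data bytes + short varlena header
       pg_column_size(as_array) AS array_bytes    -- data plus array header (ndim, element OID, bounds)
FROM t;

The array side is larger because every array value carries that header; the bytea side stores only the raw bytes plus a length header.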

0 Answers