7

Since I have to perform a large number of inserts in a short period, I managed to dump all records to a file and then load them with MySQL's LOAD DATA INFILE statement. This was working fine, but now I pack the values a little more tightly into a blob field to reduce the number of inserts. The problem is that I can't find a way to dump the blob field into the file so that LOAD DATA inserts the correct values. I've tried different approaches with no happy ending, and I want to avoid inserting the rows one by one.

Does anyone know how to do this properly?

Pabloks
  • 1,484
  • 1
  • 14
  • 15
  • How about making the blob a CSV-type entity, with quotes around fields and commas between fields? Could that work? – Pete Wilson Mar 17 '11 at 20:36
  • Now that I think about it, how can you possibly make anything faster than load data infile from a normal text file? I believe that, even if you get all the data into a blob, the same number of inserts (more like updates) has to be done at the server. – Pete Wilson Mar 17 '11 at 20:44
  • I've tried that, but it didn't insert the blob field properly. Inserting the rows using *load data infile* is supposed to be more performant than doing it one by one. For the moment I'm using inserts like "INSERT INTO _TABLE_ VALUES (),(),(),.." which is better than inserting each one separately. – Pabloks Mar 17 '11 at 21:59

1 Answer

12

Store the HEX()'d string of your blob data in the file.

For your SQL, use the SET clause. For example:

LOAD DATA INFILE 'path/to/file.txt'
INTO TABLE mytable
(column1, column2, @hexColumn3)   -- read the hex text into a user variable
SET column3 = UNHEX(@hexColumn3); -- decode the hex back into binary for the blob
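
If the data you need to dump already lives in MySQL, the same trick works in reverse when producing the file. This is a minimal sketch assuming the same hypothetical mytable and columns as above; note that SELECT ... INTO OUTFILE writes the file on the server host and fails if the file already exists:

SELECT column1, column2, HEX(column3)  -- hex-encode the blob so it survives as plain text
INTO OUTFILE 'path/to/file.txt'
FROM mytable;

If you generate the file from application code instead, write each blob value as its hex string (e.g. bin2hex in PHP or binascii.hexlify in Python) in that column position; the UNHEX() in the SET clause then restores the original bytes on load.
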
Knak
  • 496
  • 3
  • 14
Anonymous
  • 136
  • 1
  • 2