46

Currently, we have a table similar to this:

----------------------
 ID | AField | BField
----------------------

The ID column is AUTO_INCREMENT.

How do I create a CSV that lets the database auto-populate the ID field with the auto-increment number?

We've tried the following CSV contents, but none of them work:

afieldvalue,bfieldvalue (with the column list specified, but it still doesn't work)
0,afieldvalue,bfieldvalue 
NULL,afieldvalue,bfieldvalue
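
For reference, the table being loaded into might look roughly like this (a hypothetical sketch; the column types are assumptions, only the AUTO_INCREMENT primary key matters here):

CREATE TABLE your_table (
  ID INT NOT NULL AUTO_INCREMENT,  -- populated by the server, not by the CSV
  AField VARCHAR(255),
  BField VARCHAR(255),
  PRIMARY KEY (ID)
);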
pavium
DucDigital

7 Answers

71

The best thing to do is to include just the two non-auto-increment columns in the CSV, and then explicitly set the ID column to NULL in the LOAD DATA INFILE statement.

Something like this:

LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
(AField, BField)
SET ID = NULL;
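
With that statement, /tmp/data.csv would contain only the two data columns, e.g. (a sketch; the values are placeholders):

afieldvalue1,bfieldvalue1
afieldvalue2,bfieldvalue2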
Ike Walker
  • Use `(your_table.AField, your_table.BField)` instead if your CSV has the same column names. – Thilo Apr 07 '14 at 17:04
  • @Thilo LOAD DATA INFILE does not read column names from the CSV. That's why you must set the columns to the exact order of the CSV column data. – John C Dec 15 '20 at 15:22
  • Maybe worth noting that if the data has been exported including the `ID` column it can be imported with ... `(@UnusedVariable,AField,BField)` – Paul B. Aug 23 '21 at 12:30
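
As a sketch of the approach from the last comment above, for a CSV that was exported with the ID column included and where the server should assign fresh IDs (the variable name is arbitrary):

LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
(@discard_id, AField, BField);  -- the exported ID is read into a user variable and ignored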
7

you can "omit" ID field in data row by placing delimiter sign in the beginning of a row, like this (for table schema ID,AField,BField):

,afieldvalue,bfieldvalue
...

The SQL engine will assign NULL to the ID field while reading such a CSV file.
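
A minimal statement that would consume such a file might look like this (a sketch; whether the empty leading field is actually accepted for ID depends on the server's SQL mode, as the comment below notes):

LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',';  -- no column list, so the empty first field maps to ID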

Alexey Bakulin
  • What worked for me was to provide "NULL" or "\N" (note the capital N) in the CSV file for the auto increment field. Providing a blank value resulted in "ErrorCode: 1366 Warning: Incorrect integer value: '' for column \`database\`.\`table\`.\`the_id_column\`". 10.5.16-MariaDB on an InnoDB table. – Neek Oct 20 '22 at 02:55
4
LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r'
(AField, BField);
TRiG
2

NULL in a CSV file is read as a string by MySQL. If you want null values in a MySQL column, use \N in place of NULL. This will fix the issue and give the required result.
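
For example, a row in the CSV for the table above could look like this (a sketch; the values are placeholders):

\N,afieldvalue,bfieldvalue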

Shubhanshu Mishra
  • None of the other solutions worked - this worked for my ID column that is an INT – leo_cape May 20 '21 at 19:44
  • FYI using the string `NULL` did work for me today, for an `int` auto increment primary key field, but to my surprise the string `\N` also worked, but it has to be uppercase `N`, `\n` did not work. – Neek Oct 20 '22 at 02:59
  • The `sed` command to replace empty values with `\N`: `sed 's/,,/,\\N,/g; s/,,/,\\N,/g; s/,$/,\\N/g;'` – Borislav Gizdov Jun 06 '23 at 11:38
2

Try using NULL as the value in the CSV file for the ID field.

Kristofer
0

If you have only a small number of columns, you can just name them (assuming the table has only ID and AField):

LOAD DATA LOCAL INFILE 'myfile.txt' INTO TABLE mytable (AField);

Rolf
-1
LINES TERMINATED BY '\r\n' 

That was the key to success for me when using a text file.
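
A sketch of how this clause could fit into the statement from the accepted answer, assuming a Windows-style file with CRLF line endings:

LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(AField, BField);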

Pete - iCalculator
  • love it when you get a down vote without a comment lol. I am sure others will also need to terminate the line as I mentioned in order to get the solution to work, if you are going to downvote, explain why so others can comprehend / learn from the wisdom of your insight. – Pete - iCalculator Oct 26 '18 at 07:52
  • Doesn't seem to explain how to skip the ID field; nor does it explain what it does. What if your lines *aren't* terminated by `\r\n`? – TRiG Jul 12 '19 at 11:39
  • Clearly the question was changed or merged, my answer was in relation to issues with terminating a line to have separate results in the document show line by line, it must have been expanded to include sql info later. – Pete - iCalculator Sep 12 '19 at 11:55
  • I don't see anything in the edit history, but it is from a while ago, and I think not everything was preserved. – TRiG Sep 12 '19 at 13:20