20

I have some bulk data in a text file that I need to import into a MySQL table. The table consists of two fields ..

  1. ID (integer with auto-increment)
  2. Name (varchar)

The text file is a large collection of names with one name per line ...

(example)

John Doe
Alex Smith
Bob Denver

I know how to import a text file via phpMyAdmin; however, as far as I understand, the data I import needs to have the same number of fields as the target table. Is there a way to import the data from my text file into one field and have the ID field auto-increment automatically?

Thank you in advance for any help.

webworm

8 Answers

22

Here is the simplest method to date:

  1. Make sure your file does NOT have a header line with the column names. If it does, remove it.
  2. In phpMyAdmin, as usual, go to the Import tab for your table and select your file. Select CSV as the format. Then -- and this is the important part -- under Format-specific options, fill in Column names with the name of the column the data is for, in your case "Name".

This will import the names and auto-increment the id column. You're done!

Tested fine with phpMyAdmin 4.2.7.1.
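
For anyone who prefers the SQL tab or the mysql command line, a roughly equivalent statement would look like the sketch below. The table name your_table and the file path are placeholders I made up; it also assumes one name per line with Unix line endings (use '\r\n' for a Windows-created file), and LOCAL assumes the file sits on the client machine and that local loading is enabled.

LOAD DATA LOCAL INFILE '/path/to/names.txt'
    INTO TABLE your_table
    LINES TERMINATED BY '\n'
    (Name);   -- only Name is listed, so the auto-increment ID fills itself in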

Fabien Snauwaert
19

Another method I use that does not require reordering a table's fields (assuming the auto-increment field is the first column) is as follows:

1) Open/import the text file in Excel (or a similar program).

2) Insert a column before the first column. 

3) Put a zero or some other placeholder value in the first cell of this new column.

4) Close the file (keeping it in its original text/tab/csv/etc. format).

5) Open the file in a text editor.

6) Delete the placeholder value you entered into the first cell.

7) Close and save the file.

Now you will have a file in which each row of your original data is preceded by an empty field, which will be filled with the next auto-increment value when imported via phpMyAdmin.
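
For example, with the names from the question and a comma-separated file, the edited file would end up looking something like this (each line starting with an empty field where the auto-increment id will go):

,John Doe
,Alex Smith
,Bob Denver

One caveat: a server running in strict SQL mode may reject the empty value for an integer column (see the comments below); in that case one of the other approaches here may work better.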

Sathish D
Queue
  • I recommend this suggestion if you want to create an extra column for the ID –  Oct 29 '12 at 21:51
  • This is the best answer! Quick and easy! – bukowski Jan 28 '16 at 08:47
  • If you are working with smaller data sets that you can open in Excel, this is the way to go; simply adding an extra column for the primary key works. As a note, I always have a header row when I import data and use 'IGNORE 1 LINES' when importing. Behavior might be different if you don't have a header row – GameChanger May 20 '17 at 00:47
  • How do you open a CSV file in Excel when it has more than 10 million rows? – Rijo Jun 28 '17 at 11:34
  • @Rijo See if you can use Fabien's answer instead. – Queue Aug 17 '17 at 20:07
  • 1
    Does not work: " #1366 - Incorrect integer value: '' for column 'id' at row 1". – bart Sep 04 '18 at 23:23
  • @bart See Fabien's improved answer below. – Queue Sep 13 '18 at 23:09
8

That assumption is not quite correct: when importing with LOAD DATA INFILE, just create the auto-increment column as the LAST column/field in the table. As the file is parsed, if your table is defined with 30 columns but the text file only has 1 (or anything less), the leading columns are filled first, in direct sequence, so make sure your field delimiter is correct between fields (for any future imports). Again, put the auto-increment column AFTER the columns you know are being imported.

CREATE TABLE YourMySQLTable
(  FullName      VARCHAR(30) NOT NULL,
   SomeOtherFlds VARCHAR(20) NOT NULL,
   IDKey         INT NOT NULL AUTO_INCREMENT,
   PRIMARY KEY (IDKey)
);

Notice that IDKey, the auto-increment column, is the last field of the table... regardless of your input text file, which may have fewer columns than your final table will actually hold.

Then, import the data via...

LOAD DATA
    INFILE 'C:\\SomePath\\WhereTextFileIs\\ActualFile.txt'
    INTO TABLE YourMySQLTable
    COLUMNS TERMINATED BY ','
    ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n';

The above example is based on a comma-separated list with double quotes around each field, such as "myfield1","anotherField","LastField" (hence the ENCLOSED BY '"' clause). The LINES TERMINATED BY clause matches the CR/LF pair that typical Windows text files use to end each row.

With your sample text file, which has the full name as its single column, all the data would be loaded into the FullName column of "YourMySQLTable". Since IDKey is at the END of the column list, it will still be assigned auto-increment values starting from 1 and will not conflict with the columns coming from the inbound text.

DRapp
  • 6
    I am afraid I did not quite understand your answer. Could you break it down a little further? I am not sure what you meant by creating the auto-increment column as the last field. – webworm Mar 17 '10 at 16:17
  • Is there no other way to skip the primary key? – Rijo Jun 28 '17 at 11:36
  • @Rijo, correct, all tables should have a primary key. In this case, since it is really just a temporary table for doing bulk inserts, having the LAST COLUMN as the primary key just allows that qualification. Then records can be processed, cleansed and imported to production table and then this table could be cleared for the next bulk load. Who cares which physical location the primary key ID column is placed for temp work anyhow. – DRapp Jun 28 '17 at 13:36
2

I just tried this:

  1. In phpMyAdmin, create (or adjust) your table so it has the same number of fields as your CSV.
  2. Perform the import of the CSV data into your table.
  3. Go to the [Structure] tab and add a new field [At beginning of table] (I assume you want the id field there).
  4. Fill in the [Name] attribute as "id".
  5. Set [Length] to "5".
  6. Set [Index] to "Primary".
  7. Tick A_I (Auto Increment).
  8. Hit the [Go] button.
  9. The table should now have an id field at the front of all your data, with auto-increment values filled in.

At least this way you don't have to worry about matching fields, etc.
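
If you would rather do steps 3-8 in a single statement from the SQL tab, this is a rough equivalent (just a sketch; my_table stands in for your actual table name):

ALTER TABLE my_table
    ADD COLUMN id INT(5) NOT NULL AUTO_INCREMENT FIRST,
    ADD PRIMARY KEY (id);   -- existing rows get ids 1, 2, 3, ...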

Francisco
Matty Yeow
2

I just used a TAB as the first field in my text file (leaving the ID field empty), then imported it as usual. I got a warning about the ID field, but it incremented as expected...

1

I've solved that problem by simply adding the column names under Format-specific options, leaving out the ID column, because the ID column is auto-increment. In my case it worked fine without changing anything in the CSV file. My CSV file contains only data, no column headers.

0

If the table columns do not match, I usually add "bogus" fields with empty data where the real data would have been. So, if my table needs "id", "name", "surname", "address", "email" and I only have "id", "name", "surname", I change my CSV file to have "id", "name", "surname", "address", "email" but leave the fields I have no data for blank.

This results in a CSV file looking like this:

1,John,Doe,,
2,Jane,Doe,,

I find it simpler than the other methods.

Kobus Myburgh
0

Just wanted to add to Fabien's answer above the command I use when programmatically importing a CSV file into a table whose primary key is auto-increment. I merely specify which columns I'm inserting values into and include only those values in the CSV file. That way I don't have to monkey around with blank fields.

LOAD DATA LOCAL INFILE 'files/thefile.csv'
    INTO TABLE importedStuff
    FIELDS TERMINATED BY ','
    ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    (dateCreated, field1, field2, field5, field10, field12);

Jimbo