5

I actually have a list of 100,000 records which I'd like to insert into the MySQL database.

I have tried inserting them with a foreach loop and a simple INSERT INTO, but it took a long time to insert even 100 rows, roughly 1 second per row.

Is there any method to insert these rows much faster?

Brian Tompsett - 汤莱恩
  • 5,753
  • 72
  • 57
  • 129
Taboo Loco
  • 551
  • 3
  • 9
  • 14
  • Can you do a data import using a tool like phpMyAdmin or the MySQL command line interface or a similar tool (e.g. Toad)? – Marc Audet Mar 18 '13 at 10:36
  • Actually I'd like to add these 100,000 records to an existing table, so importing wouldn't be okay :/ – Taboo Loco Mar 18 '13 at 10:39
  • An import would allow you to do the inserts even with a table containing pre-existing records. Clone your table structure and some records (if a very large table) and experiment a bit. – Marc Audet Mar 18 '13 at 10:41
  • 1
    Possible duplicate of [How to insert 20 million record into MySQL database as fast as possible](https://stackoverflow.com/q/8474926/608639). – jww Mar 22 '19 at 17:26

6 Answers

15

Using one INSERT statement with multiple rows is faster than one INSERT statement per row. This will reduce calls to the database.

Example:

 INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
weelee
  • 166
  • 6
5

Group the rows into batches:

INSERT INTO `table_name`(`column`) VALUES ('value'),('value1'),('value2'), ..., ('valuen');

See the MySQL documentation for INSERT.

Or export the data to CSV (or another text format) and use LOAD DATA; see the MySQL documentation on loading data with the mysql client.
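
A minimal sketch of that approach, assuming the records have been exported to a comma-separated file /tmp/records.csv whose columns match the table (the file path, table name, and column names here are only placeholders):

-- LOCAL reads the file from the client machine; it requires local_infile to be enabled.
LOAD DATA LOCAL INFILE '/tmp/records.csv'
INTO TABLE tbl_name
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(a, b, c);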

mkjasinski
  • 3,115
  • 2
  • 22
  • 21
0

The fastest way to load a large dataset is to use the bulk loading interface. Refer to the MySQL docs on the bulk loader.

divyabharathi
  • 2,187
  • 17
  • 12
0

The fastest way is to use the command-line interface. You can use the mysqldump utility.

0

Group several similar INSERTs into one long INSERT with multiple VALUES lists to insert several rows at a time. The query will be quicker because connecting to the server, sending the query, and parsing it can take 5-7 times as long as the actual data insertion (depending on row size). If that is not possible, use START TRANSACTION and COMMIT if your table is InnoDB; otherwise use LOCK TABLES. This benefits performance because the index buffer is flushed to disk only once, after all the INSERT statements have completed. In that case, unlock your tables every 1,000 rows or so to allow other threads access to the table.

From this article: http://www.ajaxline.com/node/2099
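
A minimal sketch of this approach for an InnoDB table (tbl_name and its columns are placeholders reused from the example in the accepted answer above):

-- With InnoDB, wrapping the batched INSERTs in one transaction means the
-- changes are flushed only once, at COMMIT.
START TRANSACTION;

INSERT INTO tbl_name (a, b, c) VALUES (1, 2, 3), (4, 5, 6), (7, 8, 9);
INSERT INTO tbl_name (a, b, c) VALUES (10, 11, 12), (13, 14, 15), (16, 17, 18);

COMMIT;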

marc_s
  • 732,580
  • 175
  • 1,330
  • 1,459
Curlas
  • 879
  • 5
  • 14
-1

(MySQL only)

Update: you can use the mysql_random_data_load tool to do this (see its GitHub repository).


The following creates a function that returns random names; you can adapt it to your needs:

CREATE FUNCTION `get_name`() RETURNS varchar(5) CHARSET utf8
BEGIN
    DECLARE chars varchar(255) default '里对非肺乔额及爱我动物地位三次幂动物';
    DECLARE str varchar(255) default '';
    DECLARE i int default 0;
    -- build a 3-character name by picking random characters from `chars`
    while i<3 do
        set str=concat(str, substring(chars, FLOOR(1 + RAND()*CHAR_LENGTH(chars)), 1));
        set i=i+1;
    END while;
RETURN str;
END
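
To sanity-check the function, you can run it once by itself:

SELECT get_name();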

Create a procedure that generates a lot of rows:

CREATE PROCEDURE `make_data`(size int)
BEGIN
    declare i int default 0;
    -- insert `size` rows, one random name per iteration
    while i<size do
        insert into table_name(name) values (get_name());
        set i=i+1;
    end while;
END

Then run it: call make_data(1000);

If you use MySQL Workbench, you need to add the function and the procedure manually. If you use the MySQL console, you should add DELIMITER, like below:

DELIMITER $$
CREATE PROCEDURE `make_data`(size int)
BEGIN
    declare i int default 0;
    while i<size do
        insert into table_name(name) values (get_name());
        set i=i+1;
    end while;
END$$

DELIMITER ;
tyrantqiao
  • 319
  • 4
  • 7