
I have a massive (3 GB) MySQL 4.1.22 SQL dump file that I was handed and asked to use to reinstate the database on our new production server (running MySQL 5); the server is Ubuntu 12.

I used a script to split the file into smaller SQL files, one per table insert. This has worked well for me, since two of the inserts were crashing due to the use of quotation marks.
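For reference, the split boils down to something like this (a rough sketch rather than the exact script; the file name and the assumption that the dump marks each table with a "-- Table structure for table" comment are only for illustration):

# split the dump at each table header comment (GNU csplit on Ubuntu)
# huge_dump.sql is a placeholder name
csplit -z -f table_ -b '%03d.sql' huge_dump.sql '/^-- Table structure for table/' '{*}'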

Now my problem is that I've come to my largest table-insert SQL file, which is 450 MB. I run

mysql --verbose -u ***** -p ******  < ****.sql

It runs through the huge file (which takes about 20 minutes) and then I got error 2006, which apparently is due to max_allowed_packet, so I went to my my.cnf and changed it from the default 16M to 1024M. Now the whole thing runs, but when it finishes it waits and then just says "Killed", and that's it.
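For completeness, the change I made looks roughly like this (the /etc/mysql/my.cnf path and the [mysqld] section name are what I assume is standard on Ubuntu):

[mysqld]
max_allowed_packet = 1024M    # was the default 16M

# restart so the server picks up the new value, then confirm it
sudo service mysql restart
mysql -u ***** -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"

# the mysql client has its own copy of this limit, which can be raised per run
mysql --max_allowed_packet=1024M --verbose -u ***** -p ****** < ****.sql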

Is this a timeout or something? If someone can tell me what limit I have to increase, and where it is, that would be a huge help. Thank you.

David
  • http://stackoverflow.com/questions/13599260/split-huge-mysql-insert-into-multiple-files-suggestions/13599560#13599560 This is how I went about solving the problem. – David Dec 03 '12 at 12:28
  • So your solution was just to split the import into several files? – VH-NZZ May 26 '14 at 09:41

0 Answers