
I have a huge MySQL dump I need to import. I managed to split the 3 GB file by table INSERT, but one of the table inserts is 600 MB, and I want to split it into 100 MB files. So my question is: is there a script or an easy way to split a 600 MB INSERT statement into multiple 100 MB inserts without having to open the file (as this kills my PC)?

I tried SQLDumpSplitter, but it does not help.

Here is the reason I cannot just run the 600 MB file:

MySQL import response 'killed'

Please help
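For reference, a minimal sketch of such a script in Python. It streams out smaller INSERT statements from one extended INSERT, assuming the value tuples are separated by the sequence "),(" and that this sequence never appears inside a string literal (mysqldump's default quoting usually satisfies this, but verify against your data; a robust tool would need real SQL parsing). All names here are placeholders:

```python
def split_insert(statement, chunk_bytes=100 * 1024 * 1024):
    """Yield smaller INSERT statements of roughly chunk_bytes each."""
    head, sep, body = statement.partition(" VALUES ")
    if not sep:                      # not an extended INSERT; pass through
        yield statement
        return
    body = body.rstrip().rstrip(";")
    # "(1,'a'),(2,'b')" -> ["1,'a'", "2,'b'"]
    rows = body[1:-1].split("),(")
    base = len(head) + len(" VALUES ") + 1   # fixed overhead per statement
    chunk, size = [], base
    for row in rows:
        piece = "(" + row + ")"
        if chunk and size + len(piece) > chunk_bytes:
            yield head + " VALUES " + ",".join(chunk) + ";"
            chunk, size = [], base
        chunk.append(piece)
        size += len(piece) + 1       # +1 for the joining comma
    if chunk:
        yield head + " VALUES " + ",".join(chunk) + ";"
```

Each yielded statement can be written to its own file, so the original 600 MB file never has to be opened in an editor.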

  • Check this solution for Windows/linux: http://stackoverflow.com/questions/132902/how-do-i-split-the-output-from-mysqldump-into-smaller-files/30988416#30988416 – Alisa Jun 22 '15 at 22:03

2 Answers


On Linux, the easiest way to split a file is split -l N, which splits it into pieces of N lines each.
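For example (file names here are hypothetical; bigtable.sql stands in for the 600 MB table dump):

```shell
# split -l cuts on line boundaries, so this is only safe if every INSERT
# statement sits on its own line (mysqldump's default extended-insert
# output is one statement per line). Tune the line count to land near 100 MB.
split -l 10000 bigtable.sql bigtable_part_

# GNU split can also cut by size while still respecting line boundaries:
split -C 100m bigtable.sql bigtable_part_
```

The resulting bigtable_part_aa, bigtable_part_ab, ... files can then be imported one at a time.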

On Windows, I've had pretty good luck with HxD; it works well with huge files.

mvp

You can easily open a 1 GB file in TextPad. Use this software to open the file and split your queries as you want.

Download link for the software: TextPad

Saharsh Shah