
I'm currently importing a large database: 8 GB, and the largest table has 40 million rows.

The import has been running for hours.

I'm importing with the following command: mysql -u root -p db < db.sql

When I run SHOW PROCESSLIST; there is only one query.

Is there any way to make this more efficient and run more inserts at the same time?

(I'm running on a server with 16 GB of RAM, and I have no idea what other details I should provide.)
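One commonly used way to speed up a single-threaded dump import like this is to disable per-row checks and autocommit for the duration of the load. A minimal sketch, assuming the tables are InnoDB and the dump file is db.sql (both assumptions, not stated in the question):

    # Wrap the dump in one transaction and skip per-row checks
    # (db.sql is the question's dump file; tables assumed InnoDB).
    ( echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
      cat db.sql
      echo "COMMIT; SET unique_checks=1; SET foreign_key_checks=1;"
    ) | mysql -u root -p db

Turning off unique_checks and foreign_key_checks skips per-row validation, and disabling autocommit batches the whole load into one transaction instead of one commit per INSERT statement.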

bl0b
  • How are you doing the import? – octern Jul 29 '12 at 06:55
  • 1
    If your using mysql dump, memory is usually the biggest bottleneck. But it depends if your doing it locally, or through some sort of program like phpMyAdmin. – Raiden Jul 29 '12 at 06:57
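Following up on the memory comment above: on a 16 GB server, the InnoDB buffer pool is usually the setting that matters most for a bulk load. A hedged my.cnf sketch; the sizes below are illustrative assumptions for a dedicated database server, not values from this thread:

    [mysqld]
    # Let InnoDB cache most of the working set; 50-70% of RAM
    # is a common rule of thumb on a dedicated server.
    innodb_buffer_pool_size = 10G
    # Bigger redo logs mean fewer checkpoint flushes during the load.
    # (On older MySQL versions, changing this needs a clean shutdown
    # and removal of the old ib_logfile* files first.)
    innodb_log_file_size = 1G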

1 Answer


You could try SQLyog. It imports data very efficiently, and size is not a concern.

Zoltan Toth
Ankit Vishwakarma