
I am trying to import the Musicbrainz db into a MySQL database using Perl scripts from Churruka, rather than the "official" ones, because those did not work properly (I asked about that here).

Now I have inserted all the tables except 'recording', whose size is 1.39 GB. I have tried to insert the downloaded file twice. However, the insertion takes far too long and I'm afraid some MySQL configuration is not set correctly. I installed MySQL just for this import, so no other databases are present.

Both times, something happened to my laptop and I had to restart the process. I deleted all the rows from the 'recording' table to start again, but the process is taking an extremely long time. It is currently above 110,000 seconds!

Any idea how this process could be sped up? Thanks in advance.
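For a load this size, a common approach is to relax consistency checks for the duration of the bulk insert and commit once at the end instead of per row. This is only a sketch of standard MySQL session variables, not something taken from the import scripts themselves:

```sql
-- Sketch: session settings that typically speed up a large InnoDB load.
SET autocommit = 0;          -- avoid a commit (and flush) per INSERT
SET unique_checks = 0;       -- defer secondary unique-index checks
SET foreign_key_checks = 0;  -- skip FK validation during the load

-- ... run the bulk INSERTs here ...

COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;
SET autocommit = 1;
```

Re-enabling the checks afterwards matters: the data is only verified once they are back on.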

aloplop85
  • You can enable the slow query log in your my.cnf with `slow-query-log = 1` and `slow-query-log-file = /var/log/mysql/mysql-slow.log` and check where the bottleneck is – Absalón Valdés Aug 01 '15 at 21:11
  • The log was already activated but I cannot see anything regarding the DELETE statement. I think I will restart the service and try to insert the data again... – aloplop85 Aug 02 '15 at 07:12
  • I have used DROP TABLE recording and restarted the INSERT again... – aloplop85 Aug 02 '15 at 07:26
  • 66 hours later, the miracle took place: Sun Aug 2 09:25:21 2015: Loading data into 'recording' (1 of 1)... Done (66h 4m 32s) Complete (66h 4m 32s) – aloplop85 Aug 05 '15 at 04:25
  • Oh boy. It took that long for just 1.35 GB! Is the query inside a transaction, or is the script making some expensive transformation of the data before the insert? – Absalón Valdés Aug 05 '15 at 04:40
  • Unless you have a specific use case for using MySQL, musicbrainz has a docker container which makes set up of a postgres db super easy and convenient - https://github.com/metabrainz/musicbrainz-docker – outlier229 Jan 14 '21 at 09:13
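If the Perl scripts feed the dump to MySQL one INSERT at a time, loading the raw file directly is usually much faster. Assuming the 'recording' dump is a tab-separated file with `\N` for NULLs (the MusicBrainz full-export format), something along these lines could replace the per-row inserts; the file path here is a placeholder:

```sql
-- Sketch: bulk-load a tab-separated dump file straight into the table.
-- '/path/to/mbdump/recording' is a placeholder for the real dump file.
LOAD DATA LOCAL INFILE '/path/to/mbdump/recording'
INTO TABLE recording
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
```

`LOAD DATA` bypasses per-statement parsing overhead and is the standard fast path in MySQL for exactly this kind of import; note that `LOCAL` must be allowed by both client and server.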

0 Answers