
I'm currently attempting to import a ~200 MB CSV file into a table on a Google Cloud SQL instance. I'm connected through the Cloud SQL Proxy, and things are going at a bit of a crawl...

I initially attempted this with a direct import through MySQL Workbench but it was extremely slow. (After several minutes, only a few tens of records had been inserted.)

Figuring that Workbench might be running an operation/commit for each record (I've no real way of knowing what it's up to), I tried again with the mysql client, as this seems to be the canonical way to run a 'fast' import:

LOAD DATA LOCAL INFILE 'london-listings.csv'
INTO TABLE London.listings
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

It's just as slow, peaking at 4 KB/sec with only a handful of writes per second, according to MySQL Workbench...

[Screenshot: performance stats from MySQL Workbench; suffice to say, it's very slow!]

I'm open to suggestions! (Yes I'm considering just spinning up a local instance of MySQL. Hosting it in CloudSQL was supposed to save time!)
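For anyone hitting the same wall, one commonly suggested tweak is to disable per-row checks and autocommit for the duration of the bulk load. This is a sketch only: whether Cloud SQL honours every one of these session settings is an assumption, and the `OPTIONALLY ENCLOSED BY` clause is a guess about the CSV's quoting.

```sql
-- Sketch: reduce per-row overhead around the bulk load.
-- Assumption: Cloud SQL permits these session settings.
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;

LOAD DATA LOCAL INFILE 'london-listings.csv'
INTO TABLE London.listings
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'  -- assumption: fields may be quoted
LINES TERMINATED BY '\n';

COMMIT;
SET foreign_key_checks = 1;
SET unique_checks = 1;
SET autocommit = 1;
```

Even so, every row still travels over the proxy connection, so this mainly helps with server-side overhead rather than network latency.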

  • Update: it took 32 minutes in the end. I'm still interested in an answer though - as I have plenty more tables to import, and I'd rather not go quite as slowly through all of them! – instantiator Aug 10 '17 at 00:24
  • It would be faster to upload the CSV file to GCS and then use the 'import' feature in Cloud SQL. – Vadim Aug 10 '17 at 00:52
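The GCS route suggested in the comment above can be sketched as follows. The bucket and instance names are placeholders, and it assumes `gsutil` and `gcloud` are installed and authenticated; the server-side import avoids pushing every row through the proxy.

```shell
# Upload the CSV to a Cloud Storage bucket (hypothetical bucket name)
gsutil cp london-listings.csv gs://my-import-bucket/

# Ask Cloud SQL to import it server-side
# (hypothetical instance name; database/table from the question)
gcloud sql import csv my-instance \
  gs://my-import-bucket/london-listings.csv \
  --database=London --table=listings
```

Note that the Cloud SQL instance's service account needs read access to the bucket for the import to succeed.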

0 Answers