
Here is the context: I've got a Ruby on Rails application deployed on Heroku, using a Postgres database.

I'd like to import a very large CSV file (69MB; it's the allCountries file from geonames) into my Heroku database.

How should I go about it?

  • I think it's a bad idea to put this large file in my repo and then write a rake task to import it via ActiveRecord, because it will slow down pushes to Heroku. Do you agree? (A sketch of that approach follows this list.)

  • With a psql client, connecting from my local machine to the Heroku Postgres database, as described here: https://gist.github.com/jboesch/5605747 (see the one-liner after this list). Does Heroku impose any restrictions on importing large files like that? No timeout issues or anything like that?
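For reference, here is a minimal sketch of the rake-task approach. It assumes a `geonames` table already exists with columns matching the allCountries layout, and that the file is committed at `db/csv_for_import/allCountries.txt`; streaming it through Postgres's `COPY ... FROM STDIN` on the raw connection avoids creating one ActiveRecord object per row:

```ruby
# lib/tasks/geonames.rake
# Sketch only: the table name (geonames) and the file location are assumptions.
namespace :geonames do
  desc "Bulk-load the geonames allCountries dump with COPY"
  task import: :environment do
    path = Rails.root.join("db", "csv_for_import", "allCountries.txt")
    conn = ActiveRecord::Base.connection.raw_connection # a PG::Connection

    # COPY ... FROM STDIN streams rows over the database connection itself,
    # so it works on Heroku, where the Postgres server cannot read the
    # dyno's filesystem. NULL '' treats the dump's empty fields as NULL.
    conn.copy_data("COPY geonames FROM STDIN WITH (NULL '')") do
      File.foreach(path) { |line| conn.put_copy_data(line) }
    end
  end
end
```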
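And here is roughly what the psql approach from the gist boils down to. `\copy` reads the file on the client side and streams it over the connection, so the server never touches your local filesystem; the table name, file path, and COPY options are assumptions to adjust to your schema. As for timeouts, Heroku's 30-second limit applies to HTTP requests going through its router, not to direct database connections:

```sh
# Fetch the connection URL from the app's config and stream the file.
# "your-app", "geonames", and the file path are placeholders.
psql "$(heroku config:get DATABASE_URL --app your-app)" \
  -c "\copy geonames FROM 'allCountries.txt' WITH (NULL '')"
```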

Do you see a more appropriate way to import my large CSV file?

Thank you in advance for your answers!

Christophe http://www.merciedgar.com

  • 69MB is not very large; using COPY should take just a few seconds, if not within a second... – Frank Heikens Jun 09 '14 at 10:52
  • I think putting it in your repo (e.g. in a new folder `db/csv_for_import`) and then importing it with a rake task or a migration is the right way to go. Speed-wise it will only affect the first push - after that git won't see a change for it and so won't push anything relating to the file. – Max Williams Jun 09 '14 at 11:02
  • Thanks Frank and Max for your answers. OK, so I will choose the first way, by putting my CSV file in my repo! Thanks again – krichtof Jun 10 '14 at 08:02
  • Possible duplicate of [Cassandra .csv import error:batch too large](http://stackoverflow.com/questions/36618142/cassandra-csv-import-errorbatch-too-large) – Paul Sweatte Dec 06 '16 at 16:34

0 Answers