
As an assignment I am importing data (~500 < x < ~2500 columns) into several tables. Some tables import in an acceptable time, but others have completely impractical import times.

Is there any PostgreSQL tuning I could do? Any OpenERP patch? Should I import from the application via the CLI instead of importing from CSV via the web client?
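For reference, the batch approach I am considering would split the CSV rows into fixed-size chunks before handing each chunk to the server, instead of one request per row. This is only a sketch under assumptions: the file path, field names, and chunk size are placeholders, and the actual server call (e.g. over XML-RPC) is left as a comment.

```python
# Sketch of a batched CSV import. Field names, file path and chunk
# size are placeholders; the per-batch server call is only indicated
# by a comment.
import csv


def chunks(rows, size):
    """Yield successive fixed-size chunks from a list of rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


def read_rows(path):
    """Read a CSV file into a list of dicts keyed by the header row."""
    with open(path) as fh:
        return list(csv.DictReader(fh))


# Example with sample data; normally: rows = read_rows('partners.csv')
rows = [{'name': 'Partner %d' % i} for i in range(9)]
for batch in chunks(rows, 4):
    # here each batch would be sent to the server in one call,
    # e.g. via the XML-RPC 'object' endpoint
    print(len(batch))  # -> 4, 4, 1
```

Chunking keeps each request bounded, so if the web importer's cost really grows non-linearly with the size of the imported subset, many small batches should stay closer to linear total time.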

  • The problem is the OpenERP ORM. Each object can apply different treatments/checks in its create() method, so depending on the create() method of the objects involved, the import time can vary considerably. – Quentin THEURET Sep 04 '14 at 06:37
  • As written, there are entities whose import times are absolutely out of acceptable bounds. I am now trying to write a batch importer, because I suspect the web import in this tool may take exponentially longer as the size of the imported subset grows. – 1737973 Sep 04 '14 at 07:54
  • I wrote a script using the approach described in [stackoverflow.com/questions/23520584](http://stackoverflow.com/questions/23520584). But I think it will take more than a day to complete. – 1737973 Sep 04 '14 at 08:46

0 Answers