How do I insert 10-50k+ rows (from e.g. a Series or DataFrame) into a Teradata table efficiently with pyodbc?

Background:

  • I am building a GUI app in Python
  • No Teradata libraries can be used (I am unable to package the software with those libs)
asked by dady7749
  • *"No Teradata libraries can be used"* - Does that include "no Teradata ODBC driver"? – Gord Thompson Aug 27 '20 at 13:44
  • Yes, Teradata ODBC driver can be used. – dady7749 Aug 27 '20 at 13:48
  • Okay, and does your app already have a dependency on pandas? – Gord Thompson Aug 27 '20 at 14:04
  • Yes it has panda dependencies – dady7749 Aug 27 '20 at 14:17
  • Okay, then it's not that big of a leap to add in [SQLAlchemy](https://pypi.org/project/SQLAlchemy/) and [teradatasqlalchemy](https://pypi.org/project/teradatasqlalchemy/) and then you can presumably use pandas' `to_sql` method to upload the data. – Gord Thompson Aug 27 '20 at 14:38
  • I have tried exactly that; it runs successfully before packaging but fails after packaging. Basically, `to_sql` depends on SQLAlchemy, which in turn depends on teradatasqlalchemy, and that is what causes the failure after packaging. – dady7749 Aug 27 '20 at 14:40
  • So you'd rather create an app that requires separate install of a native ODBC driver? Parameterized INSERT with `executemany` is a bit faster than individual INSERT statements, though still not "fast". – Fred Aug 27 '20 at 19:01
  • Fred, I haven't tried using a different ODBC driver. I use MS SQL for another DB connection in the app. If that driver would help resolve the issue, that would be great! – dady7749 Aug 28 '20 at 08:59
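
Fred's suggestion (a parameterized `INSERT` with `executemany`) can be sketched with plain pyodbc, avoiding the SQLAlchemy/teradatasqlalchemy dependency entirely. This is a minimal sketch, not a tested Teradata solution; the DSN, table, and column names are hypothetical, and `df_to_rows` is a helper introduced here to turn a DataFrame into the row tuples `executemany` expects:

```python
import pandas as pd


def df_to_rows(df):
    """Convert a DataFrame into a list of plain tuples for executemany.

    NaN/None values are mapped to None so the ODBC driver sends SQL NULL.
    """
    return [
        tuple(None if pd.isna(v) else v for v in row)
        for row in df.itertuples(index=False, name=None)
    ]


def insert_df(conn, table, df, batch_size=10000):
    """Insert a DataFrame via a parameterized INSERT and executemany.

    Rows are sent in batches so very large frames don't build one
    enormous parameter array in memory.
    """
    cols = ", ".join(df.columns)
    placeholders = ", ".join("?" * len(df.columns))
    sql = f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"
    rows = df_to_rows(df)
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany(sql, rows[i : i + batch_size])
    conn.commit()


if __name__ == "__main__":
    import pyodbc

    # Hypothetical DSN and table names - adjust to your environment.
    conn = pyodbc.connect("DSN=my_teradata_dsn", autocommit=False)
    df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", None]})
    insert_df(conn, "my_db.my_table", df)
```

Note that pyodbc's `cursor.fast_executemany = True` flag is known to help mainly with the Microsoft SQL Server ODBC driver; whether the Teradata ODBC driver honors it is driver-dependent, so benchmark before relying on it.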

0 Answers