EDIT: After seeing this thread I removed the multi and chunksize parameters, but now I get the error ProgrammingError: (pyodbc.ProgrammingError) ('Unknown object type list during describe', 'HY000'). Is it because my dataframe contains columns whose values are lists of tuples?
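
As a possible workaround I am considering serializing those list/tuple values to strings before calling to_sql, roughly like this ('topic_distribution' is only my guess at which column holds the lists):

import json

# turn list/tuple cells into JSON strings so pyodbc only sees plain str values;
# 'topic_distribution' is just my guess at the offending column
def serialize_cell(value):
    if isinstance(value, (list, tuple)):
        return json.dumps(value)
    return value

df['topic_distribution'] = df['topic_distribution'].apply(serialize_cell)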

I want to append all the rows from one dataframe (about 5000 rows and 12 columns) to a table on an MS SQL Server.

My code looks like this:

import pandas as pd
import pyodbc
from sqlalchemy import create_engine, types

# connect to the SQL Server database (raw string for the backslash in the server name)
connection = pyodbc.connect("Driver={ODBC Driver 17 for SQL Server};"
                            r"Server=DEBXT-MSDB-03\STUDENTS;"
                            "Database=Studentsdb;"
                            "Trusted_Connection=yes;")

# SQLAlchemy engine used by pandas to_sql (server and database names censored)
engine = create_engine('mssql+pyodbc://@' + 'censored' + '/' + 'censored' + '?trusted_connection=yes&driver=ODBC Driver 17 for SQL Server', fast_executemany=True)

dtypes = {
    'Body': types.Text,
    'sender': types.String(length=50),
    'received': types.DateTime,
    'account': types.String(length=20),
    'type': types.String(length=10),
    'email': types.Text,
    'BodySA': types.Text,
    'BOW': types.Text,
    'topic_distribution': types.String(length=200),
    'topic': types.String(length=50),
    'mapped_topic': types.String(length=50)
}

df.to_sql('guided_lda', con=engine, if_exists='replace', index=False, dtype=dtypes, method='multi', chunksize=150)

# the pyodbc connection is separate from the SQLAlchemy engine; to_sql commits
# through the engine, so this just cleans up the extra connection
connection.commit()

connection.close()

I've been waiting for more than 10 minutes and it is still not done. I don't think it should take that long. I added method='multi', a chunksize based on the 2100-parameter limit divided by my 12 columns, and now also fast_executemany=True in the create_engine call. How long is too long?
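
For reference, this is the variant I am planning to try next: as I understand it, method='multi' is what runs into SQL Server's 2100-parameter limit, while fast_executemany batches on the driver side, so combining them may not help.

# sketch of the next attempt: keep fast_executemany=True on the engine
# (already set above) and let the driver do the batching instead of
# method='multi'; no manual chunksize needed then
df.to_sql('guided_lda', con=engine, if_exists='replace', index=False, dtype=dtypes)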
