
I struggled for quite a while trying to save a table containing JSON columns into MySQL, using SQLAlchemy and pandas' to_sql.

I got this error

sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) Failed processing pyformat-parameters; Python 'list' cannot be converted to a MySQL type

when trying to do it the standard way below. I have a pandas dataframe (df) with two columns containing JSON ('videos' and 'newsitems').

import sqlalchemy
connection = sqlalchemy.create_engine(f"mysql+mysqlconnector://{user}:{pw}@{host}/{db}")
table_name = 'TABLENAME'
df.to_sql(table_name, 
          connection, 
          if_exists='append', 
          index=False)
MattiH

1 Answer


I found an answer elsewhere about PostgreSQL, and it works just as well with MySQL. Adding the dtype={...} argument in the last two lines fixed the issue:

import sqlalchemy
connection = sqlalchemy.create_engine(f"mysql+mysqlconnector://{user}:{pw}@{host}/{db}")
table_name = 'TABLENAME'
df.to_sql(table_name, 
          connection, 
          if_exists='append', 
          index=False,
          dtype={'videos': sqlalchemy.types.JSON,
                 'newsitems': sqlalchemy.types.JSON})
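For anyone who wants to try this end to end without a MySQL server, here is a minimal self-contained sketch. It uses an in-memory SQLite engine in place of the MySQL one (SQLAlchemy's generic JSON type works across backends, so the dtype mapping is identical), and a made-up dataframe where 'videos' and 'newsitems' hold Python lists/dicts rather than pre-serialized JSON strings:

```python
import pandas as pd
import sqlalchemy

# Hypothetical dataframe: the JSON columns contain Python objects
# (lists of dicts), not strings.
df = pd.DataFrame({
    "id": [1, 2],
    "videos": [[{"url": "a"}], [{"url": "b"}]],
    "newsitems": [[{"title": "x"}], []],
})

# In-memory SQLite stands in for the MySQL engine here;
# with MySQL you would use the mysql+mysqlconnector URL instead.
engine = sqlalchemy.create_engine("sqlite://")

# The dtype mapping tells SQLAlchemy to serialize these columns as JSON,
# which is what avoids the "Python 'list' cannot be converted" error.
df.to_sql("TABLENAME",
          engine,
          if_exists="append",
          index=False,
          dtype={"videos": sqlalchemy.types.JSON,
                 "newsitems": sqlalchemy.types.JSON})

# Read the rows back to confirm the write succeeded.
out = pd.read_sql("SELECT * FROM TABLENAME", engine)
```

Note that the column values must be JSON-serializable Python objects; if your dataframe already contains JSON strings, the JSON type will serialize them again, double-encoding them.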

The original Q&A that helped me solve the issue was this:

Writing JSON column to Postgres using Pandas .to_sql
