
I am trying to load a simple pandas DataFrame into a PostgreSQL table (pandas 1.4, Postgres 13):

import pandas as pd

df = pd.DataFrame([{"index": 1, "properties": {"a": 1, "b": 2}}])
df.to_sql("_test_table", con, if_exists="replace")  # con is an open connection to the database

However, I get ProgrammingError: can't adapt type 'dict'.

I have seen other Stack Overflow answers suggesting to apply json.dumps, and they are right: it fixes the issue. However, I would like to know if there is a way to leverage the native PostgreSQL JSON types instead of converting the data to a string.
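
For reference, the workaround from those answers looks roughly like this (con being the same connection as above):

import json

import pandas as pd

df = pd.DataFrame([{"index": 1, "properties": {"a": 1, "b": 2}}])
# Serializing the dicts to JSON strings makes psycopg2 happy,
# but the column ends up being plain text in PostgreSQL
df["properties"] = df["properties"].apply(json.dumps)
df.to_sql("_test_table", con, if_exists="replace")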

The psycopg2 documentation mentions "JSON Adaptation", but running psycopg2.extensions.register_adapter(dict, psycopg2.extras.Json) at the beginning again ends up storing the data as text.
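
For completeness, this is the registration I tried:

import psycopg2.extensions
import psycopg2.extras

# Adapt Python dicts with the Json wrapper before calling .to_sql;
# the insert then succeeds, but the column is still created as text
psycopg2.extensions.register_adapter(dict, psycopg2.extras.Json)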

There are a couple of mentions of JSON on the psycopg2 FAQ, but I don't think they answer my question.

astrojuanlu

1 Answer


This is essentially a duplicate of Writing JSON column to Postgres using Pandas .to_sql: the solution is to use the dtype parameter of .to_sql with sqlalchemy.types.JSON:

import pandas as pd
import sqlalchemy

df = pd.DataFrame([{"index": 1, "properties": {"a": 1, "b": 2}}])
df.to_sql(
    "_test_table",
    con,  # same connection as in the question
    if_exists="replace",
    dtype={"properties": sqlalchemy.types.JSON},  # create properties as a JSON column
)

And now everything works.
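
To double-check, one can inspect the resulting table with SQLAlchemy (a sketch, the connection URL being a placeholder):

import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://user:password@localhost/dbname")  # placeholder URL
inspector = sqlalchemy.inspect(engine)
for column in inspector.get_columns("_test_table"):
    print(column["name"], column["type"])
# "properties" should now be reported as JSON instead of TEXT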

astrojuanlu