I’m building an app in Python following clean architecture principles, with TDD.
Some unit tests require executing raw SQL queries against an in-memory database.
I am trying to switch from an in-memory SQLite database to an in-memory PostgreSQL one, using pytest-postgres.
Problem
- With the in-memory SQLite database, I can both insert and select data.
- With the in-memory PostgreSQL database, I can only SELECT (a raw INSERT fails).
The INSERT works in SQLite…
s_tb_name = "tb_customer"
ls_cols = ["first_name", "last_name", "email"]
# each entry is a pre-formatted row tuple, with double quotes around the values
ls_vals = ['("John", "Doe", "john.doe@mail.net")',
           '("Jane", "Doe", "jane.doe@mail.net")',
           '("Eric", "Dal", "eric.d@home.com")']
s_cols = ', '.join(ls_cols)
s_vals = ', '.join(ls_vals)
# session is the SQLAlchemy session provided by the test fixture
session.execute(f"INSERT INTO {s_tb_name} ({s_cols}) VALUES ({s_vals})")
…but fails in Postgres:
E sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column "John" does not exist
E LINE 1: ..., email) VALUES (("John"....
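If I read the error right, PostgreSQL is parsing the double-quoted values as column identifiers rather than string literals. A variant with single-quoted values (a hypothetical sketch, reusing the same session fixture and variables as above) would presumably be parsed as literals:
ls_vals_sq = ["('John', 'Doe', 'john.doe@mail.net')",
              "('Jane', 'Doe', 'jane.doe@mail.net')",
              "('Eric', 'Dal', 'eric.d@home.com')"]
# single quotes for string literals; double quotes are reserved for identifiers
s_vals_sq = ', '.join(ls_vals_sq)
session.execute(f"INSERT INTO {s_tb_name} ({s_cols}) VALUES {s_vals_sq}")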
From this psycopg documentation page, I understand this is due to psycopg2. It prevents injecting raw dynamic SQL, and it seems I should do something like this instead:
from psycopg2 import sql

tb_sql_id = sql.Identifier(s_tb_name)
cols_sql_id = sql.SQL(', ').join(map(sql.Identifier, ls_cols))
vals_sql_id = sql.SQL(', ').join(map(sql.Literal, ls_vals))
psycopg2_query = sql.SQL("INSERT INTO {} ({}) VALUES ({})").format(
    tb_sql_id, cols_sql_id, vals_sql_id)
but, logically enough, SQLAlchemy refuses to execute the psycopg2_query:
sqlalchemy.exc.ArgumentError: SQL expression object expected, got object of type <class 'psycopg2.sql.SQL'> instead
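One thing I considered (I am not sure it is the intended route) is to bypass the session and hand the composed query to the underlying psycopg2 cursor. A rough sketch, assuming the test fixture's session is bound to a psycopg2 connection:
# Sketch: reach the DBAPI (psycopg2) connection behind the SQLAlchemy session
# and let psycopg2 execute the composed query itself.
raw_conn = session.connection().connection
with raw_conn.cursor() as cur:
    cur.execute(psycopg2_query)
But that feels like working around SQLAlchemy rather than with it (and the pre-formatted tuple strings would still be bound as single literals rather than as one row per tuple).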
Question
Is there a way to execute raw dynamic INSERT queries in PostgreSQL using SQLAlchemy?
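Ideally I'd like something along these lines, where the identifiers stay dynamic but the values are passed as bound parameters. This is an untested sketch using sqlalchemy.text; the per-row dicts are just illustrative:
from sqlalchemy import text

# Sketch: dynamic table/column names via f-string, values handed to the driver
# as bound parameters so quoting is handled per dialect.
s_params = ', '.join(f":{c}" for c in ls_cols)
stmt = text(f"INSERT INTO {s_tb_name} ({s_cols}) VALUES ({s_params})")
rows = [
    {"first_name": "John", "last_name": "Doe", "email": "john.doe@mail.net"},
    {"first_name": "Jane", "last_name": "Doe", "email": "jane.doe@mail.net"},
    {"first_name": "Eric", "last_name": "Dal", "email": "eric.d@home.com"},
]
session.execute(stmt, rows)
Would something like that be the recommended approach, or is there a better way to keep the identifiers dynamic?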