
I'm writing a Python script to move data from a production DB to a dev DB. I'm using vertica-python (its API is very similar to pyodbc) for the DB connection and Airflow for scheduling.

The script is split into two files: one for the DAG and one for the actual migration job. I use a try-except-finally block in all the SQL execution functions in the migration job:

try:
    # autocommit set to False
    # Execute a SQL script
except DatabaseError:
    # Logging information
    # Rollback
finally:
    # autocommit set back to True
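
Concretely, the helper currently looks roughly like this (simplified; the names are illustrative, and I'm assuming the connection exposes a settable autocommit attribute, as pyodbc's does):

from vertica_python import errors

def execute_sql(connection, cursor, sql):
    # Runs one SQL script; today this needs both objects.
    try:
        connection.autocommit = False   # needs the connection
        cursor.execute(sql)             # needs the cursor
        connection.commit()
    except errors.DatabaseError:
        # logging information
        connection.rollback()           # needs the connection
        raise
    finally:
        connection.autocommit = True    # needs the connection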

You can see that setting autocommit and rolling back require access to the connection, while executing a SQL script requires the cursor. The current solution is simply to create both the connection and the cursor in the DAG and pass them to the migration script. But I also read in a Stack Overflow post that I should pass only the cursor:

Python, sharing mysql connection in multiple functions - pass connection or cursor?

My question is: is it possible to pass only the cursor from the DAG to the migration script and still retain the ability to roll back and set autocommit?

Nicholas Humphrey

1 Answer


Yes, you can change the autocommit setting via the Cursor:

>>> import pyodbc
>>> cnxn = pyodbc.connect("DSN=mssqlLocal")
>>> cnxn.autocommit
False
>>> crsr = cnxn.cursor()
>>> crsr.connection.autocommit = True
>>> cnxn.autocommit
True
>>>

pyodbc also provides commit() and rollback() methods on the Cursor object, but be aware that they affect all cursors created by the same connection, i.e., crsr.rollback() is exactly the same as calling cnxn.rollback().
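
Applied to your migration job, a cursor-only helper could look something like this (just a sketch, shown with pyodbc since that's what I tested above; the function name is a placeholder):

import pyodbc

def execute_sql(crsr, sql):
    # Run one SQL script using only the cursor passed in from the DAG.
    try:
        crsr.connection.autocommit = False   # reach the connection through the cursor
        crsr.execute(sql)
        crsr.connection.commit()
    except pyodbc.DatabaseError:
        # logging information
        crsr.rollback()                      # same effect as crsr.connection.rollback()
        raise
    finally:
        crsr.connection.autocommit = True    # restore the setting

vertica-python's cursor should expose the same connection attribute if it follows the usual DB-API pattern, but verify that before relying on it.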

Gord Thompson