I'm trying to do a bulk update using Core SQLAlchemy against a Postgres database. bulk_update_mappings does not work (it reports a StaleDataError), so I'm trying to use Core functions to do the bulk update instead. This works fine when the update data passed to the statement has values for all the columns in the table, but fails when only certain columns are being updated. In my application, during the periodic syncs between the server and the client, only a few of the columns get updated most of the time.
The code snippet I have for the update is:
from sqlalchemy import and_, bindparam

conn = session.connection()
table = table_dict[table_key].__table__
stmt = table.update().where(and_(table.c.user_id == bindparam('uid'),
                                 tbl_pk[table_key] == bindparam('pkey'))).values()
conn.execute(stmt, update_list)
Since I update multiple tables on every sync, the table names and primary keys are looked up through dictionaries. In the example below, table_dict[table_key] resolves to the table 'nwork' and tbl_pk[table_key] resolves to table.c.nwid, i.e. 'nwork.nwid'.
The update_list is a list of the records that need to be updated, each one a Python dictionary. When every record has values for all the columns it works fine, but when only some of the columns are being updated it throws the following error:
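To make the shape of update_list concrete, here is a hypothetical sample (column names match the UPDATE statement in the error below; the values are made up). The second record omits 'last_sync', which is the situation that triggers the error:

```python
# Hypothetical sample of update_list: each dict supplies the bind
# parameters for one row. The values here are illustrative only.
update_list = [
    {"uid": 1, "pkey": 10, "type": 0, "name": "home",
     "last_sync": 1469000000, "update_time": 1469000001, "status": 1,
     "total_contacts": 5, "import_type": 0, "cur_index": 0},
    # This record omits 'last_sync' (among others):
    {"uid": 1, "pkey": 11, "name": "work"},
]

# With an empty .values(), SQLAlchemy derives the SET clause from the
# first dict's keys, so every subsequent dict must supply the same
# bind parameters -- this one is missing several of them.
missing = set(update_list[0]) - set(update_list[1])
print(sorted(missing))
```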
StatementError: (sqlalchemy.exc.InvalidRequestError) A value is required for bind parameter 'last_sync', in parameter group 1
[SQL: u'UPDATE nwork SET type=%(type)s, name=%(name)s, last_sync=%(last_sync)s,
update_time=%(update_time)s, status=%(status)s, total_contacts=%(total_contacts)s,
import_type=%(import_type)s, cur_index=%(cur_index)s WHERE
nwork.user_id = %(uid)s AND nwork.nwid = %(pkey)s']
In this case the error occurred for a record whose last_sync column was not being updated.
What's the right way to do a bulk update where the records may not all have the same set of columns being updated? I could not find much information in SQLAlchemy's Update.values() documentation.
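One direction I've considered (a sketch only, not verified against the database): partition update_list by the set of keys each record carries, so that each partition is a homogeneous batch that could be passed to its own conn.execute(stmt, records) call. The grouping itself is plain Python:

```python
# Partition records by the set of columns they update, so each group
# forms a homogeneous executemany batch. Sample data is illustrative.
update_list = [
    {"uid": 1, "pkey": 10, "name": "home", "last_sync": 100},
    {"uid": 1, "pkey": 11, "name": "work"},                  # no last_sync
    {"uid": 2, "pkey": 12, "name": "gym", "last_sync": 101},
]

groups = {}
for rec in update_list:
    groups.setdefault(frozenset(rec), []).append(rec)

for cols, records in groups.items():
    # Every list in `records` now shares an identical key set, so a
    # statement compiled for that key set would not hit the error above.
    print(sorted(cols), len(records))
```

The downside is one round trip per distinct column set, which may or may not be acceptable for large syncs.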
I'm running SQLAlchemy 1.0.14.