I'm trying to insert some data without duplicates, possibly from 2+ concurrent processes.
Unfortunately I cannot use a unique constraint in this case because of the database design: deleted rows are marked with deleted=1, and those can exist as duplicates.
It seems that a simple transaction isn't enough. The best I could come up with is SELECT ... FOR UPDATE, but that doesn't solve it: if no matching rows exist yet, none are locked, so it doesn't prevent concurrent inserts. On the other hand, I'd like to avoid locking the whole table for writing.
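Concretely, the flow I tried looks roughly like this (a sketch using SQLAlchemy Core; entries is a placeholder table name matching the schema in the edit below):

    # Sketch of the racy flow; "entries" is a placeholder table name.
    from sqlalchemy import create_engine, text

    engine = create_engine("mysql://user:pw@localhost/db")  # placeholder URL

    def insert_unless_exists(address):
        with engine.begin() as conn:  # opens a transaction, commits on exit
            existing = conn.execute(
                text("SELECT id FROM entries "
                     "WHERE address = :addr AND deleted = 0 FOR UPDATE"),
                {"addr": address},
            ).first()
            if existing is None:
                # Race window: nothing matched, so nothing was locked, and a
                # concurrent transaction can pass the same check and insert too.
                conn.execute(
                    text("INSERT INTO entries (address, deleted) "
                         "VALUES (:addr, 0)"),
                    {"addr": address},
                )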
Is there a nice way to work around this problem? The table engine is InnoDB. (A secondary question is how to make it work in SQLAlchemy, but I can translate the solution if it works in general.)
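For reference, my current attempt through the ORM looks like this (Entry is a hypothetical mapped class over the table in the edit below; with_for_update() emits the FOR UPDATE clause and needs a reasonably recent SQLAlchemy):

    # Same racy pattern via the ORM; Entry is a hypothetical mapped class.
    from sqlalchemy import Column, Integer, SmallInteger, String
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Entry(Base):
        __tablename__ = "entries"
        id = Column(Integer, primary_key=True, autoincrement=True)
        deleted = Column(SmallInteger, default=None)
        address = Column(String(255), default=None)

    def insert_unless_exists(session, address):
        existing = (
            session.query(Entry)
            .filter_by(address=address, deleted=0)
            .with_for_update()  # appends FOR UPDATE to the SELECT
            .first()
        )
        if existing is None:
            session.add(Entry(address=address, deleted=0))  # same race window
        session.commit()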
Edit: you can assume the following schema:
    deleted tinyint(1) DEFAULT NULL,
    id int(11) NOT NULL AUTO_INCREMENT,
    address varchar(255) DEFAULT NULL,
    ...
where address is supposed to be unique for entries with deleted == 0.
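To make the requirement explicit, this check should never find anything (same placeholder names as above):

    # Invariant: among rows with deleted = 0, each address appears at most once;
    # duplicates are allowed only among deleted rows.
    with engine.connect() as conn:
        dupes = conn.execute(text(
            "SELECT address, COUNT(*) AS n FROM entries "
            "WHERE deleted = 0 GROUP BY address HAVING n > 1"
        )).fetchall()
        assert not dupes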