I have a simple table which is an email queue.
CREATE TABLE `emails_queue_batch` (
  `eq_log_id` int(11) NOT NULL DEFAULT '0',
  `eq_to` varchar(120) CHARACTER SET utf8 DEFAULT NULL,
  `eq_bcc` varchar(80) CHARACTER SET utf8 DEFAULT '',
  `eq_from` varchar(80) CHARACTER SET utf8 DEFAULT NULL,
  `eq_title` varchar(100) COLLATE utf8_unicode_ci DEFAULT NULL,
  `eq_headers` varchar(80) CHARACTER SET utf8 DEFAULT NULL,
  `eq_content` longtext CHARACTER SET utf8,
  `eq_sid` int(11) DEFAULT '0',
  `eq_type` int(11) DEFAULT '0' COMMENT 'email type',
  `eq_esp` int(11) DEFAULT '0',
  PRIMARY KEY (`eq_log_id`),
  KEY `email` (`eq_to`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Several threads repeatedly read 50 rows at a time and then delete them.
To avoid two threads reading the same rows, I used:
$db->query("LOCK TABLES $table WRITE");
$query = "SELECT * FROM $table LIMIT " . CHUNK_SIZE;
$emails2send = $db->get_results($query);
if (!empty($emails2send)) {
    // delete the rows we just read
    $eq_log_ids = array();
    foreach ($emails2send as $email) $eq_log_ids[] = $email->eq_log_id;
    $query = "DELETE FROM $table WHERE eq_log_id IN (" . implode(',', $eq_log_ids) . ")";
    $db->query($query);
    $db->query("UNLOCK TABLES"); // unlock the table so other sessions can read the next rows
    // ........ code processing the read rows here .............
} else { // $emails2send is empty
    $db->query("UNLOCK TABLES");
    $stop_running = true; // stop running
}
Another thread (or several) writes to the table at the same time. For a reason I don't understand, this configuration got deadlocked, with the table locked for both reads and writes.
My question is: is this locking the right way to make sure each row is read once and only once (and then deleted)?
Or is this better handled with a transaction, and if so, which kind? I'm not experienced with transactions.
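For reference, the transactional alternative I've read about would look roughly like this. This is only my understanding, not tested code: `FOR UPDATE SKIP LOCKED` requires MySQL 8.0+, and the `IN (...)` list would be built in PHP from the ids the `SELECT` actually returned:

```sql
-- Sketch of claiming rows with row locks instead of LOCK TABLES.
START TRANSACTION;

-- Lock up to 50 rows; SKIP LOCKED makes concurrent workers
-- skip over rows already claimed by another transaction
-- instead of blocking on them.
SELECT * FROM emails_queue_batch
ORDER BY eq_log_id
LIMIT 50
FOR UPDATE SKIP LOCKED;

-- Delete the claimed rows (ids taken from the SELECT result above).
DELETE FROM emails_queue_batch WHERE eq_log_id IN (/* ids from the SELECT */);

COMMIT;
```

As I understand it, this only locks the 50 rows being worked on rather than the whole table, so readers and the writer thread would not block each other the way `LOCK TABLES ... WRITE` does.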