Not sure what the best way to handle this is. For my particular situation I have numerous tables where I want to delete any rows with a timestamp older than 3 months... aka only keep records for the last 3 months.
Very simply it would be something like this:
//set binding cutoff timestamp
$binding = array(
'cutoff_time' => strtotime('-3 months')
);
//##run through all the logs and delete anything before the cutoff time
//app
$stmt = $db->prepare("
DELETE
FROM app_logs
WHERE app_logs.timestamp < :cutoff_time
");
$stmt->execute($binding);
//more tables after this
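Since the statement is identical for every table apart from the table name, I could also just loop over the table names instead of repeating the block. Rough sketch only (the extra table names below are made up for illustration, and table names can't be bound as parameters, so they'd have to come from a hard-coded list):
//tables to prune - hard-coded list, the names besides app_logs are just examples
$tables = array('app_logs', 'api_logs', 'email_logs');
foreach($tables as $table)
{
$stmt = $db->prepare("
DELETE
FROM {$table}
WHERE {$table}.timestamp < :cutoff_time
");
$stmt->execute($binding);
}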
Every table I am going to be deleting from has a timestamp column, which is indexed. I am concerned about down the road, when the number of rows to delete gets large. What would be the best practice for limiting the deletes to chunks in a loop? All I can think of is doing an initial SELECT to find whether there are any rows that need to be deleted, running the DELETE if there are, and repeating until the SELECT doesn't find any results. That adds an additional count query for each iteration of the loop.
What is the standard/recommended practice here?
EDIT:
Quick writeup of what I was thinking:
//set binding cutoff timestamp
$binding = array(
'cutoff_time' => strtotime('-3 months')
);
//set limit value
$binding2 = array(
'limit' => 1000
);
//##run through all the logs and delete anything before the cutoff time
//get the total count
$stmt = $db->prepare("
SELECT
COUNT(*)
FROM app_logs
WHERE app_logs.timestamp < :cutoff_time
");
$stmt->execute($binding);
//get total results count from above
$found_count = $stmt->fetchColumn();
// loop deletes
$stmt = $db->prepare("
DELETE
FROM app_logs
WHERE app_logs.timestamp < :cutoff_time
LIMIT :limit
");
//bind the values once; LIMIT has to be bound as an integer,
//otherwise the (emulated) prepare quotes it and the query breaks
$stmt->bindValue(':cutoff_time', $binding['cutoff_time'], PDO::PARAM_INT);
$stmt->bindValue(':limit', $binding2['limit'], PDO::PARAM_INT);
while($found_count > 0)
{
$stmt->execute();
$found_count = $found_count - $binding2['limit'];
}
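Or, to skip the up-front count completely, I could just keep running the chunked DELETE and stop once rowCount() reports fewer affected rows than the limit. Rough, untested sketch of that idea (same $db connection and cutoff as above):
//set binding cutoff timestamp
$binding = array(
'cutoff_time' => strtotime('-3 months')
);
//chunk size
$limit = 1000;
$stmt = $db->prepare("
DELETE
FROM app_logs
WHERE app_logs.timestamp < :cutoff_time
LIMIT :limit
");
$stmt->bindValue(':cutoff_time', $binding['cutoff_time'], PDO::PARAM_INT);
$stmt->bindValue(':limit', $limit, PDO::PARAM_INT);
do {
$stmt->execute();
//keep going while each pass deletes a full chunk
} while($stmt->rowCount() == $limit);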