I have a query that deletes about 16 million records from the table bills,
and, via the bill_id foreign key, also cascades the delete to four other tables. It takes more than an hour on my database.
public int deleteExpiredBills(final Integer someDate, final Set<Integer> itemIdsCanNotBeDeleted) {
    return namedParameterJdbcTemplate.update(
        "DELETE FROM bills WHERE expired_date::date < CURRENT_DATE - :someDate " +
        "AND item_id NOT IN (:itemIdsCanNotBeDeleted)",
        new MapSqlParameterSource(Map.of(
            "someDate", someDate,
            "itemIdsCanNotBeDeleted", itemIdsCanNotBeDeleted))
    );
}
Moreover, when I call this query (code above) from my Spring Boot app, I get the following exception (but Postgres still completes the DELETE after some time):
com.zaxxer.hikari.pool.ProxyConnection - HikariPool-1 - Connection org.postgresql.jdbc.PgConnection@1de77ce0 marked as broken because of SQLSTATE(08006), ErrorCode(0)
org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
Is this exception thrown because my transaction takes too long?
What is the optimal solution in such situations? Should I split this transaction into multiple smaller ones by using LIMIT n
and executing the DELETE in a loop until there are no more records?
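What I have in mind is roughly the sketch below. Note that Postgres DELETE has no LIMIT clause, so the real batch query would presumably need a subselect; `deleteBatch` here is a hypothetical stand-in for the `namedParameterJdbcTemplate.update(...)` call, stubbed out so only the loop logic is shown:

```java
import java.util.function.IntSupplier;

// Sketch of the batched-delete loop I'm considering. Assumptions:
// - Postgres DELETE has no LIMIT clause, so the real batch query would be
//   something like:
//     DELETE FROM bills WHERE bill_id IN (
//         SELECT bill_id FROM bills
//         WHERE expired_date::date < CURRENT_DATE - :someDate
//           AND item_id NOT IN (:itemIdsCanNotBeDeleted)
//         LIMIT :batchSize)
// - deleteBatch is a hypothetical stand-in for the JDBC update call,
//   returning the number of rows the batch actually deleted.
public class BatchedDeleteSketch {

    static int deleteAllInBatches(IntSupplier deleteBatch) {
        int total = 0;
        int deleted;
        do {
            // Each iteration would run as its own short transaction,
            // so no single transaction holds locks for an hour.
            deleted = deleteBatch.getAsInt();
            total += deleted;
        } while (deleted > 0);
        return total;
    }

    public static void main(String[] args) {
        // Simulate 10 matching rows deleted in batches of at most 4.
        int[] remaining = {10};
        int batchSize = 4;
        int total = deleteAllInBatches(() -> {
            int n = Math.min(batchSize, remaining[0]);
            remaining[0] -= n;
            return n;
        });
        System.out.println(total); // prints 10
    }
}
```

Would this pattern also avoid the connection being marked as broken, since each batch commits quickly?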