Actually, this is a question from an interview at a company that builds a high-load service. For example, we have a table with 1 TB of records and a primary-key B-tree index. We need to select all records with ids in the range from 5,000 to 5,000,000. We cannot lock the whole database, and the database is under high load. Does it make sense to split one huge SELECT query into parts like
select * from a where id >= 5000 and id < 10000;
select * from a where id >= 10000 and id < 15000;
...
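I have also considered keyset pagination, where each chunk starts from the last id actually returned instead of a fixed step, so gaps in the id sequence don't produce empty chunks. A sketch (the table name a and chunk size 5000 are just the placeholders from my example above, and :last_id is a bind parameter):

select * from a where id >= 5000 and id < 5000000 order by id limit 5000;
-- note the last id of this result, then fetch the next chunk:
select * from a where id > :last_id and id < 5000000 order by id limit 5000;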
Please help me compare the behaviour of PostgreSQL and MySQL in this case. Are there any other, more optimal techniques for selecting all the required records?
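For instance, in PostgreSQL I believe a server-side cursor can stream the range in fetch-sized pieces inside a single transaction, so the client never holds the whole result set. A sketch against the same hypothetical table a:

begin;
declare big_range cursor for
    select * from a where id >= 5000 and id < 5000000 order by id;
fetch 5000 from big_range;  -- repeat the fetch until it returns no rows
close big_range;
commit;

As far as I know, MySQL only supports cursors inside stored programs, so the closest equivalent there would be an unbuffered (streaming) result set on the client side. Is one of these approaches preferable to chunked SELECTs under high load?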
Thanks.