I have a big table in SQL Server 2012 with about 34 million records. A batch program inserts/updates the table every 5 minutes, and a web application reads from it at any time.
Every time the batch program writes to the table, the reads slow down dramatically. I think it has something to do with the isolation level, but I'm not that experienced with SQL Server, so I don't know how to fix it.
The best scenario for me would be for the inserts not to block the reads. Is it possible to specify that in the query? Or do I have to change the read queries to pick up "dirty" rows?
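From what I've read, a table hint on the read queries might do this. Something like the following (dbo.BigTable and the columns are placeholders for my actual table):

```sql
-- Read with the NOLOCK hint (equivalent to READ UNCOMMITTED):
-- the query takes no shared locks and does not wait for the
-- batch program's exclusive locks, but it can return
-- uncommitted ("dirty") rows.
SELECT Col1, Col2
FROM dbo.BigTable WITH (NOLOCK)
WHERE Col1 = @someValue;
```

Would that be the right approach, given that it can return uncommitted data?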
The application doesn't critically depend on "fresh" data, so if I can force the DB to allow dirty reads, that would be fine. But I don't know how to achieve this, or whether there's a better solution.
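I've also seen READ_COMMITTED_SNAPSHOT mentioned as an alternative that avoids dirty reads by using row versioning instead of locks. If I understand correctly, it's enabled once per database (MyDatabase is a placeholder for my database name):

```sql
-- Enable row-versioned reads for the whole database:
-- readers then see the last committed version of each row
-- instead of blocking on writers (versions are kept in tempdb).
ALTER DATABASE MyDatabase
SET READ_COMMITTED_SNAPSHOT ON
WITH ROLLBACK IMMEDIATE;
```

Would that be preferable to adding NOLOCK hints to every read query?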