
I am retrieving rows from a large table (1.8 GB, 20 million records) with a DataReader.

The SQL Server (2008 R2) consumes a lot of memory and sometimes does not survive this query. It is probably holding the whole result set in memory and returning rows to the client from that buffer.

The SELECT is quite simple: it returns all rows from the table that match a simple WHERE condition (a date column less than the current date). There are no blobs or strings in the columns.

Am I right about the cause of the memory usage? And what can I do in this situation? I need all the rows; the query doesn't have to be fast, but it must be memory efficient.

Thanks

Updated info: the select is in a stored procedure. Code:

CREATE PROCEDURE [get_current_records]
WITH RECOMPILE
AS
BEGIN
    DECLARE @currentDate datetime = GETDATE()

    SELECT
        [id],
        [name],
        [description],
        [number],
        [valid_from],
        [valid_to]
    FROM ui_parcela
    WHERE valid_from < @currentDate
      AND (valid_to IS NULL OR valid_to > @currentDate)
END
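One memory-efficient way to get "all rows" without the server or client buffering the full result is keyset pagination: read fixed-size batches ordered by the primary key, resuming after the last key seen. This is a sketch only; it uses Python's stdlib sqlite3 as a stand-in for SQL Server, and the table name (ui_parcela) and key column (id) are taken from the question, while the batch size and helper name are assumptions.

```python
import sqlite3

# Build a small stand-in table (sqlite3 here instead of SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ui_parcela (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO ui_parcela VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 101)])

def read_in_batches(conn, batch_size=25):
    """Yield every row, fetching only batch_size rows at a time."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, name FROM ui_parcela "
            "WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size)).fetchall()
        if not rows:
            break
        for row in rows:
            yield row          # process one row, then let it go
        last_id = rows[-1][0]  # resume after the last key we saw

total = sum(1 for _ in read_in_batches(conn))
print(total)  # 100
```

Each round trip touches at most one batch, so memory use stays bounded regardless of table size; the trade-off is more queries, which matches "doesn't have to be fast, but memory efficient".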

2 Answers


It's important to know what you are doing with those rows. Are you storing them all in memory, or can you process each row and then release it?

So I suggest you try using an async reader.
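The core of this advice is to consume the reader one row at a time rather than materialising the whole result. A minimal sketch, using Python's stdlib sqlite3 in place of a SqlDataReader: iterating the cursor pulls rows incrementally, whereas a fetchall() would build the entire result list in client memory at once. The table name and row count here are made up for the demo.

```python
import sqlite3

# Stand-in table with 1000 rows (sqlite3 instead of SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, "x" * 10) for i in range(1000)])

cur = conn.execute("SELECT id, payload FROM t")
count = 0
for row in cur:    # one row at a time; nothing else is retained
    count += 1     # do the per-row work here, then drop the row
print(count)  # 1000
```

Only the current row is alive at any moment, so memory use is flat no matter how many rows the query returns; the same pattern applies to looping over SqlDataReader.Read() in .NET.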

Izikon

If the server is running out of memory, it sounds like you need to look at your SQL query. Have you indexed it correctly?

Check the SQL execution plan to see what is expensive.
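As a sketch of that check, here is the same idea against sqlite3 (stdlib), whose EXPLAIN QUERY PLAN stands in for SQL Server's execution plan: create an index on the filtered column and confirm the plan uses it instead of scanning the table. The index name ix_valid_from is an assumption; the table and columns come from the question.

```python
import sqlite3

# Stand-in schema (sqlite3 instead of SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ui_parcela (
    id INTEGER PRIMARY KEY, valid_from TEXT, valid_to TEXT)""")
# Assumed index on the column the WHERE clause filters on.
conn.execute("CREATE INDEX ix_valid_from ON ui_parcela (valid_from)")

# Inspect the plan for the date-range predicate.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT id FROM ui_parcela WHERE valid_from < ?",
    ("2012-01-01",)).fetchall()
for row in plan:
    print(row[-1])  # the human-readable plan step
```

In SQL Server you would look at the actual execution plan in Management Studio the same way: a seek on an appropriate index is cheap, while a full scan of a 20-million-row table is what gets expensive.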

Science_Fiction