
I tried to read a file (650 megabytes) from SQL Server:

using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    if (reader.Read())
    {
        using (var dbStream = reader.GetStream(0))
        {
            if (!reader.IsDBNull(0))
            {
                stream.Position = 0;
                dbStream.CopyTo(stream, 256);
            }

            dbStream.Close();
        }
    }

    reader.Close();
}

But I got an OutOfMemoryException on CopyTo().

With small files, this code snippet works fine. How can I handle a large file?


2 Answers


You can read and write the data to a temp file in small chunks. You can see an example on MSDN - Retrieving Binary Data.

//Column Index in the result set
const int colIdx = 0;

// Writes the BLOB to a file (*.bmp).  
FileStream stream;                            
// Streams the BLOB to the FileStream object.  
BinaryWriter writer;                          

// Size of the BLOB buffer.  
int bufferSize = 100;                     
// The BLOB byte[] buffer to be filled by GetBytes.  
byte[] outByte = new byte[bufferSize];    
// The bytes returned from GetBytes.  
long retval;                              
// The starting position in the BLOB output.  
long startIndex = 0;                      

// Open the connection and read data into the DataReader.  
connection.Open();  
SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess);  

while (reader.Read())  
{  

  // Create a file to hold the output.  
  stream = new FileStream(  
    "some-physical-file-name-to-dump-data.bmp", FileMode.OpenOrCreate, FileAccess.Write);  
  writer = new BinaryWriter(stream);  

  // Reset the starting byte for the new BLOB.  
  startIndex = 0;  

  // Read bytes into outByte[] and retain the number of bytes returned.  
  retval = reader.GetBytes(colIdx, startIndex, outByte, 0, bufferSize);  

  // Continue while there are bytes beyond the size of the buffer.  
  while (retval == bufferSize)  
  {  
    writer.Write(outByte);  
    writer.Flush();  

    // Reposition start index to end of last buffer and fill buffer.  
    startIndex += bufferSize;  
    retval = reader.GetBytes(colIdx, startIndex, outByte, 0, bufferSize);  
  }  

  // Write the remaining buffer.  
  writer.Write(outByte, 0, (int)retval);  
  writer.Flush();  

  // Close the output file.  
  writer.Close();  
  stream.Close();  
}  

// Close the reader and the connection.  
reader.Close();  
connection.Close();

Make sure you are using the SqlDataReader with CommandBehavior.SequentialAccess; note this line in the above code snippet.

 SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess);  

More information on the CommandBehavior enum can be found here.

EDIT:

Let me clarify. I agree with @MickyD: the cause of the problem is not whether you are using CommandBehavior.SequentialAccess or not, but reading the large file all at once.

I emphasized this because it is commonly missed by developers: they tend to read files in chunks, but without setting CommandBehavior.SequentialAccess they will run into other problems. Although it is already part of the original question, I highlighted it in my answer for the benefit of any newcomers.
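
For example, here is a minimal sketch (the query, parameter names and output path are placeholder assumptions, not the OP's actual code) that keeps CommandBehavior.SequentialAccess and streams the column straight to disk instead of reading it all at once:

// Requires System.Data.SqlClient and System.IO.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT FileData FROM Files WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@id", fileId);
    connection.Open();

    using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read() && !reader.IsDBNull(0))
        {
            using (var dbStream = reader.GetStream(0))
            using (var fileStream = new FileStream(@"C:\temp\output.bin", FileMode.Create, FileAccess.Write))
            {
                // CopyTo moves the data in fixed-size chunks (80 KB here),
                // so only one small buffer is ever held in memory.
                dbStream.CopyTo(fileStream, 81920);
            }
        }
    }
}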

  • [`Stream.CopyTo()`](https://learn.microsoft.com/en-us/dotnet/api/system.io.stream.copyto?view=netframework-4.7.2) already copies in chunks (in the OP's code it's copying in chunks of only 256 bytes, which seems too small...) – Matthew Watson Aug 30 '18 at 08:29
  • I am confused. @MatthewWatson said that `CopyTo()` is what I need. But @M_Idrees said that I should rewrite the code. Who is right? – Kliver Max Aug 30 '18 at 08:35
  • @MatthewWatson I tried to use 4096 instead of 256 bytes but got the same error – Kliver Max Aug 30 '18 at 08:37
  • CommandBehavior.SequentialAccess is an important factor here. Rather than loading the entire row, SequentialAccess enables the DataReader to load data as a stream. – M_Idrees Aug 30 '18 at 08:39
  • But the OP is already using `CommandBehavior.SequentialAccess` – Matthew Watson Aug 30 '18 at 08:48
  • As per the documentation on MSDN mentioned in the link... You can use the GetBytes or GetChars method to specify a byte location to start the read operation, and a limited buffer size for the data being returned... – M_Idrees Aug 30 '18 at 08:59
  • The OP is already using `CommandBehavior.SequentialAccess` and his buffer is tiny. They are **not** the cause of the issue – Aug 31 '18 at 00:27

> @MatthewWatson yeah var stream = new MemoreStream(); What is not right with it? – Kliver Max 15 hours ago

Your problem is not whether or not you are using:

`command.ExecuteReader(CommandBehavior.SequentialAccess)` 

...which you are, as we can see; or that your stream copy buffer size is too big (it's actually tiny), but rather that you are using a MemoryStream, as you indicated in the comments above. More than likely you are loading the 650 MB file twice: once from SQL and again to be stored in the MemoryStream, thus leading to your OutOfMemoryException.

Though the solution is to instead write to a file stream, the cause of the problem wasn't highlighted in the accepted answer. Unless you know the cause of a problem, you won't learn to avoid such issues in the future.
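
For reference, a minimal sketch of the OP's snippet with the MemoryStream swapped for a FileStream (the output path is a placeholder assumption):

// Instead of: var stream = new MemoryStream();
// write straight to disk, so the 650 MB BLOB is never buffered in memory.
using (var stream = new FileStream(@"C:\temp\output.bin", FileMode.Create, FileAccess.Write))
using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    if (reader.Read() && !reader.IsDBNull(0))
    {
        using (var dbStream = reader.GetStream(0))
        {
            dbStream.CopyTo(stream, 81920);
        }
    }
}

This keeps memory usage bounded by the copy buffer rather than by the size of the BLOB.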