This question is a follow-up to Efficient way to transfer many binary files into SQL Server database.
I originally asked why using File.ReadAllBytes was causing rapid memory use, and the conclusion was that the method puts the file's entire contents on the large object heap (LOH), which cannot easily be reclaimed at run time.
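The code from my original question boiled down to something like this (a minimal sketch; the ReadWholeFile name is just for illustration, and CustomFile is my wrapper type, shown here with the property the chunked code below assigns so the sketch is self-contained):

    using System.IO;

    class CustomFile
    {
        public byte[] FileValue { get; set; }
    }

    class WholeFileReader
    {
        // File.ReadAllBytes allocates one array the size of the whole
        // file; arrays of 85,000 bytes or more land on the LOH.
        static CustomFile ReadWholeFile(string path)
        {
            return new CustomFile { FileValue = File.ReadAllBytes(path) };
        }
    }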
My question now is: how do I avoid that situation? The following code was intended to get around the problem by reading the file in chunks instead of all at once, but it seems to have the same problem:
    using (var fs = new FileStream(path, FileMode.Open))
    using (var ms = new MemoryStream())
    {
        // Copy the file into the MemoryStream 2 KB at a time.
        byte[] buffer = new byte[2048];
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, bytesRead);
        }

        // ToArray() copies the accumulated bytes into one new array
        // sized to the entire file.
        return new CustomFile { FileValue = ms.ToArray() };
    }
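For reference, the LOH behavior itself is easy to confirm: the GC reports large arrays as generation 2 immediately after allocation. Below is a quick standalone check, separate from my code; the 85,000-byte figure is the documented LOH threshold:

    using System;

    class LohCheck
    {
        static void Main()
        {
            // Arrays at or above 85,000 bytes are allocated on the
            // large object heap, which is collected with generation 2.
            var small = new byte[80000];
            var large = new byte[90000];
            Console.WriteLine(GC.GetGeneration(small)); // 0
            Console.WriteLine(GC.GetGeneration(large)); // 2
        }
    }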