I'm taking JPEGs and inserting them into a table as a byte array in a varbinary(MAX) field. However, the mdf file is growing three to four times larger than the total size of all of the files I'm inserting. I'm using a standard C# technique to take a WebResponse and convert it into a MemoryStream:

byte[] result;
byte[] buffer = new byte[4096];

using (Stream responseStream = request.GetResponse().GetResponseStream())
{
    using (MemoryStream memoryStream = new MemoryStream())
    {
        // Copy the response into memory 4 KB at a time.
        int count = 0;
        do
        {
            count = responseStream.Read(buffer, 0, buffer.Length);
            memoryStream.Write(buffer, 0, count);
        } while (count != 0);

        // Snapshot the stream's contents as a byte array.
        result = memoryStream.ToArray();
    }
}

And yet somehow 512 MB of JPEGs ends up growing the mdf to over 2 GB in size. Where I do the insert into the table I am defining the length on this field as well, using result.Length. Autogrow is set to 5%.
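The insert itself is a plain parameterized command along these lines (the table and column names here are placeholders, not my real schema):

```csharp
// Hypothetical schema: Images(Id INT, Data VARBINARY(MAX)).
// The parameter is typed explicitly as VarBinary with the actual byte
// length, so the provider should send the raw bytes as-is; an inferred
// type (e.g. AddWithValue with a string) could silently convert the data.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO Images (Id, Data) VALUES (@id, @data)", conn))
{
    cmd.Parameters.Add("@id", SqlDbType.Int).Value = imageId;
    cmd.Parameters.Add("@data", SqlDbType.VarBinary, result.Length).Value = result;
    conn.Open();
    cmd.ExecuteNonQuery();
}
```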

abatishchev
jdf35w
1 Answer


512 MB of JPEGs should take just a bit over 512 MB in the database. The mdf itself may grow bigger, but that is subject to the database's configured growth increment. sp_spaceused should show the used vs. free space.

But the way you're processing the file, buffering it entirely in a MemoryStream, is never going to scale. Read Download and Upload images from SQL Server via ASP.Net MVC for the correct way of streaming files from HTTP into the database. You should use UPDATE ... blob.WRITE because of the app process memory size constraints and also because of the database's minimally logged operation requirements. The linked article explains in more detail why and how to do it. You can use the same technique to stream in a web response, without creating a full copy in memory.
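A minimal sketch of the chunked approach, assuming a hypothetical table Images(Id INT, Data VARBINARY(MAX)), an already-open SqlConnection conn, and an existing row whose Data column was initialized to an empty (non-NULL) value:

```csharp
// .WRITE(@chunk, NULL, NULL) with a NULL offset appends to the end of
// the existing value, so only one 4 KB buffer is ever held in memory
// and the update can be minimally logged.
byte[] buffer = new byte[4096];
int count;
using (Stream responseStream = request.GetResponse().GetResponseStream())
using (SqlCommand cmd = new SqlCommand(
    "UPDATE Images SET Data.WRITE(@chunk, NULL, NULL) WHERE Id = @id", conn))
{
    cmd.Parameters.Add("@id", SqlDbType.Int).Value = imageId;
    SqlParameter chunk = cmd.Parameters.Add("@chunk", SqlDbType.VarBinary, -1);
    while ((count = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        byte[] exact = new byte[count];
        Array.Copy(buffer, exact, count);   // send only the bytes actually read
        chunk.Value = exact;
        cmd.ExecuteNonQuery();
    }
}
```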

Remus Rusanu
  • Maybe I wasn't clear, it's not 1 file, but hundreds of thousands. I'm downloading them via a webrequest and then inserting them into the DB, but as I said the mdf grows far too large and I don't understand why. – jdf35w Sep 24 '14 at 20:31
  • How big is each file, in average? – Remus Rusanu Sep 24 '14 at 20:32
  • about 25k. Somehow they are getting "padded" – jdf35w Sep 24 '14 at 20:55
  • No padding can account for a 4:1 ratio. Can you run `sp_spaceused` and post the results in your OP? – Remus Rusanu Sep 24 '14 at 21:00
  • Unfortunately, I just shrank it after blowing away a bunch of records and am reloading it now. It will take a few hours! – jdf35w Sep 24 '14 at 21:03
  • How do you save the jpeg stream? You sure no conversion to nvarchar, no base64 encoding or anything similar occurs anywhere, right? – Remus Rusanu Sep 24 '14 at 21:06
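For what it's worth, the conversion suspected in the last comment would roughly account for the ratio: base64 expands bytes 4:3, and storing the resulting string in an nvarchar column doubles it again (two bytes per UTF-16 character). A quick sketch of the arithmetic:

```csharp
using System;

class InflationDemo
{
    static void Main()
    {
        byte[] jpeg = new byte[25 * 1024];              // a typical ~25 KB file
        string base64 = Convert.ToBase64String(jpeg);   // 4 chars per 3 input bytes
        int nvarcharBytes = base64.Length * 2;          // nvarchar is UTF-16
        // Roughly 2.67x the original size before any row overhead.
        Console.WriteLine((double)nvarcharBytes / jpeg.Length);
    }
}
```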