
I have an operation on the site that crops an image; however, the resulting cropped image is coming out significantly larger in file size (the original is 24k and the cropped image is around 650k). So I found that I need to apply some compression to the image before saving it. I came up with the following:

public static System.Drawing.Image CropImage(System.Drawing.Image image, Rectangle cropRectangle, ImageFormat format)
{
    var croppedImage = new Bitmap(cropRectangle.Width, cropRectangle.Height);
    using (var g = Graphics.FromImage(croppedImage))
    {
        g.InterpolationMode = InterpolationMode.HighQualityBicubic;
        g.DrawImage(
            image, 
            new Rectangle(new Point(0,0), new Size(cropRectangle.Width, cropRectangle.Height)), 
            cropRectangle, 
            GraphicsUnit.Pixel); 
        return CompressImage(croppedImage, format);
    }
}

public static System.Drawing.Image CompressImage(System.Drawing.Image image, ImageFormat imageFormat)
{
    var bmp = new Bitmap(image);
    var codecInfo = EncoderFactory.GetEncoderInfo(imageFormat);
    var encoder = System.Drawing.Imaging.Encoder.Quality;
    var parameters = new EncoderParameters(1);
    var parameter = new EncoderParameter(encoder, 10L);
    parameters.Param[0] = parameter;

    using (var ms = new MemoryStream())
    {
        bmp.Save(ms, codecInfo, parameters);
        var resultImage = System.Drawing.Image.FromStream(ms);
        return resultImage;
    }
}

I set the quality low just to see if there was any change at all. There isn't. The crop is being saved correctly appearance-wise, but the compression has no effect. If I bypass CompressImage() altogether, neither the file size nor the image quality appears to be any different.

So, 2 questions. Why is nothing happening? And is there a simpler way to compress the resulting image to "web-optimize" it, similar to how Photoshop saves web images (I thought it just stripped a lot of info out of it to reduce the size)?

Sinaesthetic
  • What image format are you reading in and saving? BMP hardly compresses; PNG or JPG compress much better. I can easily imagine going from a 24k PNG to a 650k BMP. – DasKrümelmonster Feb 06 '14 at 00:55
  • I'm reading in png/jpg. The Bitmap is just an intermediate object – Sinaesthetic Feb 06 '14 at 01:42
  • The problem is not what you're reading... it's what you're saving. Please check the correct settings for the codecInfo and the parameters objects... Maybe you're saving the image with more quality than the original... – HiperiX Feb 06 '14 at 03:37

2 Answers


Your problem is that you must 'compress' (really, encode) the image as you save it, not before you save it. An Image object in your program is always uncompressed.

Saving to the MemoryStream and then reading back out of the stream encodes the image and then decodes it back to the same size again (with some quality loss in the process if you are using JPEG). However, if you save it to a file with the compression parameters, you will get a compressed image file.

Using this routine with JPEG quality level 90 on a 153 KB source image gives an output image of 102 KB. If you want a smaller file size (with more encoding artifacts) change the encoder parameter to something smaller than 90.

public static void SaveJpegImage(System.Drawing.Image image, string fileName)
{
    ImageCodecInfo codecInfo = ImageCodecInfo.GetImageEncoders()
        .FirstOrDefault(r => r.CodecName.ToUpperInvariant().Contains("JPEG"));

    var encoder = System.Drawing.Imaging.Encoder.Quality;
    var parameters = new EncoderParameters(1);
    var parameter = new EncoderParameter(encoder, 90L);
    parameters.Param[0] = parameter;

    using (FileStream fs = new FileStream(fileName, FileMode.Create))
    {
        image.Save(fs, codecInfo, parameters);
    }
}
Doug
  • So, in my particular implementation, the image is never saved to disk. The image is saved as binary info in a database and then read back out directly to a web page (MVC FileResult). How can I compress the image and save the bytes? If I read the compressed bytes out of the database into an Image (intermediate), based on what you say, it would re-inflate the size. – Sinaesthetic Feb 06 '14 at 17:17
  • And this is virtually identical to what I have currently, the only difference is that I read it back into an `Image` before returning it. I tried omitting that part and returning the byte array out of the MemoryStream, but the same result - the file is 600k when the original was 24k. – Sinaesthetic Feb 06 '14 at 17:28
  • Not sure how optimal this is, but I have an app in production that saves images to an Oracle Blob. In your sample, right after you save the bitmap to the memory stream, get the raw bytes from the stream: `byte[] imageBytes = ms.ToArray();` and store the byte array in the database. You can use different encodings - when you read it back from the database, it is just the same as if you had done a File.Read to a byte array - note it is a jpg or png, not an Image object yet. You would need to create a stream from the byte[] and load it to get an Image. – Doug Feb 06 '14 at 17:37
  • Just to reiterate, reading the memory stream back into an Image decodes the image from some other format (jpg, png, tiff) back into a bitmap - you are inflating the image back to its original size. – Doug Feb 06 '14 at 17:40
  • Sure, but I also said that when I omit that step and just read the bytes directly out of the stream (without loading it into an `Image`) and observe it before doing anything else to it. I can see that the file size is still way too big. – Sinaesthetic Feb 06 '14 at 17:46
  • When you say the original is 24k - that is the size of png or jpg file, right? What type of object or file are you measuring at 650k? I just made a little test program, and opened a JPG file that is 54k on disk. If I write the bitmap to a memory stream as a bitmap, the raw byte array is 920k. But if I write the same bitmap to a memory stream as jpg level 90, the raw byte array size is 77k - very similar to the original file. – Doug Feb 06 '14 at 18:06
  • Yeah the original jpg/png is 350x350 @ 24k. I then crop the picture, compress it then observe the byte array at 330k, then I scale that crop back to 350x350 and compress, then observe the file at around 620k – Sinaesthetic Feb 06 '14 at 18:13
  • Here is a diagram of the process on what is actually happening, if it helps http://bit.ly/1dtegED – Sinaesthetic Feb 06 '14 at 18:23
  • Ok, I got it working. At first, it looked like I was saving the bytes correctly in all the right places, but I tried to use `Image` as an intermediate object and I think there were a couple of places where I zigged instead of zagged. I eliminated `Image` everywhere that I could and tried to just use the `byte[]` by itself, and it seems to have cleared it up. Thanks for your patience :) – Sinaesthetic Feb 06 '14 at 18:50
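The byte-array approach Doug describes in the comments above can be sketched as follows. This is an illustrative, untested sketch: the class and method names (`ImageDbHelper`, `ToJpegBytes`) and the quality value are assumptions, not from the thread.

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

public static class ImageDbHelper
{
    // Encode an in-memory Image to compressed JPEG bytes suitable for a
    // BLOB column. The key point from the discussion above: store (and
    // later serve) these bytes directly; decoding them back into an
    // Image re-inflates them to the uncompressed bitmap size.
    public static byte[] ToJpegBytes(Image image, long quality)
    {
        ImageCodecInfo codecInfo = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");

        var parameters = new EncoderParameters(1);
        parameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);

        using (var ms = new MemoryStream())
        {
            image.Save(ms, codecInfo, parameters);
            return ms.ToArray(); // already-compressed bytes
        }
    }
}
```

From an MVC action, the stored bytes can then be returned as-is, e.g. `return File(imageBytes, "image/jpeg");`, without ever reconstructing an `Image` object.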

I believe you shouldn't dispose of the MemoryStream while you are still using an image created with Image.FromStream that refers to that stream. Creating a Bitmap directly from the stream doesn't work either.

Try this:

private static Image CropAndCompressImage(Image image, Rectangle rectangle, ImageFormat imageFormat)
{
    using(Bitmap bitmap = new Bitmap(image))
    {
        using(Bitmap cropped = bitmap.Clone(rectangle, bitmap.PixelFormat))
        {
            using (MemoryStream memoryStream = new MemoryStream())
            {
                cropped.Save(memoryStream, imageFormat);
                // Copy into a new Bitmap so the result no longer
                // depends on the soon-to-be-disposed stream.
                using (var decoded = Image.FromStream(memoryStream))
                {
                    return new Bitmap(decoded);
                }
            }
        }
    }
}
Thejaka Maldeniya