
I wish to reduce an image to around 200KB before saving it into a database. Let's say I have a bitmap file of around 39MB; I need to convert the bitmap to JPEG and then reduce the JPEG file to <= 200KB (any type of graphics file may come in, e.g. bmp, jpg, png, but the final type will be a JPEG smaller than 200KB). I wrote the following code to do the conversion (in this case I set the JPEG quality to 10, since I want the file size to be around 200KB):

    function BMPtoJPG(const BMPpic, JPGpic: string): Boolean;
    var
      Bitmap: TBitmap;
      JpegImg: TJpegImage;
    begin
      Result := False;
      Bitmap := TBitmap.Create;
      try
        Bitmap.LoadFromFile(BMPpic);
        JpegImg := TJpegImage.Create;
        try
          // quality 10 brings the size near 200KB, but costs a lot of quality
          JpegImg.CompressionQuality := 10;
          JpegImg.Assign(Bitmap);    // convert the bitmap to JPEG
          JpegImg.Compress;
          JpegImg.SaveToFile(JPGpic);
          Result := True;
        finally
          JpegImg.Free;
        end;
      finally
        Bitmap.Free;
      end;
    end;

I converted the same image file in Adobe Lightroom, using the export dialog to limit the size to 200KB, and compared it with the image produced by the BMPtoJPG function above. The image quality from Adobe Lightroom is much better than that of the function (both files are around 200KB).

Can anyone show me how to write code that reduces the image size (limited to 200KB) while the quality doesn't drop much?

Thanks, and I appreciate your answers.

Sherlyn Chew
  • You may want to try JPEG 2000 or JBIG, which provide better compression than JPEG. – Graymatter Jul 08 '14 at 09:18
  • Why do you want to store the image in database at all? – LightBulb Jul 08 '14 at 09:49
  • Users are able to store images of each stock item in the database. – Sherlyn Chew Jul 08 '14 at 10:06
  • @SherlynChew The **usual** solution in cases like yours is storing the image files in the file system on a _file server_ (which might be, but is not necessarily, the same machine as the database server) and storing only the file system paths in your database. (_Compression would still be nice though._) – mg30rg Jul 08 '14 at 11:00
  • A 39MB bitmap is going to be very large (height and width). Does this database require such a large image? If not, resize the image smaller and then save to JPG with a higher compression ratio. – MikeD Jul 08 '14 at 11:58
  • Oh, come on! There is nothing bad in storing images as BLOBs. It even works faster than the filesystem (space suffers, though). – Free Consulting Jul 08 '14 at 12:50
  • see also http://www.efg2.com/Lab/Graphics/BMPJPG.htm – Free Consulting Jul 08 '14 at 13:06
  • @FreeConsulting I did not say storing images in the database is plainly wrong, only that it is not the usual solution. And it is not the usual solution for a reason. – mg30rg Jul 08 '14 at 14:12
  • @FreeConsulting There are both pros and cons to storing images in a DB. A couple of cons as an example are the additional backup implications when a large number of images are stored in the database and the increased complexity around adding new storage. There is no silver bullet saying one approach is better than another. – Graymatter Jul 08 '14 at 19:04
  • @FreeConsulting For huge content (more than a few MB), *local file storage* will always be faster than BLOBs on a remote server. Some enterprise-level DBs have a maximum data size linked to the license/price. But for remote centralized storage, BLOBs are indeed a very good option, probably better than *remote file storage*. – Arnaud Bouchez Jul 09 '14 at 09:58
  • @Graymatter, your examples are kinda weak and related to makeshift solutions. Typically, in such environments regular backup procedures are replaced by *hoping that nothing bad is going to happen*. Backing up the data without the images does not even qualify as a full backup, because in case of disaster the images will be lost forever. High-availability storage, well, is definitely much more expensive than backup media, so as I said, *space suffers*. But that is all there is to the increased complexity around adding new storage. – Free Consulting Jul 09 '14 at 19:58
  • @ArnaudBouchez, yes, indeed local is faster than remote, but in this setup you are going to face unspeakable difficulties with synchronizing local and remote data. Good point about the increased storage engine **software** cost. By the way, the off-site links on your IJL page are broken. Care to fix them? – Free Consulting Jul 09 '14 at 20:02
  • @mg30rg, it is a **usual** case that people do not back up their data at all. Bad design hardly qualifies as a reason. – Free Consulting Jul 09 '14 at 20:07
  • @FreeConsulting I am not talking about not backing up the images. I am saying that backing up images on a file system is much simpler than doing the same in a database. I have experience doing both. I can tell you that managing and backing up 3TB of images in a database is an absolute nightmare. – Graymatter Jul 09 '14 at 22:55
  • @Graymatter, I can tell you otherwise. The only advantage of the *file soup* is that it allows **hot** backup by definition (at the cost of losing an *unknown* amount of integrity during the backup procedure), while backing up data/control file groups *sometimes* requires a **cold** backup. Needless to say, a 24/7 live DB storage should run in a transactioned mode (e.g. Oracle's ARCHIVELOG) to allow **hot** backups. It is also worth mentioning the tons of absolutely useless file metadata, which is pretty slow to back up (and read, in general). – Free Consulting Jul 09 '14 at 23:33
  • @Graymatter Can you please tell me **why** it would be **impossible to back up image information from the file system?** Or why is it better to have a multi-gigabyte SQL database swarmed with unindexable BLOBs? – mg30rg Jul 11 '14 at 09:27

2 Answers


If your goal is a smaller file, you can first resize the bitmap, then apply a higher CompressionQuality rate.

As a result, the overall quality of the image will probably be better at the same file size.
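
As a rough illustration of that idea, here is a minimal sketch (the `ResizeAndCompress` name, the 1280-pixel cap and the step-down quality search are my own choices for the example, not a canonical recipe): it downscales the bitmap, then lowers CompressionQuality step by step against a TMemoryStream until the output fits in 200KB.

    // uses Graphics, Jpeg, Classes, SysUtils, Math, Types;
    function ResizeAndCompress(const SrcFile, DstFile: string;
      MaxBytes: Integer = 200 * 1024; MaxEdge: Integer = 1280): Boolean;
    var
      Src, Small: TBitmap;
      Jpg: TJpegImage;
      Stream: TMemoryStream;
      Scale: Double;
      Quality: Integer;
    begin
      Src := TBitmap.Create;
      Small := TBitmap.Create;
      Jpg := TJpegImage.Create;
      Stream := TMemoryStream.Create;
      try
        Src.LoadFromFile(SrcFile);
        // Downscale so the longer edge is at most MaxEdge pixels:
        // fewer pixels leave room for a higher quality setting.
        Scale := Min(1.0, MaxEdge / Max(Src.Width, Src.Height));
        Small.PixelFormat := pf24bit;
        Small.SetSize(Round(Src.Width * Scale), Round(Src.Height * Scale));
        Small.Canvas.StretchDraw(Rect(0, 0, Small.Width, Small.Height), Src);
        // Walk the quality down until the encoded image fits the budget.
        Quality := 90;
        repeat
          Stream.Clear;
          Jpg.Assign(Small);           // always recompress the clean bitmap
          Jpg.CompressionQuality := Quality;
          Jpg.Compress;
          Jpg.SaveToStream(Stream);
          Dec(Quality, 10);
        until (Stream.Size <= MaxBytes) or (Quality < 10);
        Stream.SaveToFile(DstFile);
        Result := Stream.Size <= MaxBytes;
      finally
        Stream.Free;
        Jpg.Free;
        Small.Free;
        Src.Free;
      end;
    end;

A binary search over the quality value would converge faster; the linear walk just keeps the sketch short.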

TJpegImage relies on the standard libjpeg library, whereas I suppose Adobe has its own tuned implementation. You can try other implementations, like the one included in GDI+, a pure Pascal implementation, or even Intel's old library.
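
For the GDI+ route, a hedged sketch follows, assuming the GDI+ wrapper units shipped with Delphi (GDIPOBJ, GDIPAPI and the GetEncoderClsid helper from GDIPUTIL); the GDI+ JPEG encoder takes an explicit quality parameter much like TJpegImage does:

    // uses Windows, SysUtils, GDIPOBJ, GDIPAPI, GDIPUTIL;
    procedure SaveJpegViaGdiPlus(const SrcFile, DstFile: string; Quality: ULONG);
    var
      Img: TGPImage;
      Clsid: TGUID;
      Params: TEncoderParameters;
    begin
      Img := TGPImage.Create(SrcFile);
      try
        // Look up the CLSID of the built-in JPEG encoder.
        if GetEncoderClsid('image/jpeg', Clsid) < 0 then
          raise Exception.Create('GDI+ JPEG encoder not found');
        // One encoder parameter: the quality level (0..100).
        Params.Count := 1;
        Params.Parameter[0].Guid := EncoderQuality;
        Params.Parameter[0].Type_ := EncoderParameterValueTypeLong;
        Params.Parameter[0].NumberOfValues := 1;
        Params.Parameter[0].Value := @Quality;
        Img.Save(DstFile, Clsid, @Params);
      finally
        Img.Free;
      end;
    end;

GDI+ must be started before use; the GDIPOBJ unit takes care of that in its initialization section.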

Arnaud Bouchez
  • Notes: 1) Delphi's libjpeg version is rather old. 2) Yes, Adobe's implementation actually differs from IJG's (see http://fotoforensics.com/tutorial-ela.php#Rainbowing). 3) It is not possible to predict how much image information the DCT will discard when compressing an arbitrary image without writing your own implementation or resorting to trial and error. – Free Consulting Jul 08 '14 at 12:36

See if your encoder supports subsampling. If you sample your Cb and Cr components at a lower rate than the Y component, you can often get better compression without a lot of quality loss. With 2:1:1 you cut the amount of data to compress by a third (per four pixels: 4 Y + 2 Cb + 2 Cr = 8 samples instead of 12); 4:1:1 cuts it nearly in half (4 Y + 1 Cb + 1 Cr = 6 samples).

This technique is called chroma subsampling. Unfortunately, Delphi's stock TJpegImage wrapper around the IJG implementation, version 6b, does not expose this capability and initializes the encoder context with all-default values, which corresponds to 4:2:0. Refer to the TJPEGImage.Compress method code, the libjpeg documentation and jpeglib.h, or to JPEGLIB.PAS from the pasjpeg package if you prefer (the latter is merely a straightforward translation of IJG implementation version 6a to Pascal).
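
Purely as an illustration (I have not compiled this; it assumes JPEGLIB.PAS keeps the C names and the comp_info layout, which the straightforward-translation point above suggests), the sampling factors of component 0 (Y) are what select the ratio once jpeg_set_defaults has run:

    { hypothetical fragment against the pasjpeg/libjpeg encoding API;
      cinfo is a jpeg_compress_struct prepared for compression.
      Y factors of 1x1 give 4:4:4 (no subsampling), 2x1 gives 2:1:1,
      2x2 is the 4:2:0 default, 4x1 gives 4:1:1. }
    jpeg_set_defaults(@cinfo);
    cinfo.comp_info^[0].h_samp_factor := 4;  { Y horizontal factor -> 4:1:1 }
    cinfo.comp_info^[0].v_samp_factor := 1;  { Y vertical factor }
    { components 1 and 2 (Cb, Cr) keep their default 1x1 factors }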

Also, you can migrate to one of the other implementations. In addition to @Arnaud Bouchez's list, it is worth mentioning yet another Pascal implementation available for reuse: http://www.colosseumbuilders.com/sourcecode

Free Consulting
user3344003