
The use case: a user uploads an image through the browser, the app should convert that image to a progressive JPEG and upload it to the server, so another app can consume it and load the image progressively.

I tried some libraries like 'gm' (which uses ImageMagick), but they didn't work, so I want to know if there is some way I can do this.

Peter O.
    I assume you tried [the code in this answer](https://stackoverflow.com/a/55792256/3002584). What didn't work for you? – OfirD Jan 27 '21 at 16:37
  • What do you mean by "the app"? Are you expecting the conversion to be done in the browser or on the backend? If you are looking for browser-side conversion, I think compressorjs is a good option. – Jose Marin Jan 28 '21 at 11:25
  • If you are planning to compress the image in the browser, these might be good solutions: https://www.npmjs.com/package/compressorjs, https://www.npmjs.com/package/browser-image-compression – Swaraj Gandhi Feb 01 '21 at 16:58

3 Answers


You can use the sharp module; it has a lot of functionality and is really good. If you just want to convert to a progressive JPEG:

const sharp = require("sharp")

const main = async () => {
  try {
    // Write the progressive JPEG directly with toFile(); piping the buffer
    // through a second sharp() call would re-encode it with default
    // settings and silently drop the progressive option.
    await sharp('10.jpg')
      .jpeg({
        quality: 100, // adjust the quality
        progressive: true
      })
      .toFile('output.jpg')
  } catch(error) {
    console.log(error)
  }
}

main()

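One way to verify that the output really is progressive is to scan its JPEG markers: progressive files use an SOF2 segment (0xFF 0xC2) where baseline files use SOF0 (0xFF 0xC0). A minimal sketch (the helper name is mine):

```javascript
// Progressive JPEGs use the SOF2 marker (0xFF 0xC2) in place of the
// baseline SOF0 marker (0xFF 0xC0). Walking the marker segments from
// the start of the file is enough to tell the two apart.
function isProgressiveJpeg(buf) {
  let i = 2; // skip the initial SOI marker (0xFF 0xD8)
  while (i + 3 < buf.length) {
    if (buf[i] !== 0xff) return false; // not a valid marker segment
    const marker = buf[i + 1];
    if (marker === 0xc0) return false; // baseline JPEG
    if (marker === 0xc2) return true;  // progressive JPEG
    // skip this segment: the two length bytes include themselves
    i += 2 + buf.readUInt16BE(i + 2);
  }
  return false;
}
```

After the conversion above, `isProgressiveJpeg(require("fs").readFileSync("output.jpg"))` should return `true`.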

Bibash Adhikari

I have another working example: when an image lands in an S3 bucket, it triggers a Lambda (Node.js) that also uses the sharp lib to optimize the image size and then publishes the new image to another folder in S3. Here is an example:

"use strict";

const AWS = require("aws-sdk");
const sharp = require("sharp");
const { basename, extname } = require("path");

const S3 = new AWS.S3();

module.exports.handle = async ({ Records: records }) => {
  try {
    await Promise.all(
      records.map(async (record) => {
        const { key } = record.s3.object;

        const image = await S3.getObject({ Bucket: process.env.bucket, Key: key }).promise();

        // blob
        const optimized = await sharp(image.Body)
          .resize(1280, 720, {
            fit: "inside",
            withoutEnlargement: true,
          })
          .toFormat("jpeg", {
            progressive: true,
            quality: 50,
          })
          .toBuffer();

        await S3.putObject({
          Body: optimized,
          Bucket: process.env.bucket,
          ContentType: "image/jpeg",
          Key: `compressed/${basename(key, extname(key))}.jpg`,
        }).promise();
      })
    );

    return {
      statusCode: 301,
      body: { ok: true },
    };
  } catch (error) {
    return error;
  }
};

Try WebP instead. While it isn't designed as a progressive format, it can be used in conjunction with lazy loaders, and it compresses far better (a 1 MB JPEG file can end up as just 300 kB).

Consider this:

  • 24 kB placeholder
  • lazy load with a fade-in effect
  • 300 kB WebP

Try the webp-loader package, which can be used in webpack builds.
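The sharp module mentioned in the other answers can also produce WebP (e.g. `sharp("input.jpg").webp({ quality: 75 }).toFile("output.webp")`). To sanity-check the result, note that every WebP file starts with a RIFF container header whose format tag is "WEBP"; a minimal check (the helper name is mine):

```javascript
// A WebP file is a RIFF container: bytes 0-3 spell "RIFF",
// bytes 8-11 spell "WEBP" (bytes 4-7 hold the file size).
function isWebP(buf) {
  return (
    buf.length >= 12 &&
    buf.toString("ascii", 0, 4) === "RIFF" &&
    buf.toString("ascii", 8, 12) === "WEBP"
  );
}
```

For example, `isWebP(require("fs").readFileSync("output.webp"))` should return `true` after a successful conversion.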