
I am using the v7 Palette API to extract a color from an image, but the extracted color is inaccurate almost 50% of the time. I want to extract the most prominent color from the image. The code I am using:

  final Palette palette = Palette.from(source).generate();
  if (palette != null) {
      // Try the vibrant profiles first, then fall back to the muted ones
      Palette.Swatch swatch = palette.getVibrantSwatch();
      if (swatch == null) swatch = palette.getDarkVibrantSwatch();
      if (swatch == null) swatch = palette.getLightVibrantSwatch();
      if (swatch == null) swatch = palette.getMutedSwatch();
      if (swatch == null) swatch = palette.getDarkMutedSwatch();
      if (swatch == null) swatch = palette.getLightMutedSwatch();
  }

I tried all the swatch methods, but none of them extracted the color accurately more than 50% of the time.

Is there any method to do it?

JAAD

1 Answer


The Palette class is not made to accurately extract pixel colors from an image; rather, its purpose is to produce an aesthetically pleasing set of colors that can be used for UI purposes only.

The only things you can do are:

  1. Increase the number of colors used by the Palette algorithm.
  2. Increase the size to which the largest dimension of the image is scaled down.

You can tweak the builder settings before generating the palette:

final Palette palette
    = Palette
        .from(source)
        .maximumColorCount(numberOfColors)
        .resizeBitmapSize(bitmapLargestDimension)
        .generate();

Beware that generating a palette is very expensive!
You are using the synchronous version, which blocks the calling thread; be sure you know what you are doing, especially if you increase the settings above.
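Palette also has an asynchronous variant, `Palette.from(source).generate(PaletteAsyncListener)`, which runs the work off the calling thread and invokes a callback when done. The general pattern of offloading the expensive call can be sketched in plain Java (with a hypothetical `expensiveGenerate()` standing in for `Palette.from(bitmap).generate()`, since the real call needs the Android framework):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class OffloadExample {
    // Stand-in for the expensive Palette.from(bitmap).generate() call
    static int expensiveGenerate() {
        return 42;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Submit the heavy work so the calling (e.g. UI) thread stays responsive
        Future<Integer> result = pool.submit(OffloadExample::expensiveGenerate);
        // ... the calling thread is free to do other work here ...
        System.out.println("generated: " + result.get()); // blocks only when the value is needed
        pool.shutdown();
    }
}
```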


For information purposes only, here is a summary of what Palette does.

I cannot find the latest source of the Palette class; I assume that some of the code originally in Palette has been refactored into Palette.Builder.

If you look at the source code of the generate method you will see that

  1. The image is scaled down so that its largest dimension fits within 100 pixels (by default).
  2. The image is quantized¹ so that at most 16 colors remain (by default).

The quantizer does not return an image but rather the list of remaining colors, each wrapped in a class called Swatch (which offers more semantics than a simple int).
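Since the question asks for the most prominent color: the full swatch list is available via `palette.getSwatches()`, and the swatch representing the most pixels is the dominant one. A minimal self-contained sketch of that selection (using a stand-in Swatch class instead of the Android one):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class DominantSwatch {
    // Stand-in for Palette.Swatch: a color plus the number of pixels it represents
    static class Swatch {
        final int rgb;
        final int population;
        Swatch(int rgb, int population) { this.rgb = rgb; this.population = population; }
    }

    // Pick the swatch with the largest population, i.e. the most prominent color
    static Swatch dominant(List<Swatch> swatches) {
        return swatches.stream()
                .max(Comparator.comparingInt(s -> s.population))
                .orElse(null);
    }

    public static void main(String[] args) {
        List<Swatch> swatches = Arrays.asList(
                new Swatch(0xFF0000, 120),
                new Swatch(0x00FF00, 870),   // most pixels -> dominant
                new Swatch(0x0000FF, 10));
        System.out.println(Integer.toHexString(dominant(swatches).rgb)); // prints "ff00"
    }
}
```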

After that, an instance of Palette is built and the profile swatches (Vibrant, Vibrant Dark, Vibrant Light, Muted, Muted Dark, Muted Light) are searched for among the swatches returned by the quantizer.

A profile defines a range of acceptable saturation and luma values, along with the ideal saturation and luma values.
The search is performed by looking for swatches that fall within a profile's range.

Since multiple swatches can match, a weighting function is computed.
This function gives a higher score to swatches whose saturation and luma values are closer to the ideal ones and which represent more pixels.
Luma matching weighs twice as much as saturation matching, which in turn weighs three times as much as the population count.
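To the best of my knowledge the support library's DefaultGenerator uses the weights 3 for saturation, 6 for luma, and 1 for population; under that assumption the scoring can be sketched as a weighted mean of "closeness to target" terms (a simplification of the real code, not a copy of it):

```java
public class SwatchScore {
    // Weights assumed from the support library's DefaultGenerator
    static final float WEIGHT_SATURATION = 3f;
    static final float WEIGHT_LUMA = 6f;
    static final float WEIGHT_POPULATION = 1f;

    // Higher score: saturation/luma closer to the profile's targets, and more pixels
    static float score(float saturation, float targetSaturation,
                       float luma, float targetLuma,
                       int population, int maxPopulation) {
        float saturationScore = invertDiff(saturation, targetSaturation) * WEIGHT_SATURATION;
        float lumaScore = invertDiff(luma, targetLuma) * WEIGHT_LUMA;
        float populationScore = ((float) population / maxPopulation) * WEIGHT_POPULATION;
        float sumWeights = WEIGHT_SATURATION + WEIGHT_LUMA + WEIGHT_POPULATION;
        return (saturationScore + lumaScore + populationScore) / sumWeights;
    }

    // 1 when the value equals the target, falling off linearly with the distance
    static float invertDiff(float value, float target) {
        return 1f - Math.abs(value - target);
    }

    public static void main(String[] args) {
        // A swatch exactly on target that holds all the pixels gets the maximum score of 1
        System.out.println(score(0.5f, 0.5f, 0.74f, 0.74f, 100, 100)); // prints "1.0"
    }
}
```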


¹ Quantization does not work by reducing the color space (e.g. going from 24 bits per pixel to 4 bits per pixel) but rather by averaging colors (along the largest dimension) until their number is less than or equal to the given threshold.
See the ColorCutQuantizer source.
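A toy illustration of averaging-style quantization (deliberately not the real ColorCutQuantizer, which splits color boxes along their largest dimension): repeatedly merge the two closest colors into their component-wise average until the count drops to the threshold.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ToyQuantizer {
    // Toy quantizer: repeatedly average the two closest colors (as {r,g,b} triples)
    // until at most maxColors remain. Illustration only, not ColorCutQuantizer.
    static List<int[]> quantize(List<int[]> colors, int maxColors) {
        List<int[]> cs = new ArrayList<>(colors);
        while (cs.size() > maxColors) {
            int bestI = 0, bestJ = 1;
            long bestDist = Long.MAX_VALUE;
            // Find the closest pair of remaining colors
            for (int i = 0; i < cs.size(); i++) {
                for (int j = i + 1; j < cs.size(); j++) {
                    long d = dist(cs.get(i), cs.get(j));
                    if (d < bestDist) { bestDist = d; bestI = i; bestJ = j; }
                }
            }
            int[] a = cs.get(bestI), b = cs.remove(bestJ);
            // Merge the closest pair into its component-wise average
            cs.set(bestI, new int[] { (a[0]+b[0])/2, (a[1]+b[1])/2, (a[2]+b[2])/2 });
        }
        return cs;
    }

    // Squared Euclidean distance in RGB space
    static long dist(int[] a, int[] b) {
        long dr = a[0]-b[0], dg = a[1]-b[1], db = a[2]-b[2];
        return dr*dr + dg*dg + db*db;
    }

    public static void main(String[] args) {
        List<int[]> in = Arrays.asList(
                new int[]{0, 0, 0}, new int[]{10, 10, 10}, new int[]{200, 200, 200});
        System.out.println(quantize(in, 2).size()); // prints "2"
    }
}
```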

Margaret Bloom
  • I am doing it in a thread, that is why I called it synchronous @margaret – JAAD Feb 27 '16 at 09:38
  • But I need a method to get the most accurate colors, as some apps on the market do that – JAAD Feb 27 '16 at 09:42
  • You can get the pixels of the image and do whatever you want with them. Look at [these lines of code](http://grepcode.com/file/repository.grepcode.com/java/ext/com.google.android/android/5.1.1_r1/android/support/v7/graphics/ColorCutQuantizer.java#72). – Margaret Bloom Feb 27 '16 at 10:01
  • Thank you, I didn't know about the bitmap resize; it clearly solves my issue. It is slower, as expected; I tried setting it to 1000 for better results and got them, though it is still slower. You were very helpful, so I accept your answer – JAAD Feb 27 '16 at 11:00