I'm working on a color quantization algorithm.
This is the basic process:
- Convert the image to a set of three-dimensional vectors (RGB space, for instance).
- Put that set in a list of sets.
- While the number of sets in the list is less than the number of colors you want:
  - Remove the worst set from the list.
  - Split it in two.
  - Add the two new sets to the list.
- Done (see the code sketch below).
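In code, the loop looks roughly like this (a Python/NumPy sketch only; `quantize`, `set_error` and `split_set` are just placeholder names, and the two helpers follow my definitions of "worst set" and "split a set" further down):

```python
import numpy as np

def quantize(pixels, n_colors):
    # pixels: (N, 3) array of RGB vectors.
    # set_error / split_set are sketched below.
    sets = [np.asarray(pixels, dtype=float)]   # start with one set holding every pixel
    while len(sets) < n_colors:
        # pick the worst set (largest accumulated error) and remove it
        i = max(range(len(sets)), key=lambda k: set_error(sets[k]))
        worst = sets.pop(i)
        # split it in two and put both halves back
        a, b = split_set(worst)
        sets += [a, b]
    # the palette is the mean colour of each remaining set
    return np.array([s.mean(axis=0) for s in sets])
```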
What I mean by "worst set" is the set where the accumulated distance between each vector and the mean vector is the largest.
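Roughly (sketch; `set_error` is just the placeholder name I use for this metric):

```python
import numpy as np

def set_error(s):
    # accumulated Euclidean distance of every vector in the set to the set's mean
    mean = s.mean(axis=0)
    return np.linalg.norm(s - mean, axis=1).sum()
```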
And this is how I "split a set":
- Compute the mean vector by adding all the vectors and dividing by the vector count.
- Compute a vector composed of the absolute differences between each vector and the mean vector. Normalize it and we get the normal of a plane that divides our vector set into two equal halves.
- Use this normal to split the set into two sets, depending on which side of the plane each vector falls on (sketched in code below).
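As far as I can tell, that amounts to something like this (a sketch of my own description: I read "a vector composed of the absolute differences" as the per-channel sum of absolute differences over the whole set, and degenerate sets where that vector is all zeros aren't handled):

```python
import numpy as np

def split_set(s):
    mean = s.mean(axis=0)
    # accumulate the absolute differences to the mean, per channel,
    # and normalize the result to get the plane normal
    normal = np.abs(s - mean).sum(axis=0)
    normal /= np.linalg.norm(normal)
    # the sign of the projection onto the normal tells which side
    # of the plane (through the mean) each vector falls on
    side = (s - mean) @ normal >= 0
    return s[side], s[~side]
```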
This basically works, but the resulting palette looks odd, like it was picked out of a linear gradient...
Is my algorithm plain wrong? Can someone help?