When it comes to comparing colors during image analysis, you'll soon discover that you can often just use a grayscale image. Why? Because usually you do this:
double average = (color.r + color.g + color.b) / 3.0; // 3.0 so the division isn't truncated to an integer
Based on grayscale average colors, I made an algorithm that is actually quite satisfying when it comes to finding an object on the screen (I searched the whole desktop, but that is beside the point):
Search by average color took 67 ms, while searching by exact pixel match (blue frame) took 1.255 seconds! (And the exact-match search terminated right after finding the first match, while the average-color algorithm scans the whole image.)
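For context, a minimal sketch of that matching loop, assuming a hypothetical rectAverage(x, y, w, h) helper that returns the average brightness of a rectangle of the big image, and a precomputed templateAvg for the image being searched for (these names are illustrative, not the actual code):

#include <cmath>
#include <cstdio>

// Hypothetical helper: average grayscale brightness of a w*h rectangle
// whose top-left corner is (x, y) in the big image.
double rectAverage(int x, int y, int w, int h);

void findByAverage(int bigW, int bigH, int tplW, int tplH,
                   double templateAvg, double tolerance) {
    for (int y = 0; y + tplH <= bigH; ++y) {
        for (int x = 0; x + tplW <= bigW; ++x) {
            // Compare the candidate rectangle's average with the template's.
            if (std::fabs(rectAverage(x, y, tplW, tplH) - templateAvg) <= tolerance)
                std::printf("candidate match at (%d, %d)\n", x, y);
        }
    }
}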
But I wanted to improve precision on GUIs. In a GUI, a red button looks just like a blue button in grayscale and may be matched wrongly. This is why I implemented a color-sensitive integral image. Now I have discovered that I don't know how to properly compare the color sums to get a real color difference.
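Roughly, by "color-sensitive integral image" I mean one summed-area table per channel, so the per-channel sum over any rectangle costs four lookups. This is only a sketch; the Pixel struct, the getPixel() accessor, and the flat indexing are placeholders, not my real code:

#include <vector>

struct Pixel { unsigned char r, g, b; };
Pixel getPixel(int x, int y);                     // hypothetical accessor

// sat[c][(y+1)*(w+1) + (x+1)] = sum of channel c over (0,0)..(x,y) inclusive;
// row 0 and column 0 form a zero border so the recurrence needs no bounds checks.
std::vector<double> sat[3];

void buildIntegralImage(int w, int h) {
    for (int c = 0; c < 3; ++c)
        sat[c].assign((w + 1) * (h + 1), 0.0);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            Pixel p = getPixel(x, y);
            double v[3] = { (double)p.r, (double)p.g, (double)p.b };
            for (int c = 0; c < 3; ++c)
                sat[c][(y + 1) * (w + 1) + (x + 1)] =
                      v[c]
                    + sat[c][ y      * (w + 1) + (x + 1)]
                    + sat[c][(y + 1) * (w + 1) +  x     ]
                    - sat[c][ y      * (w + 1) +  x     ];
        }
}

// Sum of channel c over the rectangle [x0, x0+rw) x [y0, y0+rh).
double channelSum(int c, int w, int x0, int y0, int rw, int rh) {
    int x1 = x0 + rw, y1 = y0 + rh;
    return sat[c][y1 * (w + 1) + x1] - sat[c][y0 * (w + 1) + x1]
         - sat[c][y1 * (w + 1) + x0] + sat[c][y0 * (w + 1) + x0];
}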
So imagine you have two arrays of three elements each:
// Summed colors of the image you're looking for
double sumOnSearchedImage[3];
// Summed colors of the currently checked rectangle (inside a search loop we won't bother with here)
double sumOnBigImage[3];
Every number in the arrays represents the red, blue, and green sum (not average), respectively. How do you compare these so that the difference between rgb(0, 255, 255) and rgb(255, 255, 255) is larger than the difference between rgb(170, 170, 170) and rgb(255, 255, 255)?
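To show why this isn't trivial: both pairs have the same grayscale-average difference (170 vs. 255, i.e. 85), and even the same per-channel sum of absolute differences (255), while a Euclidean per-channel distance does separate them (255 vs. about 147). A tiny self-contained check of that arithmetic, using plain RGB triples rather than the integral-image sums:

#include <cmath>
#include <cstdio>

// Two common per-channel distances, just to illustrate the example above.
double manhattan(const double a[3], const double b[3]) {
    return std::fabs(a[0]-b[0]) + std::fabs(a[1]-b[1]) + std::fabs(a[2]-b[2]);
}
double euclidean(const double a[3], const double b[3]) {
    double dr = a[0]-b[0], dg = a[1]-b[1], db = a[2]-b[2];
    return std::sqrt(dr*dr + dg*dg + db*db);
}

int main() {
    double cyan[3]  = {0, 255, 255};
    double gray[3]  = {170, 170, 170};
    double white[3] = {255, 255, 255};
    // Manhattan: 255 for both pairs -- it cannot tell them apart.
    std::printf("manhattan: %.1f vs %.1f\n",
                manhattan(cyan, white), manhattan(gray, white));
    // Euclidean: 255.0 vs ~147.2 -- the cyan/white pair comes out larger.
    std::printf("euclidean: %.1f vs %.1f\n",
                euclidean(cyan, white), euclidean(gray, white));
    return 0;
}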