So... I'm working through the book Practical Computer Vision with SimpleCV, Chapter 5 - the same example as in the online tutorial here. I got a very different value for the mean color of the yellow car, so I went back through the example code, added some more comments, and added code to display (and then cleanly close) the images at each stage:

from SimpleCV import Image, Color
import time

# Load images.
car_in_lot = Image("parking-car.png")
car_not_in_lot = Image("parking-no-car.png")

# Crop image to the region of interest (the parking spot with the car).
car = car_in_lot.crop(470, 200, 200, 200)
d = car.show()
time.sleep(5)
d.quit()

# Create a greyscale image showing how far each pixel is from yellow.
yellow_car = car.colorDistance(Color.YELLOW)
d = yellow_car.show()
time.sleep(5)
d.quit()

# Subtract the greyscale image from the cropped image to keep just the yellow portions.
only_car = car - yellow_car
d = only_car.show()
time.sleep(5)
d.quit()

print only_car.meanColor()

This prints (0.6376000000000001, 2.096775, 5.170425) instead of the (25.604575, 18.880775, 4.4940750000000005) given in both the book and the tutorial.

The first image of the cropped parking spot with the car looks fine, but the greyscale image is where things look decidedly odd: the one I get is rotated 90 degrees and doesn't look at all like the one in the examples. Here's a link to it on Dropbox.

And from there, with the colorDistance result so far off from what it should be, the mean color values don't come out right.
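
For what it's worth, here's a minimal check (same file and crop region as above) that isolates the colorDistance step - if its mean is already off, the problem is in that call rather than in the subtraction:

from SimpleCV import Image, Color

# Same file and crop region as above; isolate the colorDistance step.
car = Image("parking-car.png").crop(470, 200, 200, 200)
yellow_car = car.colorDistance(Color.YELLOW)

print "colorDistance mean:", yellow_car.meanColor()
print "after subtraction:", (car - yellow_car).meanColor()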

Any ideas or suggestions as to why the colorDistance() step returned a funky rotated greyscale image like it did?

memilanuk

1 Answer

colorDistance appears to return a rotated, flipped image. If you do a quick transform afterwards, you can avoid such upsets, e.g.:

from SimpleCV import Image, Color

x, y, w, h = 470, 200, 200, 200
cImg      = Image('parking-car.png')
ncImg     = Image('parking-no-car.png')
car       = cImg.crop(x, y, w, h)
ncar      = ncImg.crop(x, y, w, h)

# colorDistance comes back rotated/flipped, so undo that before subtracting
# from the (untransformed) crops.
ycar      = car.colorDistance(Color.YELLOW).rotateRight().flipHorizontal()
nycar     = ncar.colorDistance(Color.YELLOW).rotateRight().flipHorizontal()
only_car  = car - ycar
nonly_car = ncar - nycar
carmc     = only_car.meanColor()
ncarmc    = nonly_car.meanColor()

print "yellow car present, mean color:", carmc
print "no yellow car present, mean color:", ncarmc

As to the mean color being different, I would assume that either the image has been adjusted slightly or the value of Color.YELLOW has changed...
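
A quick way to check that theory (assuming the same SimpleCV install and image file as in the question) is to print the constant and the crop's raw mean color directly:

from SimpleCV import Image, Color

# Compare the library's yellow constant and the crop's pre-subtraction mean
# color against the values used in the book/tutorial.
print "Color.YELLOW:", Color.YELLOW
print "crop mean:", Image('parking-car.png').crop(470, 200, 200, 200).meanColor()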

As an aside, if you are comparing two images that have both had colorDistance called on them (or subtracting one crop from another), they have both been transformed the same way, so you only need to apply rotateRight().flipHorizontal() to the final image before it is shown (if at all).
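
A minimal sketch of that simplification, reusing the file names and crop from the question (whether ycar - nycar is the comparison you actually want depends on what you're measuring):

from SimpleCV import Image, Color

x, y, w, h = 470, 200, 200, 200
ycar  = Image('parking-car.png').crop(x, y, w, h).colorDistance(Color.YELLOW)
nycar = Image('parking-no-car.png').crop(x, y, w, h).colorDistance(Color.YELLOW)

# Both operands carry the same (rotated/flipped) orientation, so the
# subtraction and the mean color are unaffected by it.
diff = ycar - nycar
print diff.meanColor()

# Only un-rotate if you actually want to look at the result.
diff.rotateRight().flipHorizontal().show()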

user3258790