
I want to create a Cocoa application for Mac OS X and use blob detection on camera input in order to process gestures. So far I have installed OpenCV and the cvBlob library, but I have no idea what to do from here, and I haven't been able to find any information.

I need to process a video input, get the x and y positions of blobs, and be able to use those in a Cocoa application.

matteok

1 Answer


The "red object tracking" sample file in the "samples" directory of cvblob is a good point to start. You'll have to :

  • convert your image to gray (if it isn't already)
  • threshold it (binary; the white region must be the blob you're interested in)
  • extract CvBlobs from your image (cvLabel)
  • feed your blobs to your CvTracks to track them (cvUpdateTracks)
  • render your blobs if you want (cvRenderBlobs)
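
Here is a minimal sketch of that pipeline, assuming the legacy OpenCV C API and cvblob built as described in its README; the threshold (100), area bounds (500–1000000), and tracking parameters (200., 5) are placeholder values you'd tune for your own lighting and blob size:

    // Minimal per-frame blob tracking loop with cvblob.
    // Threshold and area values below are placeholders to tune.
    #include <cstdio>
    #include <cv.h>
    #include <highgui.h>
    #include <cvblob.h>

    int main()
    {
        cvb::CvTracks tracks;                      // persists across frames
        CvCapture *capture = cvCaptureFromCAM(0);  // default camera
        cvNamedWindow("blobs", CV_WINDOW_AUTOSIZE);

        while (cvWaitKey(10) != 27)                // Esc quits
        {
            IplImage *frame = cvQueryFrame(capture);
            if (!frame) break;

            // 1. Convert to gray.
            IplImage *gray = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);
            cvCvtColor(frame, gray, CV_BGR2GRAY);

            // 2. Threshold: the blobs you care about must come out white.
            cvThreshold(gray, gray, 100, 255, CV_THRESH_BINARY);

            // 3. Label connected white regions into CvBlobs.
            IplImage *labelImg = cvCreateImage(cvGetSize(gray), IPL_DEPTH_LABEL, 1);
            cvb::CvBlobs blobs;
            cvb::cvLabel(gray, labelImg, blobs);
            cvb::cvFilterByArea(blobs, 500, 1000000); // drop tiny noise blobs

            // 4. Feed the blobs to the persistent tracks
            //    (200 px match distance, 5 frames before a track is dropped).
            cvb::cvUpdateTracks(blobs, tracks, 200., 5);

            // 5. Optional rendering, and the x/y positions you're after.
            cvb::cvRenderBlobs(labelImg, blobs, frame, frame);
            for (cvb::CvBlobs::const_iterator it = blobs.begin(); it != blobs.end(); ++it)
                printf("blob %u at (%.0f, %.0f)\n", it->first,
                       it->second->centroid.x, it->second->centroid.y);
            cvShowImage("blobs", frame);

            cvb::cvReleaseBlobs(blobs);
            cvReleaseImage(&labelImg);
            cvReleaseImage(&gray);
        }

        cvb::cvReleaseTracks(tracks);
        cvReleaseCapture(&capture);
        return 0;
    }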

Please note that you mustn't create new tracks at each tick: your CvTracks object must be declared outside of your execution method, so that it persists between frames.
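
In a Cocoa app, one way to do that is to hold the tracks in an Objective-C++ (.mm) wrapper that lives as long as the capture session. A hypothetical sketch (BlobTracker and processFrame are invented names, not part of cvblob):

    #include <cv.h>
    #include <cvblob.h>

    // Hypothetical wrapper: create one instance per capture session.
    class BlobTracker
    {
    public:
        ~BlobTracker() { cvb::cvReleaseTracks(tracks_); }

        // Call once per captured frame with an already-thresholded image.
        void processFrame(IplImage *binary)
        {
            IplImage *labelImg = cvCreateImage(cvGetSize(binary), IPL_DEPTH_LABEL, 1);
            cvb::CvBlobs blobs;                 // rebuilt for every frame...
            cvb::cvLabel(binary, labelImg, blobs);
            cvb::cvUpdateTracks(blobs, tracks_, 200., 5);
            cvb::cvReleaseBlobs(blobs);
            cvReleaseImage(&labelImg);
        }

    private:
        cvb::CvTracks tracks_;                  // ...while the tracks persist
    };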

It's quite easy; have a look at the sample file.

Benoît Lahoz
  • Hi, I am interested in cvBlob. Is it possible to track an object that is full of features by using CvTracks? I can't find a cvBlob explanation in the OpenCV documentation (version 2.4.3). Many thanks! – flyzhao May 28 '13 at 01:49
  • cvBlob is a separate library based on OpenCV; you have to download and compile it yourself. It doesn't track features but blobs (contours). – Benoît Lahoz May 28 '13 at 06:29