I want to create a Cocoa application for Mac OS X that uses blob detection on camera input to process gestures. So far I have installed OpenCV and the cvBlob library, but I have no idea what to do next, and I haven't been able to find any information on this.
I need to process a video input, get the x and y positions of the detected blobs, and use those coordinates in the Cocoa application.
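Conceptually, I assume the processing side would look something like the rough, untested sketch below (using OpenCV's legacy capture API and cvBlob's cvLabel / centroid fields; the camera index, threshold value, and area limits are just placeholders). What I don't understand is how to wire something like this into a Cocoa application.

```cpp
#include <cstdio>
#include <cv.h>
#include <highgui.h>
#include <cvblob.h>

int main() {
    // Open the default camera (index 0 is a placeholder).
    CvCapture *capture = cvCaptureFromCAM(0);

    while (IplImage *frame = cvQueryFrame(capture)) {
        // Convert to grayscale and threshold to a binary image for labeling.
        IplImage *gray = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);
        cvCvtColor(frame, gray, CV_BGR2GRAY);
        cvThreshold(gray, gray, 100, 255, CV_THRESH_BINARY);

        // Label connected components (blobs) and drop very small ones.
        IplImage *labelImg = cvCreateImage(cvGetSize(gray), IPL_DEPTH_LABEL, 1);
        cvb::CvBlobs blobs;
        cvb::cvLabel(gray, labelImg, blobs);
        cvb::cvFilterByArea(blobs, 500, 1000000);

        // The centroid of each blob gives the x/y position I want to
        // pass on to the Cocoa layer for gesture processing.
        for (cvb::CvBlobs::const_iterator it = blobs.begin();
             it != blobs.end(); ++it) {
            double x = it->second->centroid.x;
            double y = it->second->centroid.y;
            printf("blob %u at (%f, %f)\n", it->first, x, y);
        }

        cvb::cvReleaseBlobs(blobs);
        cvReleaseImage(&labelImg);
        cvReleaseImage(&gray);
        // Note: the frame returned by cvQueryFrame is owned by the capture
        // and must not be released here.
    }

    cvReleaseCapture(&capture);
    return 0;
}
```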