
I have been looking around for a bit and I know that people are able to track faces with Core Image and OpenGL. However, I am not sure where to start the process of tracking a colored ball with the iOS camera.

Once I have a lead on tracking the ball, I hope to create something to detect when the ball changes direction.

Sorry I don't have source code, but I am unsure where to even start.

DTDev
  • Start by capturing the live video stream. Once you have that, see if you can read the individual frames. After that, things get interesting. – Dan Pichelman Apr 12 '13 at 21:17
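
For example, a minimal Swift sketch of that first step using AVFoundation (the `FrameGrabber` class name and queue label are illustrative, and error handling and `canAdd` checks are omitted):

    import AVFoundation

    // Capture live video and receive each frame as a pixel buffer.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))

            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(output)
            session.startRunning()
        }

        // Called once per captured frame; the pixel buffer is what a
        // ball-tracking pipeline would consume.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // ... inspect or process pixelBuffer here ...
            _ = pixelBuffer
        }
    }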

2 Answers


The key point is image preprocessing and filtering. You can use the camera APIs to get the video stream, take a snapshot picture from it, and then process each frame roughly as follows:

  1. Apply a Gaussian blur (spatial smoothing).
  2. Apply a luminance average threshold filter to produce a black-and-white image.
  3. Apply morphological preprocessing (opening and closing operators) to hide the small noise.
  4. Run an edge-detection algorithm (for example, a Prewitt operator). After these steps only the edges remain, and your ball should appear as a circle (assuming the recording environment is ideal).
  5. Use a Hough transform to find the center of the ball (a minimal sketch follows this list).

Record the ball's position so that in the next frame you only need to process the small part of the picture around the ball.
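To make the last step concrete, here is a minimal, unoptimized circle Hough transform sketch in Swift. The function name, the 72-step angle sampling, and the array-of-arrays edge-image representation are illustrative choices, not taken from any library:

    import Foundation

    // Vote for possible circle centers: each edge pixel votes for every
    // point that lies at distance r from it, for each candidate radius r.
    func houghCircleCenter(edges: [[Bool]], radii: ClosedRange<Int>) -> (x: Int, y: Int)? {
        let height = edges.count
        guard height > 0 else { return nil }
        let width = edges[0].count
        var votes = [[Int]](repeating: [Int](repeating: 0, count: width), count: height)

        for y in 0..<height {
            for x in 0..<width where edges[y][x] {
                for r in radii {
                    for step in 0..<72 {
                        let theta = Double(step) * (2.0 * Double.pi / 72.0)
                        let cx = x - Int((Double(r) * cos(theta)).rounded())
                        let cy = y - Int((Double(r) * sin(theta)).rounded())
                        if cx >= 0, cx < width, cy >= 0, cy < height {
                            votes[cy][cx] += 1
                        }
                    }
                }
            }
        }

        // The accumulator cell with the most votes is the most likely center.
        var best = (x: 0, y: 0, count: 0)
        for y in 0..<height {
            for x in 0..<width where votes[y][x] > best.count {
                best = (x, y, votes[y][x])
            }
        }
        return best.count > 0 ? (best.x, best.y) : nil
    }

In practice you would restrict `radii` to the expected ball size and only scan the region around the last known position, exactly as described above.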

Another keyword to search for is blob detection.

A fast library for image processing on the GPU (via OpenGL) is Brad Larson's GPUImage library: https://github.com/BradLarson/GPUImage

It implements all the needed filters (except the Hough transform).
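
For instance, the pipeline above might be wired up roughly like this in Swift. The filter class names come from the library's headers, but I haven't verified the exact bridged signatures, so treat this as a sketch of the wiring rather than working code:

    import UIKit
    import AVFoundation
    import GPUImage

    // Camera -> blur -> threshold -> morphological closing -> edge detection.
    let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.vga640x480.rawValue,
                                     cameraPosition: .back)
    camera.outputImageOrientation = .portrait

    let blur = GPUImageGaussianBlurFilter()                   // spatial smoothing
    let threshold = GPUImageAverageLuminanceThresholdFilter() // black-and-white
    let closing = GPUImageClosingFilter()                     // hide small noise
    let edges = GPUImagePrewittEdgeDetectionFilter()          // keep only edges

    camera.addTarget(blur)
    blur.addTarget(threshold)
    threshold.addTarget(closing)
    closing.addTarget(edges)

    let preview = GPUImageView(frame: UIScreen.main.bounds)
    edges.addTarget(preview)
    camera.startCameraCapture()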

Peteee24
  • To add to what Peteee24 says here, the ColorObjectTracking example in the above-linked framework does exactly what you want here. You tap and drag to identify a color to track and the threshold size for that color, then the application will find the centroid of pixels containing that color and follow it from the live camera feed. It's not as robust as the process Peteee24 describes, but it works well for simple colored balls, as I show here: http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios – Brad Larson Apr 13 '13 at 19:04
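
A bare-bones version of that centroid idea, independent of GPUImage, might look like the following Swift sketch; the function, the tightly packed RGBA buffer layout, and the default tolerance are assumptions for illustration:

    // Average the coordinates of every pixel whose color is close enough
    // to the target color; the mean is the blob's centroid.
    func centroid(ofColorNear target: (r: UInt8, g: UInt8, b: UInt8),
                  in rgba: [UInt8], width: Int, height: Int,
                  tolerance: Int = 40) -> (x: Double, y: Double)? {
        var sumX = 0, sumY = 0, count = 0
        for y in 0..<height {
            for x in 0..<width {
                let i = (y * width + x) * 4
                let dr = abs(Int(rgba[i])     - Int(target.r))
                let dg = abs(Int(rgba[i + 1]) - Int(target.g))
                let db = abs(Int(rgba[i + 2]) - Int(target.b))
                if dr + dg + db < tolerance {   // pixel matches the tracked color
                    sumX += x; sumY += y; count += 1
                }
            }
        }
        guard count > 0 else { return nil }
        return (Double(sumX) / Double(count), Double(sumY) / Double(count))
    }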

The tracking process can be defined as follows:

  1. Start with the initial coordinates and dimensions of an object with given visual characteristics (image features).
  2. In the next video frame, find the same visual characteristics near the coordinates from the last frame.

"Near" means considering basic transformations relative to the last frame:

  • translation in each direction;
  • scale;
  • rotation.

The variation of these transformations is strictly related to the frame rate: the higher the frame rate, the nearer the position will be in the next frame.
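
For illustration, a naive sum-of-squared-differences search over a small window around the last known position might look like this in Swift; the grayscale array-of-arrays frames and all names are made up for the sketch:

    // Slide the template (the ball's appearance from the last frame) over a
    // window around its previous coordinate; keep the best-matching offset.
    func findBall(template: [[Int]], frame: [[Int]],
                  lastX: Int, lastY: Int, searchRadius: Int) -> (x: Int, y: Int) {
        let th = template.count, tw = template[0].count
        var best = (x: lastX, y: lastY, score: Int.max)
        for dy in -searchRadius...searchRadius {
            for dx in -searchRadius...searchRadius {
                let ox = lastX + dx, oy = lastY + dy
                guard oy >= 0, ox >= 0,
                      oy + th <= frame.count, ox + tw <= frame[0].count else { continue }
                var score = 0
                for ty in 0..<th {
                    for tx in 0..<tw {
                        let d = frame[oy + ty][ox + tx] - template[ty][tx]
                        score += d * d
                    }
                }
                if score < best.score { best = (ox, oy, score) }
            }
        }
        return (best.x, best.y)
    }

Handling scale and rotation would mean also searching over resized and rotated versions of the template; this sketch covers translation only.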

Marvin Framework provides plug-ins and examples to perform this task. It's not compatible with iOS yet; however, it is open source, and I think you can port the source code easily.

This video demonstrates some tracking features, starting at 1:10.

Gabriel Archanjo