
I am using the OpenCV iOS SDK.

While capturing with the device preset AVCaptureSessionPresetiFrame1280x720, I would like to get motion vectors from the H.264 frames at 30 fps.

I want to get numerical vectors, and I believe OpenCV can be helpful, but it hasn't been easy for me to research this quickly.

What OpenCV cv::Mat methods should I use?

Is it more a matter of finding the correct iterator to extract the motion vectors?

Does it involve encoding the captured video data into H.264 frames before doing anything?

petershine

2 Answers


OpenCV doesn't implement an H.264 encoder itself; it uses external libraries to compress video, and you can see how that is done on iOS here. If you want to extract the motion vectors, you need to find an H.264 decoder that returns them, or modify an open-source one to do so.

This project probably can help you.
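For what it's worth, FFmpeg is one open-source decoder that can hand back the motion vectors as per-frame side data when the decoder is opened with the flags2 +export_mvs option. The sketch below only shows how to read them from a decoded frame; it assumes you have FFmpeg built for iOS and a working decode loop producing AVFrames, none of which comes from the OpenCV iOS SDK:

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/frame.h>
#include <libavutil/motion_vector.h>
}
#include <cstdio>

// Print the motion vectors the decoder attached to one decoded frame.
// The decoder must have been opened with motion-vector export enabled, e.g.:
//   AVDictionary *opts = NULL;
//   av_dict_set(&opts, "flags2", "+export_mvs", 0);
//   avcodec_open2(decoderCtx, codec, &opts);
static void printMotionVectors(const AVFrame *frame)
{
    AVFrameSideData *sd =
        av_frame_get_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS);
    if (!sd)
        return;  // no motion vectors on this frame (e.g. intra-only)

    const AVMotionVector *mvs = (const AVMotionVector *)sd->data;
    const size_t count = sd->size / sizeof(*mvs);
    for (size_t i = 0; i < count; ++i) {
        // Each entry describes one block's displacement from
        // (src_x, src_y) to (dst_x, dst_y), with block size w x h.
        std::printf("%dx%d block: (%d,%d) -> (%d,%d)\n",
                    mvs[i].w, mvs[i].h,
                    mvs[i].src_x, mvs[i].src_y,
                    mvs[i].dst_x, mvs[i].dst_y);
    }
}
```

Whether a software decode plus this copying keeps up at 30 fps for 1280x720 on an iOS device is something you would have to measure yourself.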

fireant

OpenCV can calculate motion vectors directly: you should take a look at the Optical Flow algorithms.
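For example, dense optical flow gives you a numerical (dx, dy) vector for every pixel between two consecutive frames. Here is a minimal sketch using Farnebäck's method, assuming you have already converted the captured frames to grayscale cv::Mats in your capture callback (the parameter values are just the commonly used ones from the OpenCV documentation):

```cpp
#include <opencv2/opencv.hpp>

// Dense optical flow between two consecutive grayscale frames.
// Returns a CV_32FC2 Mat in which flow.at<cv::Point2f>(y, x) is the
// displacement of that pixel from prevGray to currGray.
cv::Mat denseMotionVectors(const cv::Mat& prevGray, const cv::Mat& currGray)
{
    cv::Mat flow;
    cv::calcOpticalFlowFarneback(prevGray, currGray, flow,
                                 0.5,   // pyramid scale
                                 3,     // pyramid levels
                                 15,    // averaging window size
                                 3,     // iterations per pyramid level
                                 5,     // pixel neighborhood for polynomial expansion
                                 1.2,   // Gaussian sigma for the expansion
                                 0);    // flags
    return flow;
}
```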

To get going with iOS and OpenCV, you might want to check out my starter project (Swift or Objective-C). As the C++ OpenCV code is cleanly separated from the Swift and Objective-C, you can drop the photo-stitching part of the example quite easily and replace it with some optical-flow processing; this sample code from the OpenCV distribution may help.
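If dense flow turns out to be too heavy for 1280x720 at 30 fps, a sparse variant is to track a few hundred corners with pyramidal Lucas-Kanade instead. This is only a sketch of the kind of per-frame processing you could drop in (the function name and parameter choices are my own, and I haven't profiled it on a device):

```cpp
#include <opencv2/opencv.hpp>
#include <utility>
#include <vector>

// Sparse motion vectors: detect strong corners in the previous frame and
// track them into the current frame. Each returned pair is (from, to) in
// pixel coordinates.
std::vector<std::pair<cv::Point2f, cv::Point2f>>
sparseMotionVectors(const cv::Mat& prevGray, const cv::Mat& currGray)
{
    std::vector<cv::Point2f> prevPts, currPts;
    cv::goodFeaturesToTrack(prevGray, prevPts,
                            200,    // max corners
                            0.01,   // quality level
                            10);    // min distance between corners

    std::vector<std::pair<cv::Point2f, cv::Point2f>> vectors;
    if (prevPts.empty())
        return vectors;

    std::vector<uchar> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(prevGray, currGray, prevPts, currPts, status, err);

    for (size_t i = 0; i < prevPts.size(); ++i)
        if (status[i])  // keep only points that were successfully tracked
            vectors.emplace_back(prevPts[i], currPts[i]);

    return vectors;
}
```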

I have no idea if you will achieve your required framerate doing it this way, and it certainly won't give you the same result as the H.264-encoded motion vectors. Extracting them directly from H.264-encoded files is a really tempting way to go (as I see you are trying to do). Like you, I am trying to solve this using AVFoundation. If I get there, I will post a reply to your other question.

foundry