
So here is the context.

I created a script in Python with YOLOv4, OpenCV, CUDA, and cuDNN for object detection and object tracking, to count the objects in a video. I intend to use it in real time, but what does real time really mean? The video I'm using is 1 minute long at 60 FPS originally, but after processing it averages 30 FPS and takes 3 minutes to finish. So comparing both videos side by side, one is clearly faster. 30 FPS is the industry standard for movies and such. I'm trying to wrap my head around what real time truly means.
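
Roughly, the timing looks like this (a simplified sketch; the filename is made up and `detect_and_track` is just a placeholder for the actual YOLOv4 + tracking code):

```python
import time
import cv2

def detect_and_track(frame):
    pass  # placeholder for the YOLOv4 inference + tracking step

cap = cv2.VideoCapture("traffic.mp4")   # hypothetical 1-minute, 60 FPS clip
source_fps = cap.get(cv2.CAP_PROP_FPS)

frames = 0
start = time.perf_counter()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    detect_and_track(frame)
    frames += 1

elapsed = time.perf_counter() - start
print(f"source: {source_fps:.1f} FPS, processing: {frames / elapsed:.1f} FPS")
```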

Imagine I need to use this information for traffic light management, or to lift a bridge for a passing boat; it should happen automatically. It's time sensitive, or the chaos would be visible. In these cases, what does it truly mean to be real time?


1 Answer


First, learn what "real-time" means. Wikipedia: https://en.wikipedia.org/wiki/Real-time_computing

Understand the terms "hard" and "soft" real-time. Understand which aspects of your environment are soft and which require hard real-time.

Understand the response times that your environment requires. Understand the time scales.

This does not involve fuzzy terms like "quick" or "significant" or "accurate". It involves actual quantifiable time spans that depend on your task and its environment, acceptable error rates, ...
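
For example (a sketch only; the 100 ms budget is an arbitrary illustration, not a number derived from your application): pick a per-frame deadline and measure how often you miss it, and by how much.

```python
import time

DEADLINE = 0.100  # seconds per frame; arbitrary example budget, not a recommendation

def run_with_deadline(frames, handle_frame):
    """Process frames and count deadline misses (a soft real-time view)."""
    misses, worst = 0, 0.0
    for frame in frames:
        t0 = time.perf_counter()
        handle_frame(frame)
        dt = time.perf_counter() - t0
        worst = max(worst, dt)
        if dt > DEADLINE:
            misses += 1  # a hard real-time system would treat any miss as a failure
    return misses, worst
```

A soft real-time system tolerates occasional misses at the cost of degraded results; a hard real-time system must never miss.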

You did not share any details about your environment. I find it unlikely that you even need 30 fps for any application involving a road intersection.

You only need a frame rate high enough that you don't miss objects of interest, and that the data is fine-grained enough to track multiple objects with identity, without mistaking them for each other.

Example: assume a car moving at 200 km/h. If your camera takes a frame every 1/30 second, the car moves 1.85 meters between frames.

  • How's your motion blur? What's the camera's exposure time? I'd recommend something on the order of a millisecond or better, giving motion blur of 0.05m
  • How's your tracking? Can it deal with objects "jumping" that far between frames? Does it generate object identity information that is usable for matching (association)?
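
Putting those numbers into a few lines of Python (just the arithmetic from the example above, nothing application-specific):

```python
# Back-of-the-envelope figures for a car at 200 km/h seen by a 30 FPS camera.
speed = 200 / 3.6        # 200 km/h in m/s, about 55.6 m/s
frame_interval = 1 / 30  # seconds between frames
exposure = 0.001         # 1 ms exposure time, as suggested above

print(f"movement between frames: {speed * frame_interval:.2f} m")  # ~1.85 m
print(f"motion blur during exposure: {speed * exposure:.3f} m")    # ~0.056 m
```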
  • The camera is stationary, there is no motion blur. I'm tracking using Euclidean distance. First I identify an object uniquely and use its centroid to calculate the distance of the movement between frames; if the distance is higher than a set number of pixels then it's not the same object, if it's lower then it's the same one. – Ivan Trigueiro Dec 04 '22 at 19:55
  • motion blur comes from moving objects too. I'll just assume you haven't noticed because your objects move slowly enough. – Christoph Rackwitz Dec 04 '22 at 20:35
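
For reference, a minimal sketch of the centroid-distance association described in the first comment (the 50-pixel threshold and the data layout are made up for illustration; a real tracker would also handle occlusion, missed detections, and ties):

```python
import math

MAX_DIST = 50  # pixels; illustrative threshold, not a value from the question

def associate(prev_tracks, detections):
    """Match each detected centroid to the nearest previous centroid within MAX_DIST.

    prev_tracks: dict {track_id: (x, y)}; detections: list of (x, y) centroids.
    Returns the updated {track_id: (x, y)} dict; far-away detections get new ids.
    """
    tracks = {}
    next_id = max(prev_tracks, default=-1) + 1
    for (x, y) in detections:
        best_id, best_dist = None, MAX_DIST
        for tid, (px, py) in prev_tracks.items():
            d = math.hypot(x - px, y - py)
            if d < best_dist:
                best_id, best_dist = tid, d
        if best_id is None:  # farther than MAX_DIST from every known object -> new object
            best_id = next_id
            next_id += 1
        tracks[best_id] = (x, y)
    return tracks
```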