
I'm trying to make a program that detects people in CCTV footage, and I've made a lot of progress. Unfortunately, the amount of noise in the videos varies a lot between cameras and times of day, and therefore between the sample videos. This means the NoiseSigma needed varies from 1 to 25.

I've used the fastNlMeansDenoisingColored function and that helped a bit, but choosing the right NoiseSigma is still an issue.
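
For reference, the kind of call I mean (frame stands for the current cv::Mat, and the parameter values here are just illustrative, not tuned settings):

#include <opencv2/photo.hpp>

// Non-local means denoising of one colour frame; h and hColor set the
// filter strength for the luminance and colour components respectively.
cv::Mat denoised;
cv::fastNlMeansDenoisingColored(frame, denoised, 10.0f, 10.0f, 7, 21);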

Would it be effective to loop through the video once, somehow get an idea of how noisy it is, and derive a relationship between that noise measure and NoiseSigma? Any ideas would be welcome.

Kieran

1 Answer


I don't think it's possible to determine the noise level in an image (or video) without reference data that contains no noise. One thing that comes to mind is to record some static scenery, measure how much consecutive frames differ from each other, and then try to find some relationship (hopefully linear) between that measure and NoiseSigma. Since the scene is static, any frame-to-frame difference comes purely from noise, so if there were no noise the accumulated difference between frames would be 0. By accumulated difference I mean something like this:

// frames: std::vector<cv::Mat> holding the video's frames
double cumulativeError = 0.0;
for (size_t i = 1; i < frames.size(); ++i)
{
    cv::Mat diff;
    cv::absdiff(frames[i], frames[i - 1], diff);  // per-pixel |frame(i) - frame(i-1)|
    cumulativeError += cv::sum(diff)[0] + cv::sum(diff)[1] + cv::sum(diff)[2];
}
cumulativeError /= frames.size();

Here cv::absdiff takes the per-pixel absolute difference between consecutive frames, and cv::sum adds up all the elements of each channel to produce a scalar value. Please keep in mind that I'm just following my intuition here; it's not a method I've seen before.
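
If that measure does track the noise roughly linearly, you could calibrate it once against the NoiseSigma values you've already found by hand for your sample videos, then clamp the result to the 1-25 range you mentioned. A minimal sketch, where fitSigma is a hypothetical helper and a and b come from your own calibration (e.g. a least-squares fit):

#include <algorithm>  // std::clamp (C++17)

// Hypothetical mapping from the frame-difference measure to a NoiseSigma value.
// a and b are coefficients fitted on videos where a good sigma is already known.
double fitSigma(double measure, double a, double b)
{
    return std::clamp(a * measure + b, 1.0, 25.0);  // sigma range from the question
}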

Max Walczak