I'm trying to get basic motion tracking working, to be used later in a Raspberry Pi/Arduino project. I don't know very much Python yet, but I can wrap my head around the logic of what's going on pretty well. I've been using some examples to try to get it working with my laptop's built-in camera, but it seems to be tracking the entire image even when I'm outside the first frame. My guess is that the low resolution (640x480) and low frame rate (6 fps) cause jitter, and the frame-to-frame differences produced by that jitter are what it's actually tracking. From what I've read, the GaussianBlur is supposed to take care of this, but it's not. The code runs, I can see the different stages of processing in separate windows, and there is some motion detection going on, but it's very inconsistent and I can't troubleshoot what's going wrong.
import cv2, time

first_frame = None
video = cv2.VideoCapture(0)
a = 1

while True:
    a = a + 1
    check, frame = video.read()
    print(frame)

    # Convert to grayscale and blur to smooth out per-pixel noise before differencing
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)

    # The first frame is the static reference that every later frame is compared against
    if first_frame is None:
        first_frame = gray
        continue

    # Absolute difference against the reference, then threshold and dilate
    delta_frame = cv2.absdiff(first_frame, gray)
    thresh_delta = cv2.threshold(delta_frame, 25, 255, cv2.THRESH_BINARY)[1]
    thresh_delta = cv2.dilate(thresh_delta, None, iterations=2)

    # findContours returns 3 values in OpenCV 3.x (only 2 in 4.x)
    (_, cnts, _) = cv2.findContours(thresh_delta.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Draw a bounding box around any contour big enough to count as motion
    for contour in cnts:
        if cv2.contourArea(contour) < 1000:
            continue
        (x, y, w, h) = cv2.boundingRect(contour)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow('captureFrame', frame)
    cv2.imshow('captureGrey', gray)
    cv2.imshow('delta', delta_frame)
    cv2.imshow('thresh', thresh_delta)

    key = cv2.waitKey(1)
    if key == ord('q'):
        break

print(a)
video.release()
cv2.destroyAllWindows()
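From reading around, part of the problem might also be that first_frame never updates, so any slow drift in lighting or camera noise keeps showing up as a "difference" forever. One variation I've seen suggested (untested on my setup, and the alpha value is just a guess) keeps a running average of the background with cv2.accumulateWeighted instead of freezing the first frame:

import cv2

video = cv2.VideoCapture(0)
avg = None  # running average of the background, stored as float

while True:
    check, frame = video.read()
    if not check:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)

    if avg is None:
        # Initialise the running average with the first blurred frame
        avg = gray.astype("float")
        continue

    # Blend the current frame into the background model; a small alpha
    # means the background adapts slowly to lighting changes and jitter
    cv2.accumulateWeighted(gray, avg, 0.05)

    # Difference against the running average (converted back to 8-bit)
    delta_frame = cv2.absdiff(gray, cv2.convertScaleAbs(avg))
    thresh_delta = cv2.threshold(delta_frame, 25, 255, cv2.THRESH_BINARY)[1]

    cv2.imshow('delta', delta_frame)
    cv2.imshow('thresh', thresh_delta)
    if cv2.waitKey(1) == ord('q'):
        break

video.release()
cv2.destroyAllWindows()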
EDIT: it seems to have been a hardware problem, possibly related to the webcam's automatic lighting/exposure adjustment? Cannot confirm. But buying a cheap Microsoft LifeCam VX-2000 seems to have resolved the issue.
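In case anyone else hits the same thing: before replacing the camera, it might be worth trying to turn off the webcam's auto-exposure through the capture properties. Whether this actually works, and what the magic values mean, depends on the OS/driver backend (and some webcams just ignore it), so treat this as a sketch only:

import cv2

video = cv2.VideoCapture(0)

# These property IDs exist in OpenCV, but the values are backend/driver
# dependent (e.g. with V4L2, 0.25 often means "manual" and 0.75 "auto").
video.set(cv2.CAP_PROP_AUTO_EXPOSURE, 0.25)  # attempt to switch to manual exposure
video.set(cv2.CAP_PROP_EXPOSURE, -4)         # fixed exposure value, found by trial and error

# Read the properties back to see whether the driver accepted them
print(video.get(cv2.CAP_PROP_AUTO_EXPOSURE), video.get(cv2.CAP_PROP_EXPOSURE))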