
I have written a script that uses OpenCV to detect keypoints in three consecutive frames and then finds the points that are matched across all three images. I am trying to estimate a motion vector by tracking the coordinates of each matched point. For each matched point I take its three coordinate pairs (frame 1, frame 2, frame 3) and fit a 3rd-degree polynomial model using numpy.polyfit. I loop over every point, collect the four polynomial coefficients from each fit into four lists, and finally average each list to arrive at the 'best'(?) model. The problem is that outliers in the dataset inflate the standard deviation of the coefficients, which leads to a model that doesn't fit the data adequately. My question is whether there is a function, or any other way, to get rid of the loop. Below is the part of the script that does what I described above:

import numpy as np

# One list per polynomial coefficient
c1, c2, c3, c4 = [], [], [], []

# Fit a cubic to each matched point's coordinates across the three frames
for i in range(len(matc1)):
    x_vs = np.array([matc1[i][0, 0], matn2[i][0, 0], matn3[i][0, 0]])
    y_vs = np.array([matc1[i][0, 1], matn2[i][0, 1], matn3[i][0, 1]])
    coeffs = np.polyfit(x_vs, y_vs, deg=3)
    c1.append(coeffs[0])
    c2.append(coeffs[1])
    c3.append(coeffs[2])
    c4.append(coeffs[3])

# Average each coefficient over all matched points
c1 = sum(c1) / len(c1)
c2 = sum(c2) / len(c2)
c3 = sum(c3) / len(c3)
c4 = sum(c4) / len(c4)
coeffs = np.array([c1, c2, c3, c4])
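
One direction I am considering is sketched below. Note that np.polyfit with deg=3 on only three samples is underdetermined (a cubic has four coefficients), so it raises a RankWarning; the sketch therefore fits a quadratic instead, which three points determine exactly. It assumes matc1, matn2, and matn3 are (N, 1, 2) arrays in the usual OpenCV point layout (so that matc1[i][0, 0] is point i's x coordinate in frame 1), solves all fits in one batched call, and aggregates with the median rather than the mean, since the median is far less sensitive to outliers:

import numpy as np

# Assumes matc1, matn2, matn3 are (N, 1, 2) float arrays in the usual
# OpenCV point layout, so matc1[i][0, 0] is point i's x in frame 1.
def fit_motion_model(matc1, matn2, matn3):
    # Stack per-frame coordinates into (N, 3) arrays, one row per point
    xs = np.stack([matc1[:, 0, 0], matn2[:, 0, 0], matn3[:, 0, 0]], axis=1)
    ys = np.stack([matc1[:, 0, 1], matn2[:, 0, 1], matn3[:, 0, 1]], axis=1)

    # One 3x3 Vandermonde matrix per point, row j = [x_j**2, x_j, 1]
    V = np.stack([xs**2, xs, np.ones_like(xs)], axis=2)      # (N, 3, 3)

    # Batched exact solve: three samples determine a quadratic exactly
    coeffs = np.linalg.solve(V, ys[:, :, None])[:, :, 0]     # (N, 3)

    # Median over points instead of mean: robust to outlier tracks
    return np.median(coeffs, axis=0)

One caveat: if a point's x coordinate repeats across the three frames, its Vandermonde matrix is singular and np.linalg.solve raises LinAlgError, so such points would need to be filtered out first.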