I would like to compute the Jacobian of some object point coordinates with respect to the extrinsic and intrinsic parameters of a camera. To do so, I'm using OpenCV's `projectPoints`, which returns the Jacobian as an optional output. Here is the OpenCV documentation's signature for the function:
imagePoints, jacobian = cv.projectPoints( objectPoints, rvec, tvec, cameraMatrix, distCoeffs[, imagePoints[, jacobian[, aspectRatio]]] )
When I call `projectPoints(objectPoints, rvec, tvec, cameraMatrix, distCoeffs)`, I get this error:
cv2.error: OpenCV(4.4.0) /tmp/opencv-20200726-67088-i69mik/opencv-4.4.0/modules/calib3d/src/calibration.cpp:3558: error: (-215:Assertion failed) npoints >= 0 && (depth == CV_32F || depth == CV_64F) in function 'projectPoints'
I have read through this SO post as well as this Reddit post, both of which suggest checking the data type and size of the first argument passed to `projectPoints`. All of my variables are `ndarray`s, and their sizes match the documentation's requirements.
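To double-check, I printed the dtype and shape of each argument right before the call; a minimal sanity check (with placeholder arrays standing in for my actual ones) looks like this:

```python
import numpy as np

# Placeholder arrays standing in for my actual arguments.
objectPoints = np.zeros((6, 3), dtype=np.float64)
rvec = np.zeros((3, 1), dtype=np.float64)
tvec = np.zeros((3, 1), dtype=np.float64)
cameraMatrix = np.eye(3, dtype=np.float64)
distCoeffs = np.zeros(5, dtype=np.float64)

# The assertion requires objectPoints to be float32 or float64,
# so printing dtypes makes an accidental integer array easy to spot.
for name, arr in [("objectPoints", objectPoints), ("rvec", rvec),
                  ("tvec", tvec), ("cameraMatrix", cameraMatrix),
                  ("distCoeffs", distCoeffs)]:
    print(f"{name}: dtype={arr.dtype}, shape={arr.shape}")
```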
I have also read that the error could be caused by the actual coordinates of the points, i.e. the points do not fall inside the image frame and so no solution can be computed, but I am unsure how to fix that analytically. I am trying to simulate a set of object points seen by a camera in order to get the image points.
Is there any mathematical relationship that must hold between `objectPoints`, `rvec`, `tvec`, `cameraMatrix`, and `distCoeffs`? If not, are there any typical/standard values that I could use as a starting point?
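My current (possibly incomplete) understanding of the relationship is the pinhole model, x ≃ K [R | t] X, with distortion applied to the normalized coordinates before K, and with the constraint that each point should end up with z > 0 once transformed into the camera frame. A hand-rolled, distortion-free projection of a single point would look like this:

```python
import numpy as np

# Pinhole model sketch (no distortion): x_hom = K @ (R @ X + t), then divide by z.
R = np.eye(3)                        # rotation
t = np.array([[1.0], [0.0], [0.0]])  # translation
K = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])      # intrinsics

X = np.array([[1.0], [1.0], [1.0]])  # one world point, column vector

x_hom = K @ (R @ X + t)  # homogeneous image coordinates
x = x_hom[:2] / x_hom[2] # perspective division; requires z != 0
print(x.ravel())         # [3. 2.]
```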
Here is my code:
import cv2
import numpy as np
# SIMULATE SOME POINTS
# Rotation Matrix
R = np.eye(3)
# Translation vector
t = np.array([[1],[0],[0]])
# Camera calibration matrix
K = np.array([[1, 0, 1], [0, 1, 1], [0, 0, 1]])
# World frame coordinates of the points
Xw = np.array([[0,0,0,0], [1,1,1,0], [2,2,2,0], [3,3,3,0], [4,4,4,0], [5,5,5,0]]).T
Xw = Xw.astype('float64')
# [R|t]
Rt = np.concatenate((R,t), axis=1)
# Distortion coefficients
dist_coeffs = np.array([1, 1, 1, 1, 1])
# GET THE JACOBIAN TO SOLVE THE LEAST SQUARES EQUATION FOR t
# Transpose Xw to fit projectPoints()
Xw = Xw[:3,:].T
# Compute the Jacobian
_, H = cv2.projectPoints(Xw, R, t, K, dist_coeffs)
Note: I am using Python 3.8.5, OpenCV 4.4.0, and numpy 1.19.1.
Edited to add runnable code.
Edited to remove dead code in snippet.