In part of my project I need to compute the orientation of a patch in an affine-transformed image. My problem is that I don't know how to express this computed orientation with respect to the original, un-warped image.
For example, a point is found at (100, 200) in the warped image. I can extract the orientation at this point using its 8x8 pixel neighborhood; suppose it is 30 degrees. The warped image in which the point was found is the result of applying a transformation of 60 degrees about each axis (pitch, yaw and roll) to the original image. (This orientation extraction is usually known as descriptor extraction in computer vision.)
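For context, the orientation is obtained roughly like this (a minimal sketch only, assuming the orientation is taken as the dominant gradient direction of the 8x8 neighborhood; the function name and patch handling are illustrative, not my exact descriptor code):

#include <opencv2/opencv.hpp>
#include <cmath>

// Illustrative only: estimate a patch orientation as the dominant gradient
// direction of the 8x8 neighborhood around point p in a grayscale image.
float patchOrientationDeg(const cv::Mat& gray, cv::Point p, int half = 4)
{
    cv::Mat patch = gray(cv::Rect(p.x - half, p.y - half, 2 * half, 2 * half));
    cv::Mat gx, gy;
    cv::Sobel(patch, gx, CV_32F, 1, 0);   // horizontal gradient
    cv::Sobel(patch, gy, CV_32F, 0, 1);   // vertical gradient
    float sx = static_cast<float>(cv::sum(gx)[0]);
    float sy = static_cast<float>(cv::sum(gy)[0]);
    return std::atan2(sy, sx) * 180.0f / static_cast<float>(CV_PI);
}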
Now, the transformation matrix is known, and the orientation of the point in the transformed image is known as well. The position of the point with respect to the reference image is also known (via the inverse transformation). What I want to know is: what does the orientation become when this point (100, 200) is mapped back to the reference frame (e.g. to (150, 250))? In other words, what is the orientation with respect to the reference image?
I know this is straightforward to solve if we only have a roll rotation (60 degrees): in that case the orientation with respect to the reference frame would simply be 30 + 60 = 90 degrees.
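Just as a sanity check for this pure-roll case (a minimal sketch, not part of my actual pipeline), composing the two in-plane rotations as 2x2 matrices gives the expected 90 degrees:

#include <opencv2/opencv.hpp>
#include <cmath>
#include <iostream>

// Build a standard 2x2 in-plane (roll) rotation matrix for an angle in degrees.
cv::Mat rot2d(double deg)
{
    double r = deg * CV_PI / 180.0;
    return (cv::Mat_<double>(2, 2) << std::cos(r), -std::sin(r),
                                      std::sin(r),  std::cos(r));
}

int main()
{
    cv::Mat R = rot2d(60) * rot2d(30);  // compose roll (60 deg) with patch orientation (30 deg)
    double deg = std::atan2(R.at<double>(1, 0), R.at<double>(0, 0)) * 180.0 / CV_PI;
    std::cout << deg << std::endl;      // prints 90
    return 0;
}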
I tried to implement this using OpenCV:
#include <opencv2/opencv.hpp>

const float to_RAD = CV_PI / 180.0f;    // degrees to radians (defined elsewhere in my code)

cv::Mat rvec1(3,1,CV_32F);              // rotation vector related to B (patch orientation, 30 deg)
rvec1.at<float>(0,0) = 0;
rvec1.at<float>(1,0) = 30*to_RAD;
rvec1.at<float>(2,0) = 0;

cv::Mat rvec2(3,1,CV_32F);              // rotation vector related to A (warp, 60 deg)
rvec2.at<float>(0,0) = 0;
rvec2.at<float>(1,0) = 60*to_RAD;
rvec2.at<float>(2,0) = 0;

cv::Mat R_A;
cv::Mat R_B;
cv::Rodrigues(rvec1, R_B);              // 3x1 rotation vector -> 3x3 rotation matrix
cv::Rodrigues(rvec2, R_A);

cv::Mat R_combined = R_B*R_A;           // compose the two rotations
cv::Mat rvec_result;
cv::Rodrigues(R_combined, rvec_result); // back to a rotation vector -- runtime error here
I want to create the rotation matrices A and B from the two rotation vectors, multiply them, and convert the product back into a rotation vector. But all I get is a runtime error on the last line (cv::Rodrigues(R_combined, rvec_result);).
Thank you in advance for your help.