
In the OpenCV implementation of SIFT, keypoints have an angle value in degrees (ranging from 180 to -180), which represents the calculated orientation of each keypoint. Since SIFT assigns the dominant orientation of a keypoint using 10-degree bins in a histogram, how can we get this range of angles? Shouldn't the values be in 10-degree steps?

Is that because of the histogram smoothing?

This is the code where keypoint.angle is assigned a value; can you help me understand how this value is computed?

    // Build the n-bin orientation histogram (n = 36 in OpenCV) around the keypoint;
    // omax is its maximum value.
    float omax = calcOrientationHist(gauss_pyr[o*(nOctaveLayers+3) + layer],
                                     Point(c1, r1),
                                     cvRound(SIFT_ORI_RADIUS * scl_octv),
                                     SIFT_ORI_SIG_FCTR * scl_octv,
                                     hist, n);
    // Every peak above this fraction of the global maximum yields a keypoint
    // with its own orientation.
    float mag_thr = (float)(omax * SIFT_ORI_PEAK_RATIO);
    for( int j = 0; j < n; j++ )
    {
        // Left and right neighbours of bin j, with circular wrap-around.
        int l = j > 0 ? j - 1 : n - 1;
        int r2 = j < n-1 ? j + 1 : 0;

        // Bin j is a local maximum and is above the magnitude threshold.
        if( hist[j] > hist[l]  &&  hist[j] > hist[r2]  &&  hist[j] >= mag_thr )
        {
            // Parabolic interpolation of the peak position from the three
            // neighbouring bins; the result is a fractional bin index.
            float bin = j + 0.5f * (hist[l]-hist[r2]) / (hist[l] - 2*hist[j] + hist[r2]);
            bin = bin < 0 ? n + bin : bin >= n ? bin - n : bin;
            // Convert the fractional bin (10 degrees per bin) to degrees.
            kpt.angle = 360.f - (float)((360.f/n) * bin);
            if(std::abs(kpt.angle - 360.f) < FLT_EPSILON)
                kpt.angle = 0.f;
            keypoints.push_back(kpt);
        }
    }
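
For example, tracing just the final conversion with a hypothetical interpolated bin value (the numbers below are made up for illustration, not taken from an actual run):

    // Hypothetical trace of the angle conversion above (n = 36 bins of 10 degrees each).
    // Suppose the interpolation step produced a fractional bin index:
    const int n = 36;
    float bin = 34.7f;                         // not an integer bin index
    float angle = 360.f - (360.f / n) * bin;   // 360 - 10 * 34.7 = 13 degrees
    // keypoint.angle ends up as 13 degrees, not a multiple of 10.

The fractional part of bin is what I don't understand.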
  • Whilst this question is interesting, it's not strictly a programming question and belongs on another site, possibly suitable for http://cs.stackexchange.com/ – EdChum Mar 08 '17 at 09:10
  • I think that you are right, thank you for the notification :) – Hussein Adnan Mohammed Mar 08 '17 at 09:12
  • It's probably worth reading the original paper that describes the descriptors. My understanding is that the descriptors are scale- and orientation-invariant, but you'd still need to know the angles so that you can determine whether the descriptors describe the same feature. – EdChum Mar 08 '17 at 09:14
  • I read the original paper; the orientation is calculated for the detected keypoints even before the descriptors are computed. The same is true for the OpenCV implementation. My question is about the interpretation of the angle values, which I couldn't find in either the OpenCV documentation or the original paper. – Hussein Adnan Mohammed Mar 08 '17 at 09:52
  • From my knowledge and intuition: the descriptor is computed for a "normalized" keypoint, so first the keypoint's orientation is determined, then the descriptor is computed for the keypoint neighborhood as it would look if the image were rotated so that the new keypoint's orientation would be 0 degrees. So if the descriptor holds information about gradients in 10-degree steps and you want to know the actual gradient orientation in the image, you'll have to "add" the orientation of the keypoint itself. – Micka Mar 08 '17 at 10:08
  • Ah ok, your question is just about the keypoint orientation? Here is a small explanation: http://opencv-python-tutroals.readthedocs.io/en/latest/py_tutorials/py_feature2d/py_sift_intro/py_sift_intro.html You are right that 10-degree steps seem to be evaluated. The angle itself is accessible in the keypoint class: a float value given in degrees (0 to 360, and -1 for not being set). http://docs.opencv.org/2.4/modules/features2d/doc/common_interfaces_of_feature_detectors.html#keypoint – Micka Mar 08 '17 at 10:29
  • Only the chosen angle is accessible; the intermediate result of 36 orientation bins isn't saved, but any bin that was > 80% of the highest bin should generate a second keypoint with the same position but a different angle. – Micka Mar 08 '17 at 10:30
  • Can you make your question more precise? Is it like "keypoints have angles like 13 degrees, how is this possible if only 10-degree steps are evaluated?" If that's the case (I don't know), I guess there might be some kind of interpolation between the orientation bins to approximate a better maximum, perhaps similar to "sub-pixel precision interpolation" (e.g. fit a spline to some neighborhood and search for local optima). – Micka Mar 08 '17 at 10:36
  • This is exactly what I mean, Micka: angles like 13. That's why I suspected there is some sort of histogram smoothing. – Hussein Adnan Mohammed Mar 08 '17 at 11:04

1 Answer


I think that I found the answer to my question.

A parabola is fit to the 3 histogram values closest to each peak to interpolate the peak position for better accuracy. That's why we can get a continuous range of angle values instead of 10-degree steps.
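
As a minimal sketch of that interpolation (my own illustration with made-up histogram values, not the exact OpenCV code): fit a parabola through the three points (j-1, hist[l]), (j, hist[j]), (j+1, hist[r2]) and take its vertex; the vertex is offset from bin j by 0.5*(hist[l]-hist[r2]) / (hist[l]-2*hist[j]+hist[r2]), which is exactly the expression in the snippet from the question:

    #include <cstdio>

    // Vertex of the parabola through (j-1, hl), (j, hj), (j+1, hr),
    // where hj is a local maximum of the orientation histogram.
    // Returns the interpolated (possibly fractional) peak position.
    float interpolatePeak(float j, float hl, float hj, float hr)
    {
        return j + 0.5f * (hl - hr) / (hl - 2.f * hj + hr);
    }

    int main()
    {
        // Made-up histogram values around a peak at bin 34 (36 bins of 10 degrees each).
        float bin   = interpolatePeak(34.f, 20.f, 30.f, 28.f);  // ~34.33
        float angle = 360.f - (360.f / 36.f) * bin;             // ~16.7 degrees
        std::printf("bin = %.2f, angle = %.2f\n", bin, angle);
        return 0;
    }

Note that the denominator is non-zero whenever hist[j] is a strict local maximum, so the division is safe.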

Here is a link showing how to fit a parabola to 3 points: Curve fitting
