I realize this is a highly specialized question, but here goes. I am using an implementation of SIFT to find matches between two images. With my current implementation, when I match an image with its 90- or 180-degree rotated version, the matches are consistently off by around half a pixel, though the exact offset varies within a range. For example, if a match is found at pixel coordinate (x, y) in im1, then the corresponding match in its 90-degree rotated version im2 is at (x, y + 0.5). If I use a 180-degree rotated image, the offset appears in both the x and y coordinates, and only in x if I use a 270-degree (-90) rotated image.
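To illustrate where I suspect the half pixel could come from (this is just my own reasoning to make the symptom concrete, not code from either implementation): if one stage of the pipeline uses a pixel-center convention (pixel i is the point i) and another uses a pixel-corner convention (pixel i spans [i, i+1] with its center at i + 0.5), a 180-degree rotation picks up exactly a half-pixel error:

```python
W = 640  # example image width; the actual size doesn't matter


def rot180_x_consistent(x, W):
    # Shift to the corner convention, rotate about W/2, shift back.
    return (W - (x + 0.5)) - 0.5      # simplifies to W - 1 - x


def rot180_x_mixed(x, W):
    # Forgets the center->corner shift before rotating: off by 0.5.
    return (W - x) - 0.5              # simplifies to (W - 1 - x) + 0.5


x = 100.0
print(rot180_x_consistent(x, W))  # 539.0 -> matches the exact index map W-1-x
print(rot180_x_mixed(x, W))       # 539.5 -> the half-pixel offset I observe
```

The consistent version agrees with the exact integer index map (x -> W-1-x) that a lossless 180-degree rotation performs; the mixed version reproduces the (x + 0.5)-style offset I am seeing.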
1) First of all, I am assuming SIFT should give me matches at corresponding locations in a rotated image. An implicit assumption here is that the rotation does not change the pixel values of the image, which I confirmed is true. (I use IrfanView to rotate and save as a .pgm, and the pixel values remain unchanged.)
2) I have other SIFT implementations that do not produce this offset.
3) I am assuming this offset is implementation-related and possibly has to do with the conversion from scale-space keypoint coordinates to image-space keypoint coordinates.
I'm hoping someone has run across this problem or can point me to a reference on how to convert from scale-space coordinates to image-space coordinates.
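For what it's worth, here is the kind of conversion I mean (a sketch of two plausible conventions, under my own assumptions, not the actual code from any of the implementations I'm using). Keypoints detected at octave o are often mapped back as x_img = x_oct * 2^o; but if the 2x downsampling averages pixel pairs, the center of octave-1 pixel j sits at base coordinate 2j + 0.5, and the consistent mapping differs by exactly half a pixel at octave 1:

```python
def octave_to_image_corner(x_oct, octave):
    """Corner-anchored: octave pixel j maps straight to j * 2^octave."""
    return x_oct * (2.0 ** octave)


def octave_to_image_center(x_oct, octave):
    """Center-anchored: assumes 2x downsampling averages pixel pairs,
    so octave-1 pixel j has its center at base coordinate 2*j + 0.5."""
    return (x_oct + 0.5) * (2.0 ** octave) - 0.5


print(octave_to_image_corner(10.0, 1))  # 20.0
print(octave_to_image_center(10.0, 1))  # 20.5 <- differs by exactly 0.5
```

If two implementations disagree on this convention, keypoints from higher octaves would come back shifted by sub-pixel amounts, which could explain why the offset I see is around half a pixel but varies.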