
I have the following method for rotating images (python):

> def rotateImage(image, angle):
>     row,col = image.shape[0:2]
>     center=tuple(np.array([row,col])/2)
>     rot_mat = cv2.getRotationMatrix2D(center,angle,1.0)
>     new_image = cv2.warpAffine(image, rot_mat, (col,row))
>     return new_image

This is the original picture: [image: Original]

This is the rotated (15 degree angle) picture that OpenCV returns: [image: Rotated by CV2]

This is the image if I rotate it around the center in Photoshop: [image: Rotated by PS]

These are the two rotated images superimposed: [image]

Obviously there is a difference. I'm pretty sure Photoshop did it correctly (or rather, I did it correctly in Photoshop). What am I missing?

Jake B.
The X and Y coordinates of your center are swapped. The first parameter of `cv2.getRotationMatrix2D` is the equivalent of `cv::Point`, where X is first and Y is second. – Dan Mašek Nov 05 '18 at 20:09

1 Answer


As pointed out in the comments, the X and Y coordinates of your center are swapped. The first parameter of `cv2.getRotationMatrix2D` is a point in (x, y) order, i.e. (column, row), so the center should be `(col/2, row/2)` rather than `(row/2, col/2)`. For a square image the two are identical, which is why the bug only shows up on non-square inputs.

rikyeah