PIX* returnRotatedImage(PIX* image, float theta)
{
    PIX* rotated = pixRotate(image, -theta, L_ROTATE_AREA_MAP, L_BRING_IN_BLACK, image->w, image->h);

    return rotated;
}

When I execute the above code on an image, the resulting image has the edges cut off.

Example: the original scan, followed by the image after being run through the above function to rotate it by ~89 degrees.

I don't have 10 reputation yet, so I can't embed the images, but here's a link to the two pictures: https://i.stack.imgur.com/QHoLS.jpg

I need it to work for arbitrary angles as well (not just angles close to 90 degrees), so unfortunately the solution presented here won't work.

The description for the pixRotate function says:

 *      (6) The dest can be expanded so that no image pixels
 *          are lost.  To invoke expansion, input the original
 *          width and height.  For repeated rotation, use of the
 *          original width and height allows the expansion to
 *          stop at the maximum required size, which is a square 
 *          with side = sqrt(w*w + h*h).

However, it seems to expand the destination only after the rotation, so the edge pixels are still lost even though the final image size is correct. If I call pixRotate(..., 0, 0) instead of pixRotate(..., w, h), I end up with the image rotated within the original dimensions: https://i.stack.imgur.com/hUZtq.jpg.

Am I interpreting the pixRotate function description incorrectly? Is what I want to do even possible? Or is this possibly a bug?

Thanks in advance.
