I need to draw a selection rectangle constrained to specific aspect ratios (square, 4x3, 5x7, 8x10, etc.) so I can crop the image. I've got some pretty normal code that handles dragging the 4 corners of the rectangle freely inside touchesBegan/touchesMoved, but I'm having a hell of a time getting it to constrain to specific aspect ratios without the rectangle creeping or losing accuracy.
Here's some pseudo-code of what I currently have:
// this assumes that the four hitPoints are already arranged in the desired aspect ratio
CGPoint hitPoints[4]; // arranged botLeft=0, botRight=1, topRight=2, topLeft=3
CGPoint lastPoint;
CGRect cropRect;
int activeHitPoint;
#define kMinCropSize 20
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    lastPoint = [[touches anyObject] locationInView:self];
    activeHitPoint = [self getWhichPointTouched]; // which corner handle was grabbed
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint thisPoint = [[touches anyObject] locationInView:self];
    float xDiff = thisPoint.x - lastPoint.x;
    float yDiff = thisPoint.y - lastPoint.y;
    float constrain = [self getAspectRatio]; // e.g. 8x10 = 8/10 = 0.8

    // whichever axis moved more drives the other axis
    if (ABS(xDiff) > ABS(yDiff)) {
        yDiff = xDiff * constrain;
        if (activeHitPoint == 1 || activeHitPoint == 3)
            yDiff *= -1; // for corners 1 and 3 the derived delta runs the other way
    } else {
        xDiff = yDiff * constrain;
        if (activeHitPoint == 1 || activeHitPoint == 3)
            xDiff *= -1;
    }
    switch (activeHitPoint) {
        case 0: // bottom-left handle: move it, then sync top-left x and bottom-right y
            if ((hitPoints[0].x + xDiff > imageDisplayRect.origin.x) &&
                (hitPoints[0].x + xDiff + kMinCropSize < hitPoints[1].x) &&
                (hitPoints[0].y + yDiff > imageDisplayRect.origin.y) &&
                (hitPoints[0].y + yDiff + kMinCropSize < hitPoints[2].y)) {
                hitPoints[0].x += xDiff;
                hitPoints[0].y += yDiff;
                hitPoints[3].x = hitPoints[0].x;
                hitPoints[1].y = hitPoints[0].y;
            }
            break;
        case 1: // bottom-right handle
            if ((hitPoints[1].x + xDiff < imageDisplayRect.origin.x + imageDisplayRect.size.width) &&
                (hitPoints[1].x + xDiff - kMinCropSize > hitPoints[0].x) &&
                (hitPoints[1].y + yDiff > imageDisplayRect.origin.y) &&
                (hitPoints[1].y + yDiff + kMinCropSize < hitPoints[2].y)) {
                hitPoints[1].x += xDiff;
                hitPoints[1].y += yDiff;
                hitPoints[2].x = hitPoints[1].x;
                hitPoints[0].y = hitPoints[1].y;
            }
            break;
        case 2: // top-right handle
            if ((hitPoints[2].x + xDiff < imageDisplayRect.origin.x + imageDisplayRect.size.width) &&
                (hitPoints[2].x + xDiff - kMinCropSize > hitPoints[0].x) &&
                (hitPoints[2].y + yDiff < imageDisplayRect.origin.y + imageDisplayRect.size.height) &&
                (hitPoints[2].y + yDiff - kMinCropSize > hitPoints[1].y)) {
                hitPoints[2].x += xDiff;
                hitPoints[2].y += yDiff;
                hitPoints[1].x = hitPoints[2].x;
                hitPoints[3].y = hitPoints[2].y;
            }
            break;
        case 3: // top-left handle
            if ((hitPoints[3].x + xDiff > imageDisplayRect.origin.x) &&
                (hitPoints[3].x + xDiff + kMinCropSize < hitPoints[2].x) &&
                (hitPoints[3].y + yDiff < imageDisplayRect.origin.y + imageDisplayRect.size.height) &&
                (hitPoints[3].y + yDiff - kMinCropSize > hitPoints[0].y)) {
                hitPoints[3].x += xDiff;
                hitPoints[3].y += yDiff;
                hitPoints[0].x = hitPoints[3].x;
                hitPoints[2].y = hitPoints[3].y;
            }
            break;
    }
    // rebuild the crop rect from the updated corner points
    cropRect = CGRectMake(hitPoints[0].x, hitPoints[0].y,
                          hitPoints[1].x - hitPoints[0].x,
                          hitPoints[2].y - hitPoints[1].y);
    lastPoint = thisPoint;
    [self setNeedsDisplay];
}
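For reference, the two helpers aren't doing anything clever. Stripped down (variable names simplified, hit-testing margins omitted), they are roughly this:

// returns width/height for the currently selected preset, e.g. 8x10 -> 0.8
- (float)getAspectRatio
{
    return presetWidth / presetHeight; // both floats, set when the user picks a preset
}

// returns the index (0-3) of the corner handle nearest the initial touch
- (int)getWhichPointTouched
{
    int nearest = 0;
    float bestDist = MAXFLOAT;
    for (int i = 0; i < 4; i++) {
        float dx = hitPoints[i].x - lastPoint.x;
        float dy = hitPoints[i].y - lastPoint.y;
        float dist = dx * dx + dy * dy;
        if (dist < bestDist) {
            bestDist = dist;
            nearest = i;
        }
    }
    return nearest;
}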
This code almost works, but it's not very accurate, as if rounding errors are creeping in. It loses the proper aspect ratio over time, which is surprising (and is why I used floats), but that doesn't seem to matter.
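The only other approach I can think of is to stop accumulating deltas entirely and recompute the whole rect on every move from the fixed opposite corner and the current touch, something like the untested sketch below (the method name is just a placeholder, and the clamping to imageDisplayRect and kMinCropSize is omitted):

// sketch: anchor the corner opposite the dragged one, derive the height from the
// ratio every move instead of accumulating per-move deltas
- (void)resizeFromAnchor:(CGPoint)anchor toTouch:(CGPoint)touch
{
    float ratio  = [self getAspectRatio];            // width / height
    float width  = ABS(touch.x - anchor.x);
    float height = width / ratio;                    // force the exact ratio each move
    float x = (touch.x < anchor.x) ? anchor.x - width  : anchor.x;
    float y = (touch.y < anchor.y) ? anchor.y - height : anchor.y;
    cropRect = CGRectMake(x, y, width, height);
}

But before I rewrite it that way, I'd rather understand why the delta approach above drifts in the first place. Is my constraint math just wrong somewhere?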