
In my app I have a view that I want to resize with a two-finger touch, similar to, but not quite, what the pinch gesture recognizer detects.

The idea is similar to what you would do on the desktop by grabbing one of the four corners with the mouse, except that I want a more "touch friendly" interface, where the amounts by which each corner shrinks or grows are independent in the horizontal and vertical directions. That is where I depart from pinch: the pinch's scale factor is the same for both X and Y, which is not what I want.
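To make the contrast with pinch concrete, the independent-axis idea can be sketched in plain C (the function and parameter names are illustrative, not part of my code): unlike the pinch recognizer's single scale, X and Y each get their own factor.

```c
#include <math.h>

/* Independent per-axis scale factors from two touch points: unlike
   UIPinchGestureRecognizer's single scale, X and Y are computed
   separately. Degenerate spans (fingers aligned on an axis) fall
   back to a scale of 1 to avoid division by zero. */
static void axisScales(float x0, float y0, float x1, float y1,       /* previous positions */
                       float nx0, float ny0, float nx1, float ny1,   /* current positions  */
                       float *sx, float *sy) {
    float w = fabsf(x1 - x0), h = fabsf(y1 - y0);
    *sx = (w > 0.0f) ? fabsf(nx1 - nx0) / w : 1.0f;
    *sy = (h > 0.0f) ? fabsf(ny1 - ny0) / h : 1.0f;
}
```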

What I want is to detect two such fingers and resize/move the view as appropriate.

And I have succeeded.

The idea I used (in addition to dealing with the half-persistence of UITouch objects ...) was to treat the most recently moved finger as the "target" and the previously moved one as the "anchor".

I then compute a vector from anchor to target. It always points toward one of the four corners (even when it lies on an axis), which lets me expand or shrink the view's width/height while optionally moving its origin. The effect is that you can resize from the top and left (origin change required), from the width/height alone (origin left alone), or any combination of the two.

To determine how much to shrink/grow/offset, I use the difference between the current target point and the previous target point. In other words, the vector determines which corner I am pointing to, and thus which "quadrant" the touch operates in, which lets me choose which of x, y, width or height to alter; the target's current/previous positions tell me by how much.

There are two problems, both of which I can live with, but I am wondering if anyone has gone the extra mile.

The user experience is great except for the slightly unnatural feeling that results from, say, resizing the top-right corner with a gesture where both fingers sit in the bottom-left corner. This does exactly what the finger motion dictates, but feels a bit like "spooky action at a distance". Maybe I just need to get used to it? I am failing to think of how to amend the gesture to achieve something more natural.

The math. Kind of ugly. I wanted to use an affine transform but failed to see how to apply it to my problem, so I resorted to the old arcsine/arccosine trick, then "switched" on the vector direction to determine the "quadrant" (of a hypothetical unit circle related only to the relative positions of anchor and target, irrespective of where they are in the view -- hence problem #1).

So, to summarize the questions:

  • Is there a better, more user-friendly approach that would make the drag/resize effect more consistent with where the fingers are within the view?
  • Would an affine transform make the code cleaner? How?

The code.

A: wrapping UITouches

@interface UITouchWrapper : NSObject
// offset of the touch from the view's center, in superview coordinates
@property (assign, nonatomic) CGPoint   centerOffset ;
// 'assign' on purpose: UITouch objects must never be retained
@property (assign, nonatomic) UITouch * touch ;
@end
@implementation UITouchWrapper
@synthesize centerOffset ;
@synthesize touch ;
- (void) dealloc {
    ::NSLog(@"letting go of %@", self.touch) ;
}
@end

B. UITouch handling

@property (strong, nonatomic) NSMutableArray *              touchesWrapper ;

@synthesize touchesWrapper ;

- (UITouchWrapper *) wrapperForTouch: (UITouch *) touch {
    for (UITouchWrapper * w in self.touchesWrapper) {
        if (w.touch == touch) {
            return w ;
        }
    }

    UITouchWrapper * w = [[UITouchWrapper alloc] init] ;
    w.touch = touch ;
    [self.touchesWrapper addObject:w] ;
    return w ;
}

- (void) releaseWrapper: (UITouchWrapper *) wrapper {
    [self.touchesWrapper removeObject:wrapper] ;
}

- (NSUInteger) wrapperCount {
    return [self.touchesWrapper count] ;
}

C: touch began

- (void) touchesBegan:(NSSet *) touches withEvent:(UIEvent *)event {
    // prime (possibly) our touch references. Touch events are unrelated ...
    for (UITouch * touch in [touches allObjects]) {
        // created on the fly if required
        UITouchWrapper * w = [self wrapperForTouch:touch] ;
        CGPoint p = [touch locationInView:[self superview]] ;
        p.x -= self.center.x ;
        p.y -= self.center.y ;
        w.centerOffset = p ;
    }
}

D: finding 'the other' point (anchor)

- (UITouch *) anchorTouchFor: (UITouch *) touch {

    NSTimeInterval mostRecent = 0.0f ;
    UITouch * anchor = nil ;

    for (UITouchWrapper * w in touchesWrapper) {
        if (w.touch == touch) {
            continue ;
        }

        if (mostRecent < w.touch.timestamp) {
            mostRecent = w.touch.timestamp ;
            anchor = w.touch ;
        }
    }

    return anchor ;
}

E: detecting a drag (= single touch move)

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *) event {

    CGRect frame = self.frame ;

    for (UITouch * touch in [touches allObjects]) {
        UITouchWrapper * w = [self wrapperForTouch:touch] ;
        if ([self wrapperCount] == 1) {
            // that's a drag. w.touch and touch MUST agree
            CGPoint movePoint = [touch locationInView:[self superview]] ;

            CGPoint center = self.center ;
            center.x = movePoint.x - w.centerOffset.x ;
            center.y = movePoint.y - w.centerOffset.y ;
            self.center = center ;

            CGPoint p = movePoint ;
            p.x -= self.center.x ;
            p.y -= self.center.y ;
            w.centerOffset = p ;

            [self setNeedsDisplay] ;
// ...
        }
    }
}

F: computing the angle [0 .. 2 pi] of the vector anchor:touch

- (float) angleBetween: (UITouch *) anchor andTouch: (UITouch *) touch {
    // the coordinate system is flipped along the Y-axis...

    CGPoint a = [anchor locationInView:[self superview]] ;
    CGPoint t = [touch locationInView:[self superview]] ;

    // swap a and t to compensate for the flipped coordinate system;
    CGPoint d = CGPointMake(t.x - a.x, a.y - t.y) ;
    float distance = sqrtf(d.x * d.x + d.y * d.y) ;
    if (distance == 0.0f) return 0.0f ;  // coincident points: no direction
    float cosa = d.x / distance ;
    float sina = d.y / distance ;

    float rc = ::acosf(cosa) ;
    float rs = ::asinf(sina) ;

    return rs >= 0.0f ? rc : (2.0f * M_PI) - rc ;
}

G: handling the resize:

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *) event {

    CGRect frame = self.frame ;
    // ...

    // That's a resize. We need to determine the direction of the
    // move. It is given by the vector made of this touch and the other 
    // touch. But if we have more than 2 touches, we use the one whose
    // time stamp is closest to this touch.

    UITouch * anchor = [self anchorTouchFor:touch] ;

    // don't do anything if we cannot find an anchor
    if (anchor == nil)  return ;

    CGPoint oldLoc = [touch previousLocationInView:[self superview]] ;
    CGPoint newLoc = [touch locationInView:[self superview]] ;

    CGPoint p = newLoc ;
    p.x -= self.center.x ;
    p.y -= self.center.y ;
    w.centerOffset = p ;

    CGFloat dx = newLoc.x - oldLoc.x ;
    CGFloat dy = newLoc.y - oldLoc.y ;

    float angle = [self angleBetween:anchor andTouch:touch] ;

    if (angle >= M_PI + M_PI_2) {   // 270 .. 360 bottom right
        frame.size.width += dx ;                
        frame.size.height += dy ;
    } else if (angle >= M_PI) {     // 180 .. 270 bottom left
        frame.size.width -= dx ;                
        frame.size.height += dy ;
        frame.origin.x += dx ;
    } else if (angle >= M_PI_2) {   //  90 .. 180 top left
        frame.size.width -= dx ;
        frame.origin.x += dx ;
        frame.size.height -= dy ;
        frame.origin.y += dy ;
    } else {                        //   0 ..  90 top right
        frame.size.width += dx ;
        frame.size.height -= dy ;
        frame.origin.y += dy ;
    }
    // ...
    self.frame = frame ;
    [self setNeedsLayout] ;
    [self setNeedsDisplay] ;
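Regarding question 2: I have not found the affine-transform formulation, but the trig can be avoided entirely, since the quadrant only depends on the signs of the anchor-to-target components. A sketch in plain C with a simplified frame struct (names are illustrative), equivalent to the four-way angle switch above:

```c
#include <stdbool.h>

typedef struct { float x, y, width, height; } Frame;

/* Sign-test equivalent of the angle-based quadrant switch: if the
   anchor->target vector points right, the right edge moves (width only);
   otherwise the left edge moves (width and origin.x). Likewise for the
   vertical axis. UIKit's Y axis grows downward, so "up" means ty <= ay. */
static Frame resizeFrame(Frame f, float ax, float ay, float tx, float ty,
                         float dx, float dy) {
    bool pointsRight = tx >= ax;
    bool pointsUp    = ty <= ay;

    if (pointsRight) {        /* grow/shrink via the right edge */
        f.width += dx;
    } else {                  /* move the left edge: width and origin */
        f.width -= dx;
        f.x     += dx;
    }
    if (pointsUp) {           /* move the top edge: height and origin */
        f.height -= dy;
        f.y      += dy;
    } else {                  /* grow/shrink via the bottom edge */
        f.height += dy;
    }
    return f;
}
```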

H: cleanup on touchesEnded/touchesCancelled

for (UITouch * touch in [touches allObjects]) {
    // wrapperForTouch: creates a wrapper on a miss, so this always finds one
    UITouchWrapper * w = [self wrapperForTouch:touch] ;
    [self releaseWrapper:w] ;
}
verec

0 Answers