I'm writing my own map annotation layer which allows annotations to stick to a side of the screen (indicating something off screen). It works well: I can add/remove annotations, rotate the map and the annotations follow, pinch to zoom and the annotations stick to the side of the screen. They also cluster and uncluster as you zoom in and out.
The problem is that when I perform a pinch gesture (to zoom in, zoom out, or rotate), it only works if neither finger is on a MYView (an annotation). My view hierarchy:
- MYViewController
- MKMapView
- MyAnnotationsView
- NSArray of MYView (one for each annotation)
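For context, the hierarchy above might be wired up like this (a sketch; the `mapView` and `annotationsView` property names are my assumption, not from the project):

```objc
// MyAnnotationsView is laid over the MKMapView with the same bounds,
// and hosts one MYView subview per annotation.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.annotationsView = [[MyAnnotationsView alloc] initWithFrame:self.mapView.bounds];
    self.annotationsView.autoresizingMask = UIViewAutoresizingFlexibleWidth
                                          | UIViewAutoresizingFlexibleHeight;
    [self.mapView addSubview:self.annotationsView];
}
```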
In order for the gestures to work at all but still allow selecting annotations, I have overridden hitTest: in MyAnnotationsView.
-(UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Walk the subviews front-to-back; return the annotation under the touch.
    for (UIView *subview in self.subviews.reverseObjectEnumerator) {
        if (CGRectContainsPoint(subview.frame, point)) {
            return subview;
        }
    }
    // No annotation was hit: return nil so the touch falls through to the map.
    return nil;
}
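A slight variant of the override above (a sketch, not the project's code) converts the point into the subview's coordinate space and lets the subview's own hitTest: decide, so taps on nested content inside an annotation still resolve correctly; anything not over an annotation still falls through to the map:

```objc
-(UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *subview in self.subviews.reverseObjectEnumerator) {
        if (CGRectContainsPoint(subview.frame, point)) {
            // Delegate to the subview in its own coordinate space.
            return [subview hitTest:[self convertPoint:point toView:subview]
                          withEvent:event];
        }
    }
    return nil; // let the map view handle touches outside any annotation
}
```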
Then I catch the touch event in MYView like so:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Notify the delegate (MyAnnotationsView) that this annotation was touched.
    [self.delegate aggregateViewTouchesEnded:self];
}
This tells MyAnnotationsView that a touch occurred on an annotation; it can then inform the view controller.
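The delegate protocol implied by that call might be declared like this (names taken from the snippet above; the exact declaration in the project may differ):

```objc
@class MYView;

// Protocol MyAnnotationsView adopts to hear about touches on annotations.
@protocol MYViewDelegate <NSObject>
- (void)aggregateViewTouchesEnded:(MYView *)view;
@end

@interface MYView : UIView
@property (nonatomic, weak) id<MYViewDelegate> delegate;
@end
```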
TL;DR: the map's pinch/rotate gestures only work if neither finger lands on an annotation. How can I let both fingers start on annotations and still have the gesture reach the map?