I am developing an app for blind people that contains a map. When VoiceOver is turned off, I am able to detect swipe and double-tap gestures with UIGestureRecognizer subclasses. For instance:
UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeLeft:)];
swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
[self.view addGestureRecognizer:swipeLeft];
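The double tap is set up the same way; a minimal sketch (handleDoubleTap: is just an illustrative selector name):
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2; // fire only on double taps
[self.view addGestureRecognizer:doubleTap];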
However, when VoiceOver is turned on, I am not able to override the VoiceOver gestures with my own. I am aware that UIAccessibilityTraits can be used to allow direct interaction:
[mySubView setAccessibilityTraits:UIAccessibilityTraitAllowsDirectInteraction];
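Put together, the attempt looks roughly like this (a sketch; mySubView stands for the map view and swipeLeft is the recognizer created above):
// The trait makes VoiceOver pass touches on this view straight through to the app
mySubView.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction;
// The gesture recognizers then have to sit on the view that receives those direct touches
[mySubView addGestureRecognizer:swipeLeft];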
The problem is that since I have a map, once I allow direct interaction I start interacting with the map itself (panning, zooming, etc.). My goal is to keep the map as it is, but detect the gestures through my own gesture recognizers and perform the actions linked to them.
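For example, the kind of action I want a recognizer to trigger is along these lines (the announcement text is just an illustration):
- (void)swipeLeft:(UISwipeGestureRecognizer *)recognizer {
    // Example action: announce something to the VoiceOver user instead of moving the map
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, @"Next point of interest");
}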
Any ideas on how to do that?