
I am developing an app for blind people that contains a map. When VoiceOver is turned off, I am able to detect swipe and double-tap gestures with UIGestureRecognizers. For instance:

UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeLeft:)];
swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
[self.view addGestureRecognizer:swipeLeft];

However, when VoiceOver is turned on, I am not able to override the VoiceOver gestures with my own. I am aware that the UIAccessibilityTraitAllowsDirectInteraction trait allows direct interaction:

[mySubView setAccessibilityTraits: UIAccessibilityTraitAllowsDirectInteraction];

The problem is that, since I have a map, allowing direct interaction means I start interacting with the map itself (panning, zooming, etc.). My goal, however, is to keep the map as it is but detect the gestures through the gesture recognizers and perform the actions linked to them.

Any ideas on how to do that?

Dan Morrow

2 Answers


VoiceOver's direct interaction model supports gesture recognizers. What you're observing is a conflict with the map's gesture handling. Given the complexity of the map view and its touch handling, I'd encourage taking one of two alternative approaches. In both cases, you'll likely want to overlay a transparent UIView atop the map view.

  1. Attach any gesture recognizers to this custom view so users can trigger shortcuts via direct interaction. You may want to condition this on VoiceOver running (see the first sketch after this list).
  2. Sidestep direct interaction entirely and implement your shortcuts as custom actions on the map or overlay view (see the second sketch below). This will likely benefit users of other accessibility features, not just VoiceOver.
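
A minimal Swift sketch of the first approach, assuming a view controller that owns the map; the overlay, class, and handler names are illustrative:

import UIKit
import MapKit

class MapShortcutsViewController: UIViewController {
    let mapView = MKMapView()
    let gestureOverlay = UIView() // transparent view laid over the map

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        view.addSubview(mapView)

        gestureOverlay.frame = view.bounds
        gestureOverlay.backgroundColor = .clear
        view.addSubview(gestureOverlay)

        // The direct-interaction trait makes VoiceOver pass touches through
        // to the overlay's gesture recognizers instead of intercepting them.
        gestureOverlay.isAccessibilityElement = true
        gestureOverlay.accessibilityTraits = .allowsDirectInteraction

        let swipeLeft = UISwipeGestureRecognizer(target: self, action: #selector(didSwipeLeft(_:)))
        swipeLeft.direction = .left
        gestureOverlay.addGestureRecognizer(swipeLeft)

        // Enable the overlay only while VoiceOver is running, so sighted
        // users keep interacting with the map directly.
        gestureOverlay.isHidden = !UIAccessibility.isVoiceOverRunning
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.gestureOverlay.isHidden = !UIAccessibility.isVoiceOverRunning
        }
    }

    @objc func didSwipeLeft(_ recognizer: UISwipeGestureRecognizer) {
        // Run the action linked to the left-swipe shortcut here.
    }
}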
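And a sketch of the second approach, assuming iOS 13+ for the closure-based UIAccessibilityCustomAction initializer; the action names and handler bodies are illustrative:

import UIKit

func addShortcutActions(to view: UIView) {
    view.isAccessibilityElement = true
    view.accessibilityLabel = "Map"
    // VoiceOver users pick an action by swiping up or down while the element
    // is focused; Switch Control exposes these actions too.
    view.accessibilityCustomActions = [
        UIAccessibilityCustomAction(name: "Pan left") { _ in
            // Shift the visible map region left; return true on success.
            return true
        },
        UIAccessibilityCustomAction(name: "Zoom in") { _ in
            return true
        }
    ]
}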
Justin

Yes, I resolved this issue by attaching a subview, setting ".isAccessibilityElement = true" and ".accessibilityTraits = .allowsDirectInteraction" on it, and routing all gestures and touch-method overrides to the added view.
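
A minimal sketch of that overlay, with an illustrative class name; the touch override shows where raw touches arrive once the trait is set:

import UIKit

class DirectInteractionOverlay: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear
        isAccessibilityElement = true
        accessibilityTraits = .allowsDirectInteraction
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isAccessibilityElement = true
        accessibilityTraits = .allowsDirectInteraction
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // With the direct-interaction trait, VoiceOver delivers touches here
        // unmodified, so raw touch handling works alongside the recognizers.
        super.touchesBegan(touches, with: event)
    }
}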

Lorenzo