
I'm working on an app that uses AirPlay, and I need touches on the device's main screen to act as touches on the UI shown on my external screen, so that it stays compatible with a large number of pre-existing custom UI elements. Rebuilding those UI elements would be orders of magnitude more work than finding a way to translate the touches from one view to another.

The external screen will feature a sort of mouse pointer to represent the interaction, since the user will need a point of reference on the screen for their actions. This may create Human Interface Guidelines hurdles, but I'll cross that bridge when I get to it; hopefully I can find a way to make the pointer sufficiently non-mouse-like. The user will interact with the device screen as a sort of track-pad.
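Roughly what I have in mind for the pointer side (a sketch only; ExternalPointerController, externalWindow and pointerView are names I made up, nothing from UIKit beyond the classes used):

```objc
// Minimal sketch (made-up names): a window on the external screen with a
// plain view acting as the pointer/cursor.
#import <UIKit/UIKit.h>

@interface ExternalPointerController : NSObject
@property (nonatomic, strong) UIWindow *externalWindow;
@property (nonatomic, strong) UIView *pointerView;
@end

@implementation ExternalPointerController

- (void)attachToScreen:(UIScreen *)screen {
    // Size the window to the external screen and assign it to that screen.
    self.externalWindow = [[UIWindow alloc] initWithFrame:screen.bounds];
    self.externalWindow.screen = screen;
    self.externalWindow.hidden = NO;

    // Simple stand-in for the pointer; to be styled later so it looks non-mouse-like.
    self.pointerView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 24, 24)];
    self.pointerView.backgroundColor = [UIColor whiteColor];
    [self.externalWindow addSubview:self.pointerView];
}

// Track-pad style: move the pointer by the delta of a drag on the device,
// clamped to the external screen's bounds.
- (void)movePointerByDelta:(CGPoint)delta {
    CGPoint c = self.pointerView.center;
    c.x = MAX(0.0, MIN(self.externalWindow.bounds.size.width,  c.x + delta.x));
    c.y = MAX(0.0, MIN(self.externalWindow.bounds.size.height, c.y + delta.y));
    self.pointerView.center = c;
}

@end
```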

I'm using a UIWindow subclass on the device screen with sendEvent: overridden to catch the touch events, and I'm attempting to manually walk the view hierarchy on the external screen for the views that need to receive input. Finding the view I want to talk to is not difficult. For UIControl-based classes I can call sendActionsForControlEvents: to send the appropriate messages for the control. This may need some massaging, but for now that's not the main issue.
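Here is roughly how I'm intercepting and forwarding touches so far (again a sketch; externalWindow and pointerLocation are the made-up names from above, and I'm only treating finger-up as a tap for now):

```objc
// Sketch of the sendEvent: override; externalWindow and pointerLocation are
// placeholders for however the external-screen state ends up being exposed.
#import <UIKit/UIKit.h>

@interface TouchForwardingWindow : UIWindow
@property (nonatomic, strong) UIWindow *externalWindow;   // window on the AirPlay screen
@property (nonatomic, assign) CGPoint pointerLocation;    // current pointer position there
@end

@implementation TouchForwardingWindow

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];   // keep normal delivery to the device-side UI

    for (UITouch *touch in [event allTouches]) {
        if (touch.phase != UITouchPhaseEnded) {
            continue;   // only treat "finger up" as a tap in this sketch
        }

        // Find the view under the pointer on the external screen.
        UIView *target = [self.externalWindow hitTest:self.pointerLocation withEvent:nil];

        // UIControl subclasses can be driven without a real UIEvent.
        if ([target isKindOfClass:[UIControl class]]) {
            [(UIControl *)target sendActionsForControlEvents:UIControlEventTouchUpInside];
        }
        // Everything else needs touchesBegan:/touchesMoved:/touchesEnded:,
        // which is where the UIEvent problem below comes in.
    }
}

@end
```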

For the touch events themselves (touchesBegan:, touchesMoved:, etc.) I don't have a decent way of faking the UIEvent information. I can't call those methods without a UIEvent, and there doesn't seem to be any public way to create a UIEvent object. The UIEvent from sendEvent: doesn't carry a position that matches the pointer position on the secondary screen (at the very least), so simply passing it along won't give me what I want.
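To make the gap concrete, inside the sendEvent: override above this is the kind of call I'd like to make for non-UIControl views (using the made-up names from that sketch); reusing the original event is wrong because its touches still report device-screen coordinates:

```objc
// Illustration only. 'event' is the UIEvent handed to sendEvent: on the device
// window; its touches report device-screen positions, so locationInView:
// inside the target will not match the pointer position on the external screen.
UIView *target = [self.externalWindow hitTest:self.pointerLocation withEvent:nil];
[target touchesBegan:[event allTouches] withEvent:event];
[target touchesMoved:[event allTouches] withEvent:event];
```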

Is there any legitimate way of synthesizing this information?

Kyle Olson
  • Using Google iOS NativeDriver you should be able to create UITouch / UIEvent http://code.google.com/p/nativedriver/source/browse/trunk/iphone/ThirdParty/TouchSynthesis/TouchSynthesis.h?r=58 thanks – ebtokyo Apr 12 '12 at 01:10
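Edit: if the TouchSynthesis.h linked in the comment above exposes initializers along the lines of initInView: on UITouch and initWithTouch: on UIEvent (I haven't verified the exact signatures, so treat this as an assumption), I'd expect synthesizing a tap to look something like this:

```objc
// Assumes the UITouch/UIEvent "Synthesize" categories from the linked
// TouchSynthesis.h; these are not public UIKit API, and the method names
// below are my reading of that header, not verified.
#import "TouchSynthesis.h"

- (void)synthesizeTapOnView:(UIView *)target {
    UITouch *touch = [[UITouch alloc] initInView:target];
    UIEvent *event = [[UIEvent alloc] initWithTouch:touch];

    // Finger down.
    [touch.view touchesBegan:[event allTouches] withEvent:event];

    // Flip the touch to the ended phase (the header appears to provide a
    // setter for this), then send the matching finger-up.
    [touch setPhase:UITouchPhaseEnded];
    [touch.view touchesEnded:[event allTouches] withEvent:event];
}
```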

0 Answers