
I am currently writing a game for Android. I have a normal GLSurfaceView where I render my graphics and use the public boolean onTouchEvent(MotionEvent me) method to register touch input, for example to turn the camera. In a separate class I programmatically create a FrameLayout that contains the HUD. This is added on top of the GLSurfaceView with this.addContentView(FL_HUD, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)). The HUD contains an ImageView that has a touch listener set with .setOnTouchListener, which delivers a separate MotionEvent that I can read out.
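For reference, the setup looks roughly like this (the renderer, drawable and variable names are only illustrative):

```java
// Inside the Activity's onCreate().
// Imports assumed: android.opengl.GLSurfaceView, android.widget.FrameLayout,
// android.widget.ImageView, android.view.View, android.view.ViewGroup,
// android.view.MotionEvent.

GLSurfaceView glView = new GLSurfaceView(this);
glView.setRenderer(myRenderer);          // myRenderer: the game's renderer (placeholder)
setContentView(glView);

// Build the HUD programmatically and lay it over the GLSurfaceView.
FrameLayout FL_HUD = new FrameLayout(this);
ImageView hudButton = new ImageView(this);
hudButton.setImageResource(R.drawable.hud_button);   // placeholder drawable
hudButton.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent me) {
        // Separate MotionEvent stream for the HUD element.
        return true;
    }
});
FL_HUD.addView(hudButton);

this.addContentView(FL_HUD, new FrameLayout.LayoutParams(
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.MATCH_PARENT));
```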

Both listeners work on their own: when I touch my ImageView, its OnTouchListener is called, and when I touch somewhere else on the screen, the main Activity's onTouchEvent is called. But when I try to touch the ImageView while already touching the rest of the screen (or the other way around), the second touch is not delivered; in other words, only one touch listener is active at a time.

How can I enable multitouch across two overlapping Views?

1 Answer


When you're touching the screen, all touches go to a single view. That means a second finger down will not go to the other touch handler; it will go to the same touch handler as the first touch. Your best option is to write a single touch handler for both views and do the hit testing yourself.
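A minimal sketch of what such a combined handler could look like, assuming the HUD FrameLayout (FL_HUD) covers the whole screen and that hudButton, handleHudTouch and handleCameraTouch are placeholder names:

```java
// One listener on the full-screen HUD layer; the ImageView itself gets no listener.
// Imports assumed: android.graphics.Rect, android.view.MotionEvent, android.view.View.
FL_HUD.setOnTouchListener(new View.OnTouchListener() {
    private final Rect hitRect = new Rect();

    @Override
    public boolean onTouch(View v, MotionEvent me) {
        // Bounds of the HUD ImageView relative to its parent (FL_HUD).
        hudButton.getHitRect(hitRect);

        // Route every pointer individually, so one finger can steer the camera
        // while another presses the HUD element.
        for (int i = 0; i < me.getPointerCount(); i++) {
            int x = (int) me.getX(i);
            int y = (int) me.getY(i);
            if (hitRect.contains(x, y)) {
                handleHudTouch(me, i);      // pointer is over the ImageView
            } else {
                handleCameraTouch(me, i);   // pointer is over the game view
            }
        }
        return true;   // consume, so this listener keeps receiving the gesture
    }
});
```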

Gabe Sechan
  • Okay, cheers! Does anyone have information on the values `MotionEvent.getX` / `MotionEvent.getY` / `MotionEvent.getRawX` / `MotionEvent.getRawY` return? They appear pretty random to me. – ProgrammingMachine5000 Feb 23 '15 at 10:38
  • 1
getX and getY are pixel coordinates relative to the upper left corner of the view. The raw values are the same but relative to the upper left of the screen. – Gabe Sechan Feb 23 '15 at 17:38
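
To make that relationship concrete, a small sketch (variable names are illustrative; v is the touched View, me the MotionEvent inside an OnTouchListener):

```java
// getX()/getY(): relative to the view that received the event.
// getRawX()/getRawY(): relative to the screen.
int[] loc = new int[2];
v.getLocationOnScreen(loc);          // view's top-left corner in screen coordinates

float viewX = me.getX();             // x within the view
float screenX = me.getRawX();        // x on the screen (primary pointer)
// For the primary pointer, screenX is approximately viewX + loc[0]
// (ignoring any window offsets).
```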