I have implemented a ZoomViewGroup that can scroll infinitely in all directions and also zoom infinitely, while still delivering all touch events correctly offset to its child Views.
But when it comes to multi-touch, only the first pointer is offset correctly; all the others point to a wrong location, and that's because of the scaling factor I have to take care of. (When scaling_factor != 1.0f, pointer 0 has a different offset from its original location than pointer 1 or 2.)
I am saving my transformation in a Matrix, so it's easy to convert coordinates from screen to workspace and back using matrix.mapPoints(..) and the matrix inverse.
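In plain Java terms (outside Android, so android.graphics.Matrix is replaced by explicit scale/translate math; the scale and offset values here are made-up stand-ins), the screen ↔ workspace round trip looks roughly like this:

```java
// Sketch of the screen <-> workspace mapping for a uniform scale + translation.
// In the real ViewGroup this is transformation_matrix.mapPoints(..) and its
// inverse; "SCALE", "OFFSET_X" and "OFFSET_Y" are hypothetical stand-ins.
public class MappingSketch {
    static final float SCALE = 2.0f;      // zoom factor
    static final float OFFSET_X = 100f;   // pan, in screen pixels
    static final float OFFSET_Y = 50f;

    // workspace -> screen (what canvas.concat(transformation_matrix) applies)
    static float[] workspaceToScreen(float x, float y) {
        return new float[] { x * SCALE + OFFSET_X, y * SCALE + OFFSET_Y };
    }

    // screen -> workspace (the matrix inverse, used for touch events)
    static float[] screenToWorkspace(float x, float y) {
        return new float[] { (x - OFFSET_X) / SCALE, (y - OFFSET_Y) / SCALE };
    }

    public static void main(String[] args) {
        float[] screen = workspaceToScreen(10f, 20f);            // {120, 90}
        float[] back = screenToWorkspace(screen[0], screen[1]);  // {10, 20}
        System.out.println(back[0] + "," + back[1]);
    }
}
```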
When I draw the child views, I can just apply the transformation matrix like this:
@Override
protected void dispatchDraw(Canvas canvas)
{
    canvas.save();
    canvas.concat(transformation_matrix);
    super.dispatchDraw(canvas);
    canvas.restore();
}
Same with touch events:
float[] touch_array = new float[2];

@Override
public boolean dispatchTouchEvent(MotionEvent event)
{
    // getX()/getY() without an index return the coordinates of pointer 0 only
    touch_array[0] = event.getX();
    touch_array[1] = event.getY();
    transformation_matrix.mapPoints(touch_array);
    event.setLocation(touch_array[0], touch_array[1]);
    return super.dispatchTouchEvent(event);
}
But MotionEvent.setLocation(float x, float y) actually offsets all pointers by the same amount. If we have zoomed by 2.0f, the required offset is different for each pointer, so I would need something like MotionEvent.setLocation(int pointer_index, float x, float y) to set each one individually. Is there anything I can do to achieve this?
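For what it's worth, MotionEvent.transform(Matrix) (available since API 11) claims to apply a matrix to the whole event, i.e. to every pointer at once, and may be worth investigating. The underlying problem with a single uniform offset can be sketched in plain Java (the pointer coordinates and scale below are made up for illustration):

```java
// Demonstrates why one uniform offset (the setLocation behaviour) cannot
// remap more than one pointer when the scale factor is not 1.0f.
public class MultiTouchSketch {
    static final float SCALE = 2.0f;   // zoom factor != 1.0f

    // screen -> workspace for one coordinate (translation omitted for brevity)
    static float map(float x) { return x / SCALE; }

    public static void main(String[] args) {
        float p0 = 100f, p1 = 300f;    // x coordinates of two pointers
        float off0 = map(p0) - p0;     // offset pointer 0 needs: -50
        float off1 = map(p1) - p1;     // offset pointer 1 needs: -150
        // setLocation shifts ALL pointers by off0, so pointer 1 lands at
        // p1 + off0 = 250, even though its correct location is map(p1) = 150.
        System.out.println((p1 + off0) + " vs " + map(p1));
    }
}
```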