I'm trying to calculate the coordinates on a zoomed-in image from the pageX and pageY coordinates a PanResponder gives me. I figured I would just have to add the X and Y offset to the coordinates I receive, but I seem to have misjudged what needs to happen.
The coordinates on the device screen, regardless of scale, seem to map to the same location on the zoomed element. So if I place something at the top left of the device screen, it lands at the top left of the zoomed-in element.
I get my offset by measuring the zoomed element with RCTUIManager, and it seems to return the correct X and Y offset values based on other calculations I have tested.
This is the most recent iteration of the code I have attempted to use:
let {measureData} = this.state;
let offsetX = measureData.x;
let offsetY = measureData.y;
let x = e.nativeEvent.pageX - offsetX;
let y = e.nativeEvent.pageY - offsetY;
I'm subtracting because the values I receive are negative. I will also let the user pan the element later, so I hoped the same formula would still work once panning made the offset positive.
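To make the conversion above concrete, here is a minimal stand-alone sketch of it. The `screenToElement` name and the optional `scale` parameter are my own additions for illustration; dividing by the zoom factor is an assumption about how a scaled element might need to be handled, not something confirmed by the question's measurements.

```javascript
// Hypothetical helper for the conversion described above: subtract the
// measured offset (negative when the zoomed element extends past the
// top-left of the screen) from the touch's page coordinates. If the
// element is scaled, the result may also need dividing by the zoom
// factor -- that part is an assumption.
function screenToElement(pageX, pageY, measureData, scale = 1) {
  const x = (pageX - measureData.x) / scale;
  const y = (pageY - measureData.y) / scale;
  return {x, y};
}

// A touch at (100, 200) on an element whose measured origin is (-50, -30):
const point = screenToElement(100, 200, {x: -50, y: -30});
// → { x: 150, y: 230 }
```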
measureData contains the values from the RCTUIManager and gets updated every time a zoom event occurs.
How I update measureData:
setMeasureData = () => {
  RCTUIManager.measure(ReactNative.findNodeHandle(this.refs['wrap']), (x, y, width, height, pageX, pageY) => {
    this.setState({
      measureData: {x, y, width, height, pageX, pageY}
    });
  });
}
This is called in a callback function after the zoom is changed.
The elements I'm trying to add are placed by another component that I feed the X and Y values into; this component uses Animated.ValueXY() with the calculated coordinates to put the image at the correct starting location so the user can drag it afterward.
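The placement step can be illustrated with a plain-object stand-in. In the real component, Animated.ValueXY() plays this role; `makeDraggable` below is a hypothetical sketch of the same bookkeeping: the computed point seeds the starting position, and pan deltas are added to it afterward.

```javascript
// Plain-JS stand-in for seeding a draggable with a computed start point.
// Animated.ValueXY() does this job in the actual component; this version
// only illustrates the intended bookkeeping.
function makeDraggable(startX, startY) {
  const pos = {x: startX, y: startY};
  return {
    // Current position as a fresh object.
    position: () => ({...pos}),
    // Apply a pan gesture's delta to the stored position.
    moveBy: (dx, dy) => { pos.x += dx; pos.y += dy; },
  };
}

const marker = makeDraggable(150, 230);
marker.moveBy(10, -5);
// marker.position() → { x: 160, y: 225 }
```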
With the current calculation, anything placed ends up down and to the right of the touch event. I'm not sure where my logic or calculations go wrong, and I'm hoping someone has insight into where my reasoning breaks down.