I have a Unity WebGL player embedded in a React application. The React application has tiles that can be dragged and dropped onto the WebGL player. When a tile drag begins, Unity starts raycasting so you can tell which object on the screen you are about to drop onto. All of this works perfectly with a mouse, but I've noticed Input.touchCount always returns 0 unless the touch originates inside the WebGL player. Does anyone know a fix for this? I've been bashing my head against this one for a while now...
Here is the raycasting code. Like I said, it works perfectly with a mouse, but I cannot get a touch.position out of it.
public void LateUpdate()
{
    if (SHOULD_CAST_RAY)
    {
        // Always returning 0 here.
        Debug.Log(Input.touchCount);

        // Use the single active touch if there is one, otherwise fall back to the mouse.
        RaycastHit hit;
        Vector3 position = Input.touchSupported && Input.touchCount == 1
            ? new Vector3(Input.GetTouch(0).position.x, Input.GetTouch(0).position.y, 0)
            : Input.mousePosition;

        if (Physics.Raycast(RigsCamera.ScreenPointToRay(position), out hit, CameraRigControllerScript.CameraDistanceMax * 1.5f, 1 << 10))
        {
            if (CURRENT_SELECTION == null)
            {
                CURRENT_SELECTION = UnsafeGetModelInstantiationFromRaycast(hit);
                ApplySelectionIndication();
            }
            else if (!IsAlreadySelected(hit))
            {
                RemoveSelectionIndication();
                CURRENT_SELECTION = UnsafeGetModelInstantiationFromRaycast(hit);
                ApplySelectionIndication();
            }
            return;
        }

        // Nothing hit: clear any existing selection.
        if (CURRENT_SELECTION != null)
        {
            RemoveSelectionIndication();
            CURRENT_SELECTION = null;
        }
    }
}
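For reference, the workaround I'm currently experimenting with is to bypass Input on WebGL entirely: have the React side forward pointer coordinates into Unity with unityInstance.SendMessage and raycast from the forwarded position instead. A rough, untested sketch (InputBridge and the method names are just names I made up; it also assumes the canvas fills the page):

using System.Globalization;
using UnityEngine;

// Attach to a GameObject named "InputBridge" so the page can reach it via
// unityInstance.SendMessage("InputBridge", "OnExternalPointerMove", "x,y").
public class InputBridge : MonoBehaviour
{
    // Last pointer position forwarded from the browser, in Unity screen space.
    public static Vector3? ForwardedPosition { get; private set; }

    // Payload is "clientX,clientY" in browser coordinates.
    public void OnExternalPointerMove(string payload)
    {
        string[] parts = payload.Split(',');
        float x = float.Parse(parts[0], CultureInfo.InvariantCulture);
        float y = float.Parse(parts[1], CultureInfo.InvariantCulture);

        // Browser Y grows downward, Unity screen Y grows upward, so flip it.
        // (If the canvas doesn't fill the page, subtract the canvas offset and
        // account for any CSS scaling first.)
        ForwardedPosition = new Vector3(x, Screen.height - y, 0f);
    }

    // Called by the page when the drag ends, so we stop using stale coordinates.
    public void OnExternalPointerEnd(string _)
    {
        ForwardedPosition = null;
    }
}

With that in place, the position line above would become something like Vector3 position = InputBridge.ForwardedPosition ?? Input.mousePosition; and the raycast no longer cares whether Unity ever saw the touch.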
Also, if I touch the screen on the Unity WebGL player and then start dragging one of my React components (which sends a message to Unity to start raycasting), I get a touch.position that stays at the point I originally touched and never moves with my finger... The hell?
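My working theory (not confirmed): Unity's WebGL runtime listens for touch events on its canvas element, and browsers dispatch every event for a given touch to the element where that touch started. A touch starting outside the canvas therefore never reaches Unity at all, and in the second case Unity apparently just keeps reporting the last touch position it recorded rather than tracking the finger. If that's right, the document-level forwarding sketched above should sidestep both symptoms; on the React side it would be roughly document.addEventListener("pointermove", e => unityInstance.SendMessage("InputBridge", "OnExternalPointerMove", e.clientX + "," + e.clientY)); running for the duration of the drag.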