
I'm working on a Unity project and trying to test UI interaction on the Quest 2 with hand tracking and ray casting. I've set up a very basic scene with all the default assets (OVRCameraRig, UIHelper) and added just a button to test the UI interaction. This is what my scene looks like:

[screenshot of the scene setup]

The issue is that when I run the scene, the ray is rotated 90 degrees and attached to the wrist for some reason. I made a video to show what I mean:

https://www.youtube.com/watch?v=5f12yfpugB8

It still interacts with the UI, though. After watching some online tutorials, I commented out these lines in HandInputSelector.cs, which is attached to the UIHelper:

void SetActiveController(OVRInput.Controller c)
{
    // Original body (commented out): it pointed the UI ray at the
    // left or right controller hand anchor.
    /*
    Transform t;
    if (c == OVRInput.Controller.LTouch)
    {
        t = m_CameraRig.leftHandAnchor;
    }
    else
    {
        t = m_CameraRig.rightHandAnchor;
    }
    m_InputModule.rayTransform = t;
    */
}

and instead added a second script to the UIHelper with only these lines:

public OVRHand hand;
public OVRInputModule inputModule;
 
private void Start()
{
    inputModule.rayTransform = hand.PointerPose;
}
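
For context, the whole script is set up roughly like the sketch below (the class name HandPointerSetter is just what I called it, and I believe OVRInputModule is declared in the UnityEngine.EventSystems namespace in the Oculus Integration, so the usings reflect that):

using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule lives here in the Oculus Integration

public class HandPointerSetter : MonoBehaviour
{
    public OVRHand hand;               // OVRHand component on the hand anchor (e.g. OVRHandPrefab)
    public OVRInputModule inputModule; // the OVRInputModule used by the UIHelper's EventSystem

    private void Start()
    {
        // Use the hand's system pointer pose as the origin/orientation of the UI ray.
        inputModule.rayTransform = hand.PointerPose;
    }
}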

Now the ray is at least attached to the correct position, but it still doesn't rotate properly with the hand movement. I made another video of it here:

https://youtu.be/q3d0eG2LwY0

My Unity version is 2021.3.1f1.

Can someone please tell me what I'm doing wrong?
