I'm trying to make a VR game for the Oculus Quest with the XR Interaction Toolkit, and I would like to configure a haptic event that fires when the user hovers over a UI element. The problem is that the haptic event works when I hover over an XRGrabInteractable or a TeleportationArea with the XRRayInteractor, but not on a canvas. Even the plain hover event (OnHoverUI in the picture) isn't firing on the canvas. It's really strange, because I can still interact with the UI elements (buttons, sliders, ...).
Here are the parameters of my XRRayInteractor: XRRayInteractor parameters
I already found a workaround that doesn't go through those parameters, but it's really tedious. First, I get hold of the controllers (the left and right hands) in Start(). Then I add an "Event Trigger" component to every single UI element that should give haptic feedback, and each trigger has a "Pointer Enter" event calling the OnHoverUI function:
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class MenuHaptics : MonoBehaviour
{
    InputDevice leftHand;
    InputDevice rightHand;

    // Set elsewhere to indicate which hand is currently driving the menu.
    public bool leftMenuMode;
    public bool rightMenuMode;

    void Start()
    {
        // Look up the left and right controllers among the connected XR devices.
        var inputDevices = new List<InputDevice>();
        InputDevices.GetDevices(inputDevices);
        foreach (var device in inputDevices)
        {
            if (device.role == InputDeviceRole.LeftHanded)
            {
                leftHand = device;
            }
            else if (device.role == InputDeviceRole.RightHanded)
            {
                rightHand = device;
            }
        }
    }

    // Wired to each Event Trigger's "Pointer Enter" event.
    public void OnHoverUI()
    {
        // Short, strong pulse on the hand that has the menu open.
        if (leftMenuMode) leftHand.SendHapticImpulse(0, 1f, 0.01f);
        else if (rightMenuMode) rightHand.SendHapticImpulse(0, 1f, 0.01f);
    }
}
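To spare myself adding the Event Trigger by hand on every element, the same wiring can also be done from a script at startup. Here's a rough sketch (the HoverHapticsWiring class and the haptics and menuCanvas fields are just illustrative names of mine, not anything from the toolkit):

using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class HoverHapticsWiring : MonoBehaviour
{
    public MenuHaptics haptics;   // the script above
    public Canvas menuCanvas;     // root canvas holding the buttons, sliders, ...

    void Start()
    {
        // Give every selectable under the canvas a "Pointer Enter" trigger
        // that calls OnHoverUI, instead of wiring each one in the inspector.
        foreach (var selectable in menuCanvas.GetComponentsInChildren<Selectable>(true))
        {
            var trigger = selectable.gameObject.GetComponent<EventTrigger>();
            if (trigger == null)
                trigger = selectable.gameObject.AddComponent<EventTrigger>();

            var entry = new EventTrigger.Entry { eventID = EventTriggerType.PointerEnter };
            entry.callback.AddListener(_ => haptics.OnHoverUI());
            trigger.triggers.Add(entry);
        }
    }
}

That removes the manual setup, but it's still the same workaround underneath.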
I would really like to use the XRRayInteractor's own haptic parameters, but they just don't seem to work with the UI.
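The closest interactor-side alternative I can think of is polling the ray every frame for a UI hit. A minimal sketch of the idea — assuming TryGetCurrentUIRaycastResult on the XRRayInteractor and SendHapticImpulse on the controller, which may not exist in every version of the toolkit:

using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRRayInteractor))]
public class UIRayHoverHaptics : MonoBehaviour
{
    XRRayInteractor rayInteractor;
    XRBaseController controller;
    GameObject lastHovered;

    void Awake()
    {
        rayInteractor = GetComponent<XRRayInteractor>();
        controller = GetComponent<XRBaseController>();  // assumes the controller sits on the same GameObject
    }

    void Update()
    {
        // Ask the interactor what UI element, if any, its ray currently hits.
        if (rayInteractor.TryGetCurrentUIRaycastResult(out RaycastResult result))
        {
            // Pulse only once, when the hovered element changes.
            if (result.gameObject != lastHovered)
            {
                lastHovered = result.gameObject;
                controller.SendHapticImpulse(1f, 0.01f);
            }
        }
        else
        {
            lastHovered = null;
        }
    }
}

But that bypasses the interactor's haptic events entirely, which is exactly what I'd like to avoid.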
Does anyone have an idea why?