I'm working on a VR project for the HTC Vive Cosmos headset and have a problem with UI interactions.
- I configured the XR Rig with both controllers (left and right), each of which has an XR Ray Interactor.
- The Canvas has a Tracked Device Graphic Raycaster script and a Graphic Raycaster.
- The EventSystem has an XR UI Input Module.
- There is an XR Interaction Manager object in the scene with the XR Interaction Manager script on it.
The project loads and I can see the rays representing both controllers, but when I try to interact with a UI button, nothing happens.
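To double-check that the setup above is actually wired up at runtime, here is a minimal sanity-check sketch (the script name and log messages are my own; the namespaces assume XR Interaction Toolkit 2.x):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Attach to any GameObject: logs whether the pieces needed for XR UI interaction exist in the scene.
public class XRUISetupCheck : MonoBehaviour
{
    void Start()
    {
        // Ray interactors on the controllers
        Debug.Log($"XRRayInteractor count: {FindObjectsOfType<XRRayInteractor>().Length}");

        // Canvas-side raycaster that understands tracked devices
        Debug.Log($"TrackedDeviceGraphicRaycaster count: {FindObjectsOfType<TrackedDeviceGraphicRaycaster>().Length}");

        // EventSystem must carry the XR UI Input Module
        var eventSystem = FindObjectOfType<EventSystem>();
        Debug.Log(eventSystem != null && eventSystem.GetComponent<XRUIInputModule>() != null
            ? "EventSystem has XRUIInputModule"
            : "EventSystem is missing XRUIInputModule");

        // Interaction manager that routes interactor/interactable events
        Debug.Log(FindObjectOfType<XRInteractionManager>() != null
            ? "XRInteractionManager found"
            : "No XRInteractionManager in the scene");
    }
}
```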
What I tried:
- Of course, I read Unity's documentation for the XR Interaction Toolkit, especially the UI setup section.
- I turned Raycast Target off for all UI elements that do not require interaction (a helper along these lines is sketched after this list).
- I checked that the buttons have "Interactable" enabled.
- I checked whether any transparent UI elements were overlapping my buttons.
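For the Raycast Target pass, a rough helper like the one below can list every Graphic that still receives raycasts and optionally disable the ones that are not part of a Selectable; the class name is made up and it only relies on standard UGUI APIs:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Lists every Graphic that still blocks raycasts; optionally turns Raycast Target off
// for graphics that are not part of a Selectable (Button, Toggle, etc.).
public class RaycastTargetAudit : MonoBehaviour
{
    [SerializeField] bool disableNonInteractive = false;

    void Start()
    {
        foreach (var graphic in FindObjectsOfType<Graphic>())
        {
            if (!graphic.raycastTarget)
                continue;

            bool partOfSelectable = graphic.GetComponentInParent<Selectable>() != null;
            Debug.Log($"Raycast target ON: {graphic.name} (belongs to a Selectable: {partOfSelectable})");

            if (disableNonInteractive && !partOfSelectable)
                graphic.raycastTarget = false;
        }
    }
}
```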
Nothing seems to be working, and I have a feeling the problem is not in the UI elements but in the toolkit configuration itself. Do you have any ideas about what I can check or change to make UI interactions work?
CodePudding user response:
I encountered a similar problem in one of my scenes.
It turned out I had created a regular Canvas instead of the XR UI Canvas, so my EventSystem ended up with an "InputSystemUIInputModule" component instead of the "XR UI Input Module" component.
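If you want to confirm the mismatch without rebuilding the Canvas, a quick check on the EventSystem along these lines should show which module is actually in use (the script name is only an example; it assumes the Input System package and XR Interaction Toolkit 2.x):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Logs which UI input module the EventSystem is actually using.
public class InputModuleCheck : MonoBehaviour
{
    void Start()
    {
        var eventSystem = FindObjectOfType<EventSystem>();
        if (eventSystem == null)
        {
            Debug.LogWarning("No EventSystem in the scene.");
            return;
        }

        if (eventSystem.GetComponent<XRUIInputModule>() != null)
            Debug.Log("EventSystem uses XRUIInputModule (what XR ray interaction needs).");
        else if (eventSystem.GetComponent<InputSystemUIInputModule>() != null)
            Debug.LogWarning("EventSystem uses InputSystemUIInputModule - replace it with XRUIInputModule.");
        else
            Debug.LogWarning("EventSystem has no recognized UI input module.");
    }
}
```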
I hope this will help you.
CodePudding user response:
The problem is solved, and it was really an oversight on my part. My controllers have an XR Ray Interactor, which has a Max Raycast Distance property (Inspector -> XR Ray Interactor -> Raycast Configuration -> Max Raycast Distance). My UI elements were simply too far away for the interactor to detect them. I increased the Max Raycast Distance value and everything now works perfectly.
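For anyone who prefers to set this from code rather than in the Inspector, the same property is exposed on XRRayInteractor; here is a minimal sketch (the 30 m value is arbitrary, and the namespace assumes XR Interaction Toolkit 2.x):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Extends the ray length of every XR Ray Interactor so distant world-space canvases can be hit.
public class ExtendRayDistance : MonoBehaviour
{
    [SerializeField] float maxDistance = 30f; // meters; pick whatever reaches your farthest canvas

    void Start()
    {
        foreach (var rayInteractor in FindObjectsOfType<XRRayInteractor>())
        {
            rayInteractor.maxRaycastDistance = maxDistance;
        }
    }
}
```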