I’m using the Oculus Rift for a project, so to test some simple things I just made a Canvas in Screen Space - Overlay mode and added a button on top, leaving all the defaults (except for changing the highlight color to something more noticeable than plain white). I can hover over it just fine.
However, when I change the canvas to World Space (which I need for the VR controls I’m adding), the button’s clickable region suddenly has an offset when I point the camera at it. Here’s a visual example, first with the overlay canvas (apologies for the poor pictures; I wasn’t able to get the cursor into my screenshots, so for the sake of not wasting any more time…):
And then here’s with a world space canvas:
I had a third picture showing that pressing the very bottom of the button WILL cause it to highlight, but apparently only a maximum of two pictures are allowed. The point is, the clickable region of the button has shifted, and I suspect moving my head around with the VR headset will also change it. I have no idea where this region even is, though, because there’s no component with a visible box outline in editor mode that I can point to and say, “aha! this is where the clickable region is!” Whatever outlines I do see (the text box, the panels, etc.) don’t match up with the area that is actually clickable.
Am I making a silly mistake somewhere? Or is it a known problem with VR?
When you switch a Canvas to the World Space render mode, an Event Camera reference field appears, and it is null by default. Unity doesn’t complain about it, nor does it automatically fall back to the main camera, even in a scene containing nothing but one camera, one canvas, an EventSystem, and a button.
So drag your GUI / main camera into that field. However, when I tried this in two different projects, World Space mode still didn’t work (the clickable area was offset somewhere to hell). I had to delete my GUI and recreate it from scratch with World Space enabled from the get-go, checking at every step that it still worked.
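If you’d rather not rely on remembering that Inspector step, here’s a minimal sketch that assigns the main camera as the Event Camera at startup. The component name is my own; the only real API it leans on is Canvas.worldCamera (which is what the “Event Camera” field maps to) and Camera.main:

```csharp
using UnityEngine;

// Hypothetical helper: on startup, point a World Space canvas's Event Camera
// at the scene's main camera so UI raycasts line up with what you actually see.
[RequireComponent(typeof(Canvas))]
public class AssignEventCamera : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();

        if (canvas.renderMode == RenderMode.WorldSpace && canvas.worldCamera == null)
        {
            // Canvas.worldCamera is the field shown as "Event Camera" in the Inspector.
            canvas.worldCamera = Camera.main;
        }
    }
}
```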
And if you’re trying to have a uGUI that works for both the mouse and the VR controllers at the same time, you’re going to have a bad time: UI events can only be sent through an EventSystem, and its raycasts are cast from a 2D screen position relative to the Event Camera. So you’ll have to mount a fake Camera on the VR controller and, when the trigger is pressed, swap the Canvas’s Event Camera reference over to it (and for me, at least, that still doesn’t work).
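For what that workaround looks like in code, here’s a rough sketch. The component name and the OnTriggerPressed/OnTriggerReleased hooks are my own illustration; you’d call them from whatever VR input system you’re using, with a small (non-rendering) Camera parented to the controller:

```csharp
using UnityEngine;

// Sketch of the "fake camera on the controller" workaround described above.
// Assumes a narrow-FOV Camera is parented to the controller transform and that
// your VR input code calls OnTriggerPressed()/OnTriggerReleased().
public class ControllerPointerCamera : MonoBehaviour
{
    public Canvas targetCanvas;      // the World Space canvas you want to interact with
    public Camera controllerCamera;  // camera mounted on the VR controller
    public Camera headCamera;        // the normal head/main camera

    public void OnTriggerPressed()
    {
        // Route UI raycasts through the controller-mounted camera while the trigger is held.
        targetCanvas.worldCamera = controllerCamera;
    }

    public void OnTriggerReleased()
    {
        // Restore the head camera so gaze/mouse interaction works again.
        targetCanvas.worldCamera = headCamera;
    }
}
```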
Glad to know I’m not the only one with this problem. It’s an old post, but if I work it out I’ll post an update.
EDIT - Try finding the EventSystem object (it should be created automatically), then disable the “Standalone Input Module” component, which seems to conflict with my demo’s laser pointer controllers.
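If you’d rather do that from code than in the Inspector, a minimal sketch (the component name is my own) would be something like:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Runtime equivalent of the Inspector step above: disable the Standalone Input
// Module on the auto-created EventSystem so it can't fight the laser pointer input.
public class DisableStandaloneInput : MonoBehaviour
{
    void Start()
    {
        var module = EventSystem.current != null
            ? EventSystem.current.GetComponent<StandaloneInputModule>()
            : null;

        if (module != null)
        {
            module.enabled = false;
        }
    }
}
```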