XR Ray Interactor getting "trapped" inside the UI canvas

I have followed the tutorial

Overall, the ray interaction seems to be working, but with one strange behaviour. For the record, I’m displaying a simple Canvas with one button on it.
When I use the XR Ray Interactor component on the right controller and point at the canvas, I can select buttons and move the ray inside the canvas, but then I cannot leave the bounds of the canvas anymore. The ray looks like it’s “trapped” inside the canvas. Sometimes the ray detaches from the canvas, but when I point it around, at some point it snaps back to the canvas.

The problem happens as soon as I activate the checkbox “Enable Interaction with UI GameObject” in the XR Ray Interactor.

If you check the screenshot, you will notice the hand is pointing outside the canvas but the ray snaps towards it.

What could be the problem?


[Screenshot: Screenshot 2020-05-24 at 02.38.35.png]

Hello, I have the same issue.

It’s probably a bug. I suggest opening a support ticket linking this page. I have done the same; maybe they will notice it.

Hello, I found a fix for this (GitHub - Unity-Technologies/XR-Interaction-Toolkit-Examples: This repository contains various examples to use with the XR Interaction Toolkit). That repo has a scene called Canvas UI that shows it working. You can use that one. I needed to downgrade to 0.9.3.

Thanks a lot, I will give it a try and let you know :slight_smile: Out of curiosity, was our configuration very different from the right one?

Ok, that project works even with 0.9.4, but when I try to reproduce the exact same configuration in my project it doesn’t work. The only difference I found was this “Target Display” option, which doesn’t appear in my Camera configuration.

Not sure why Unity is ignoring this post.

[Screenshots: Screenshot 2020-05-31 at 17.49.15.png, Screenshot 2020-05-31 at 17.44.43.png]

Ok, found the problem: it’s the Camera component.
You just need to copy the camera component settings from the XRRig in the UICanvas demo scene into your project and it will work. No need to downgrade to 0.9.3.
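If you can’t easily diff the demo scene against your own, the important relationship is usually that the World Space canvas ends up raycasting against the rig’s camera. A minimal sketch of that wiring, using only the standard Unity API (the class name is my own; the demo scene sets this up in the inspector rather than in code):

```csharp
using UnityEngine;

// Sketch: make a World Space canvas use the camera tagged "MainCamera" as its
// event camera, so UI raycasts are computed from the correct viewpoint.
public class AssignCanvasEventCamera : MonoBehaviour
{
    void Start()
    {
        var canvas = GetComponent<Canvas>();
        if (canvas != null
            && canvas.renderMode == RenderMode.WorldSpace
            && canvas.worldCamera == null)
        {
            // Camera.main is whichever enabled camera carries the "MainCamera" tag.
            canvas.worldCamera = Camera.main;
        }
    }
}
```

Copying the demo camera’s inspector values, as described above, achieves the same thing without any script.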

I hope Unity will publish a fix in the next release

Thanks a lot, without your help it would have been impossible to catch it!! :slight_smile:

If you create your camera properly, this problem never happens. That’s probably why it’s being ignored - are all of you following that tutorial? I’ve never seen this problem myself, on many projects.

There could be a bug in the tutorial. Most YouTube tutorials I’ve seen are unreliable: they don’t get updated when the API changes or when Unity releases patches, so they tend to break quickly, and I generally ignore them.

Hey there bois. Had the same problem; adding the “MainCamera” tag to the camera doing the world space thingy fixes the issue. Shoutout to my boys at Unity for whatever they did to make it happen.

Also, a43, your post is so insubstantial; nobody cares that you never have this problem. Please remove it.


You shouldn’t have a project without a MainCamera tag. Unity creates it for you every time you make a new Scene (and you can’t disable this without a lot of effort) which suggests that you - at some point - deleted it. That’s a bad idea, since Unity has pretty much always required it (at least since 2012).
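If you suspect the tag got lost at some point, a quick runtime sanity check (plain Unity API; the class name is my own) can confirm it:

```csharp
using UnityEngine;

// Sketch: warn early if no enabled camera carries the "MainCamera" tag,
// since Camera.main and some UI raycasting setups rely on it.
public class MainCameraSanityCheck : MonoBehaviour
{
    void Awake()
    {
        if (Camera.main == null)
        {
            Debug.LogWarning("No enabled camera is tagged \"MainCamera\". " +
                             "XR ray/UI interaction may snap or misbehave.");
        }
    }
}
```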


Yes, it fixed the weird “snapping” problem, but now I can’t click on the button anymore. It just hovers and selects, but the OnClick event isn’t getting fired. Changing the camera tag back to “Untagged” lets me fire the OnClick event, but in exchange the snapping problem with the ray comes back.

Is there a solution for this yet?

What’s the problem? So far this thread appears to have been:

  1. Follow a broken tutorial
  2. That creates an invalid Unity scene
  3. … which doesn’t work
  4. But for everyone that does NOT follow this tutorial and writes normal Unity scenes following the official Unity docs, it works 100% of the time.

There are some problems where, if you try to convert an existing scene to XR/VR, some of the auto-convert code from Unity is incorrect and will break bits of your scene (for instance: it thinks that all RenderTexture cameras are actually VR cameras, which is really stupid, and impossible). For those cases you can either read the docs, look at your Camera objects, and manually set the values to what they should be for your game, or you can delete your camera + rig and use the one-click “Add Rig” feature in XR Toolkit that will create a fully correct camera + VR setup for you.

I ran into this as well, and it was a main camera issue. WHY didn’t I have a main camera? Because when you’re working on multiple platforms it becomes meaningless. I have different cameras for each XR rig, different cameras for flat, and different cameras for Switch vs Xbox vs PS, all with different components/configurations.

My last project was a cockpit game, and that was even worse, with 6 cameras for each platform, all rendering different parts of the scene.

It wasn’t apparent that it had to be set up as the “main camera”. Really, it should be assignable in the component; that would have made it clear and be more scalable.
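For what it’s worth, the kind of explicit, assignable reference being asked for here is easy to sketch yourself (this is a hypothetical helper of my own, not part of XRI or Unity):

```csharp
using UnityEngine;

// Hypothetical helper: expose the UI camera as an inspector field instead of
// relying on the "MainCamera" tag, so each platform rig can assign its own.
public class ExplicitUICamera : MonoBehaviour
{
    [SerializeField] Camera uiCamera; // assign per platform in the inspector

    void Start()
    {
        var canvas = GetComponent<Canvas>();
        if (canvas != null && canvas.renderMode == RenderMode.WorldSpace)
            canvas.worldCamera = uiCamera;
    }
}
```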


I had a similar problem; what I found I was missing from my UI canvas was another (new) raycasting script that is in the UI demo mentioned by @XLRLimits : ‘Tracked Device Graphic Raycaster’. No need to downgrade; it works with the current XRI Toolkit preview package 2.0.0-pre.6.
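For reference, that component can also be added in code if you build canvases at runtime — a small sketch, assuming the XRI package is installed (the component and namespace are from XRI; the helper class name is mine):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Sketch: give this canvas XRI's TrackedDeviceGraphicRaycaster so that
// XR Ray Interactors can hit its UI elements.
[RequireComponent(typeof(Canvas))]
public class UseTrackedDeviceRaycaster : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
    }
}
```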


thanks for the spam

Please don’t.


I’ve thrown it out of the airlock so I doubt it will return.


I spent the day trying all these approaches - and others - but, at the end of the day, it was totally my mistake:
I was using a canvas with a button which had lots of scripts attached, but one of them was duplicated (the same script was attached twice, and two different actions were performed using the same script, everything on the same button).

None of the above solutions worked. For some reason, when I add the MainCamera tag to my VR Rig camera, the screen is black in game.