Correct setup for Unity UI

Sorry for the late reply.

The short story is that you need to move your input camera back (in the case of this project, the main camera) so that all the UI elements are visible to it. That will fix the issue.

The longer story has to do with the fact that we need a camera to do raycast input with, but whether things are visible to that camera is not at all reflected in what is visible to you in the actual world you have set up. This is complicated by the fact that we have no idea what things like camera dimensions or ‘screen size’ are in visionOS, so we can’t really compensate for them directly. We are looking into ways to get around this, but for now this should fix the situation.
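If it helps to verify this, here is a minimal sketch that warns when a UI element falls outside the input camera’s frustum. The class and field names are illustrative, not part of PolySpatial:

    using UnityEngine;

    // Illustrative helper (not PolySpatial API): warns when a UI element is
    // outside the input camera's view frustum, which makes hit testing fail.
    public class UiFrustumCheck : MonoBehaviour
    {
        public Camera inputCamera;      // the camera used for UI raycasts
        public RectTransform uiElement; // e.g. your Button's RectTransform

        void Update()
        {
            var corners = new Vector3[4];
            uiElement.GetWorldCorners(corners);

            // Build a world-space bounding box from the rect's corners.
            var bounds = new Bounds(corners[0], Vector3.zero);
            for (var i = 1; i < 4; i++)
                bounds.Encapsulate(corners[i]);

            var planes = GeometryUtility.CalculateFrustumPlanes(inputCamera);
            if (!GeometryUtility.TestPlanesAABB(planes, bounds))
                Debug.LogWarning($"{uiElement.name} is outside {inputCamera.name}'s frustum; it won't receive input.");
        }
    }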

Hey @joejo! Can you explain some more where to position a Canvas’s EventCamera so it is able to recognize click events?

We’re still struggling to get our game’s UI to react to clicks. The elements do get the hover effect, but they don’t react when clicked, neither in the simulator nor on device. So I looked at it again in a simple project based on the UnityUI project you sent, but I have a hard time understanding where the EventCamera has to be positioned.

In my experiments it works if the EventCamera is positioned at the edge of the VolumeCamera. E.g. when the VolumeCamera sits at (0,0,0) and has Dimensions set to (1,1,1) (with a Canvas and UI Button inside this volume) and I position the EventCamera at (0,0,-0.5), it works as expected. When I move it farther away it gradually stops working, and if I move it along the y-axis only part of the button works, which is kind of expected, but I don’t fully understand how they’re related or where exactly I should position this EventCamera in relation to the Canvas in the volume.
Then I even removed the camera from Canvas.EventCamera and the button still reacted to clicks, which doesn’t make sense to me anymore. If there’s no EventCamera set, does Unity fall back to the VolumeCamera for raycasting the UI elements, or something like that?

Here’s the example project I worked with. It works if you start it as-is, with no EventCamera set on the Canvas. Once you set the camera called UICamera as the Canvas’s EventCamera, the button doesn’t work anymore. If you move that camera to (0,0,-0.5), it works the same way as when it’s not set in the Canvas…

I would like to understand this relationship so I can better figure out why it doesn’t work in our full project. Any help or explanations would be appreciated! Thanks!


In my testing today I found that, as far as I can tell, the Canvas only uses the camera tagged ‘MainCamera’ for raycasting / registering events, and it doesn’t matter what camera is set as Canvas.EventCamera. This is not how it should be, right?
This means the camera tagged ‘MainCamera’ has to see the UI, otherwise the UI won’t react to clicks.
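Until that’s fixed, a minimal workaround sketch based on this observation (uiCamera is an illustrative field, not an existing API) is to give the camera that frames your UI the MainCamera tag:

    using UnityEngine;

    // Workaround sketch: tag the camera that frames the UI as MainCamera
    // so that UI raycasting picks it up.
    public class TagUiCameraAsMain : MonoBehaviour
    {
        public Camera uiCamera;

        void Awake()
        {
            uiCamera.gameObject.tag = "MainCamera";
        }
    }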

No, that is not how it should be. If you look at where it sets the camera, it should set it as the camera on the event system. But the raycast is only part of the equation, as there is also the SpatialPointerEventListener and the camera it uses. This is part of the fix that we are putting out in the next release.

As far as the relationship goes, the raycaster should use the EventSystem camera, as you noted, and that camera will only raycast against things that are within its view frustum. So if the UGUI elements are outside the bounds of what your camera can see then, regardless of what is visible to you, hit testing will fail.

If you think there is an issue with it using the wrong camera, you can always try to create your own raycaster derived from GraphicRaycaster and add that to the Canvases that you want to use/test with.

Something like this may work for you:

    using UnityEngine;
    using UnityEngine.UI;

    // Replacement raycaster that lets you override which camera is used
    // for hit testing on a per-Canvas basis.
    public class MyTestRaycaster : GraphicRaycaster
    {
        // Assign the camera you want to raycast with in the inspector.
        public Camera MyCamera;

        public override Camera eventCamera
        {
            get
            {
                if (MyCamera != null)
                    return MyCamera;

                // Fall back to the default GraphicRaycaster behaviour.
                return base.eventCamera;
            }
        }
    }
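If you go this route, remove or disable the existing GraphicRaycaster on the Canvas you are testing, add MyTestRaycaster in its place, and assign the camera you want to test to MyCamera in the inspector.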

Thanks for the info. We have worked around this issue by tagging the camera that always has the UI in its frustum with ‘MainCamera’.
But if I understand you correctly, the issue that Canvas.EventCamera is ignored and that the system always uses the MainCamera for raycasting will be fixed in the next release? Good to know!

And is the SpatialPointerEventListener responsible for the hover effect on visionOS? Because that always worked on our UI elements, but they never registered the click events.

Hi @joejo, we have updated to PolySpatial 0.7.1 and unfortunately this breaks our UI setup again. In 0.6.3 our UI worked when we tagged our UI camera, which is set as the EventCamera of our Canvas, with ‘MainCamera’. But in 0.7.1 this doesn’t work anymore, nor does setting this camera as the EventCamera of our Canvas.
I again compared our test projects to the UnityUI project you shared here, but couldn’t find out what the differences are. :frowning:

I have filed a bug with a test project attached: CASE IN-64089.
In the test project the button only reacts to clicks when it’s positioned at (0,0,0), and not otherwise.

Wishing you happy holidays and looking forward to your response once you have time to look at this! :christmas_tree:
Best,
Felix

I’m seeing a similar issue with 0.7.1. A setup that works in 0.6.3 does not in 0.7.1. Inputs only seem to work if the UI is within 1 unit from the origin or if the volume camera is set to bounded.

I think I found a good workaround. Seems like syncing the transform of VolumeCamera.BackingCamera with the main camera works.

Thanks for the tip. I hadn’t seen VolumeCamera.BackingCamera before, good to know it exists! Sadly your workaround doesn’t work for my test scene. :frowning: Not sure I understood you correctly, but here’s what I tried in a quick test:

  • We have a camera called “UI Camera” that is set as the EventCamera of our Canvas.
  • On this camera I have a component that sets the position and rotation of this “UI Camera” to the position and rotation of the BackingCamera. Like this:
    // Copy the backing camera's pose onto this UI camera every frame.
    void Update()
    {
        volumeCamera.BackingCamera.transform.GetPositionAndRotation(out Vector3 vCamPos, out Quaternion vCamRot);
        transform.SetPositionAndRotation(vCamPos, vCamRot);
    }

This did not help in our test project. Is that (kinda) what you’re doing for your workaround? Do you also have a camera set as Canvas.EventCamera? Is that camera also tagged as ‘MainCamera’?

In addition to syncing the position and rotation, I’m also doing the same thing for scale due to how some of the objects are set up in my scene. I’m not sure if you’ll also need to do that.

Canvas.EventCamera isn’t set for me and I’m using the camera at Camera.main to set up the backing camera. I’m basically only using a single camera (tagged as MainCamera) for the project.

Doing the above works both in the 0.7.1 template project (with additional objects for UGUI) and the project I’m working on.

Hey, I was running into the same problem and wanted to try the solution you posted. It seemed backwards to me, and when I ran the following code instead, it resolved my UI input issues:

    public void Update()
    {
        if (_mainCamera.transform.hasChanged)
        {
            // Copy the main camera's pose onto the volume's backing camera.
            _mainCamera.transform.GetPositionAndRotation(out Vector3 pos, out Quaternion rot);
            _volumeCamera.BackingCamera.transform.SetPositionAndRotation(pos, rot);

            // hasChanged stays true until it is reset, so clear it after consuming it.
            _mainCamera.transform.hasChanged = false;
        }
    }

_mainCamera and _volumeCamera are [SerializeField]s I assigned in my main scene.
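For completeness, here is a sketch of how those fields might be declared; the class name is hypothetical, and this assumes PolySpatial’s VolumeCamera component (the one exposing BackingCamera, as used above):

    using Unity.PolySpatial;
    using UnityEngine;

    // Hypothetical container for the Update() method above.
    public class BackingCameraSync : MonoBehaviour
    {
        [SerializeField] private Camera _mainCamera;         // the camera tagged MainCamera
        [SerializeField] private VolumeCamera _volumeCamera; // exposes BackingCamera

        // ... Update() from the snippet above goes here.
    }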

Hope that helps :smiley:

Good catch, yeah looks like @mml was setting the wrong camera’s transform.

Sorry for the long delay, I was out of office till yesterday.

I looked into the repro case you added to IN-64089, and while I could repro the issue in that project, I was having a lot of other problems with it that forced me to stop using it (pink/bad textures, a crash on start in the editor if I touched anything in the hierarchy).

So, I created a new instance of the repro scene, using your button handler script, based on the 0.7.1 template. The hierarchy and setup SHOULD be the same, and it worked just fine. I then added another button and put the two at opposite corners so I could toggle the image. This also works.

The zip of the project can be found here.

Note that this doesn’t resolve the issue of buttons more than 1 unit from the center. That issue has yet to be looked at. I’m not sure if there is an incident filed for that specific issue (and your workaround here of syncing the camera → volume camera backing camera), but if you could file one, that would be helpful.

@AdamSanche thanks for the info! Unfortunately swapping the cameras doesn’t change anything in my case…


Hi @joejo ! Welcome back, I hope you had relaxing holidays!
I also had pink textures in the editor, but I assumed this was because of the Built-in Render Pipeline, which we have to use for our project. To be honest, I’m testing mostly in the simulator, because I can’t rely on rendering in the editor.
When I switch the project you uploaded to the Built-in Render Pipeline, the clicks don’t work anymore. If I remember correctly, I also tried my test project with URP before Christmas and it worked. So maybe the problem is the Built-in Render Pipeline?
Unfortunately there’s no way we can switch our project to URP for visionOS in time, and as we’re still trying to make a build for the launch, we’re very much under time pressure here. Our fallback for now is to make a build without Unity UI and re-create the most necessary UI for our game in Swift…

Ah, I was confused, as there were URP assets in there and it looked like URP was hooked up. Regardless… hit testing and URP/BiRP should have no direct interaction in any way, so the fact that you are seeing this is… surprising.

To clarify, all you did was remove URP from Graphics and remove the URP package(s)?

Yes, all I did was remove the UniversalRenderPipelineAsset from the Graphics settings as well as the UniversalRenderPipelineGlobalSettings from the URP Global Settings. I did not remove these assets from the project. (That’s probably also why they were still in my own project, because I remember switching it to URP, and then clicks worked in the simulator :man_shrugging:.) After removing them I did a new simulator build and the clicks didn’t work.
I’m running Unity 2022.3.15f1, PolySpatial 0.7.1 and Xcode 15.2 beta with the visionOS Simulator 21N305.

Yep, reproing locally now as well and looking into what the issue is. Thanks.

For your case, can you try disabling the Apple visionOS loader in Player Settings → XR Plug-in Management and test this again?

Hi joejo! I disabled the Apple visionOS plug-in provider and it fixed my test project, meaning I could click the button even when it’s not positioned at (0,0,0), and also without syncing the UI camera’s position to that of the VolumeCamera’s backing camera (or vice versa). So: yay! But: is it safe to just disable that plug-in provider/loader? What consequences does it have for the project? I’m guessing we can’t use the XR Hands system without it, for example…
I did not yet try it in our main project. That project is now in a state where I can’t quickly test this, because we’re trying to ship a build asap in the next two weeks, and for that build I removed all Canvases and Unity UI elements. I’m not sure I can try it in our full project before we ship…