Camera not moving and no hover effects

Hi! I am creating an immersive app, but some basic things are not working. First, I added an XR Rig, but the camera doesn’t move at all with head movement. Is there something else I need to do to enable 6DoF?

Also, I have a sprite and a cube with PolySpatialHoverEffect, but nothing happens when I point the cursor at either object in the simulator. It worked when I was in shared space mode. Is there something I need to do to enable this in immersive mode?

Finally, I have PolySpatial turned off in settings, which could be contributing to the problem. But if I turn it on while keeping immersive mode, the app builds but crashes immediately at runtime with this error:

[AssetManager] Could not find shader Universal Render Pipeline/Unlit.
dyld[13584]: missing symbol called

Using Xcode 15.1 beta 1, Unity 2022.3.11, VP sim 21N5259j.

Thanks!

Hi,

Just to clarify, is this a VR application (fully immersive) or a mixed reality application? If it is a VR application, try adding an ARSession to the scene and ensure the TrackedPoseDriver’s input bindings contain references to devicePosition [AR Handheld Device] and deviceRotation [AR Handheld Device].
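
In case it helps, here is a minimal sketch of that setup done from script, assuming AR Foundation and the Input System package are installed. The same bindings can of course be configured in the Inspector instead; the `<HandheldARInputDevice>` binding paths below correspond to the [AR Handheld Device] entries shown there:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;
using UnityEngine.XR.ARFoundation;

public class VRRigSetup : MonoBehaviour
{
    void Awake()
    {
        // Without an ARSession in the scene, no tracking data is produced
        // and the camera stays fixed.
        if (FindObjectOfType<ARSession>() == null)
            new GameObject("AR Session").AddComponent<ARSession>();

        var driver = Camera.main.gameObject.AddComponent<TrackedPoseDriver>();

        // These paths appear in the Inspector as
        // "devicePosition [AR Handheld Device]" / "deviceRotation [AR Handheld Device]".
        var position = new InputAction(binding: "<HandheldARInputDevice>/devicePosition");
        var rotation = new InputAction(binding: "<HandheldARInputDevice>/deviceRotation");
        position.Enable();
        rotation.Enable();

        driver.positionInput = new InputActionProperty(position);
        driver.rotationInput = new InputActionProperty(rotation);
    }
}
```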

For PolySpatialHoverEffect, that will not work with fully immersive applications. The PolySpatial features work with shared space/mixed reality applications, and not with fully immersive.

Thank you so much for your help! Adding ARSession fixed the camera problem.

The app is indeed VR. Would I still be able to use look and tap to select UI buttons? If not, what’s the best practice for how to select UI elements?

Also, I get spammed with these messages seemingly every frame in Xcode when the app is running in the simulator: “Presenting a drawable without a device anchor. This drawable won’t be presented.”

This is a miss on our part. Gaze input was broken/removed prior to our initial beta release. It should be available in the next release of com.unity.xr.visionos.

That is expected. It’s a little annoying, but you should be able to filter the log view to stdio to see just the Unity messages.

In the meantime, you can try using skeletal hand tracking to detect a pinch, and then use the device head pose to figure out which direction the user is facing. With those two, you should be able to do basic UI interaction. There is an example of using skeletal hand data to detect a pinch in the MixedReality scene in the samples - the general logic should be the same for fully immersive.
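
A rough sketch of that approach using the XR Hands package, assuming an XRHandSubsystem is already running. The 0.02 m pinch threshold and the 10 m ray length are arbitrary values to tune:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class PinchGazeInteractor : MonoBehaviour
{
    public Transform head; // e.g. the Main Camera transform under the XR rig

    XRHandSubsystem m_Hands;

    void Update()
    {
        // Lazily grab the running XRHandSubsystem.
        if (m_Hands == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Hands = subsystems[0];
        }

        // Pinch + head-forward ray = crude gaze-and-pinch selection.
        if (IsPinching(m_Hands.rightHand) &&
            Physics.Raycast(head.position, head.forward, out var hit, 10f))
        {
            Debug.Log($"Pinch-selected {hit.collider.name}");
        }
    }

    static bool IsPinching(XRHand hand)
    {
        if (!hand.isTracked)
            return false;

        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);
        if (!thumb.TryGetPose(out var thumbPose) || !index.TryGetPose(out var indexPose))
            return false;

        // Treat thumb tip and index tip closer than ~2 cm as a pinch.
        return Vector3.Distance(thumbPose.position, indexPose.position) < 0.02f;
    }
}
```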

So eventually, we will be able to do the same kind of gaze and pinch with the built-in hover effect in VR as in mixed reality? Or is that not possible in the privacy-preserving way that Apple wants?

Yes. In the next release you will be able to use gaze/pinch as an input mechanism in VR. When the system detects a pinch, you will get the following information through an input system (package) device:

  • The “device pose” representing a position and rotation where your thumb and index finger meet. This updates throughout the gesture as you move your hand.
  • The origin and direction of the initial gaze vector when the pinch first occurred. This retains the same value throughout the gesture.

The device will be very similar to the PolySpatialTouchSpace device (since renamed to SpatialPointerDevice) described here. The main difference is that there will be no interactionLocation or targetId information, since there are no RealityKit objects in VR.
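
Purely as illustration, reading that data through the Input System package might look something like the sketch below. The layout and control names ("VisionOSSpatialPointer", devicePosition, startRayOrigin, startRayDirection) are hypothetical placeholders for the unreleased device, not a confirmed API:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PinchInputReader : MonoBehaviour
{
    InputAction m_PinchPosition;
    InputAction m_GazeOrigin;
    InputAction m_GazeDirection;

    void OnEnable()
    {
        // Hypothetical binding paths; substitute the real layout once released.
        m_PinchPosition = new InputAction(binding: "<VisionOSSpatialPointer>/devicePosition");
        m_GazeOrigin = new InputAction(binding: "<VisionOSSpatialPointer>/startRayOrigin");
        m_GazeDirection = new InputAction(binding: "<VisionOSSpatialPointer>/startRayDirection");
        m_PinchPosition.Enable();
        m_GazeOrigin.Enable();
        m_GazeDirection.Enable();
    }

    void Update()
    {
        // The pinch point updates each frame as the hand moves; the gaze ray
        // keeps the value it had when the pinch first occurred.
        var pinchPosition = m_PinchPosition.ReadValue<Vector3>();
        var gazeRay = new Ray(m_GazeOrigin.ReadValue<Vector3>(),
                              m_GazeDirection.ReadValue<Vector3>());
        Debug.DrawRay(gazeRay.origin, gazeRay.direction, Color.green);
    }
}
```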

@mtschoen Any updates on this? When should we expect the next release to be available? We’re struggling to get interaction to work in a fully immersive app…

When using the XRHands package, is it correct that no XR.InputDevices show up? All I can find is XRHands from the new Input System, but no devices from InputDevices.GetDevices().

If you want a working example of how to access hand tracking data using the XRHands package for a fully immersive app, see this test project I posted a couple of weeks back:

It hasn’t been updated for the latest 0.4.3 update, and I can’t promise the most elegant code, but it’s something you can look at if you are wondering what pieces you need to get things working on device.

Thanks, @puddle_mike! It’s something along those lines I have. I can get a reference to the XRHandSubsystem and read hand data from it, but I had expected an XR.InputDevice to show up that could be requested via, for instance, InputDevices.GetDevicesWithCharacteristics. But none does.

Very soon! We’re doing our best to get it out ASAP. 🙂

@puddle_mike, thanks as always for helping out your fellow users! We’re still catching up on incorporating VR mode into our sample content, but it’s great to have examples like this in the meantime.

That is expected. Assuming we’re talking about the same thing, the “Feature API”, including XR.InputDevice, is considered “legacy” and is no longer supported by newer XR integrations like visionOS and OpenXR. The XR Hands package is the way to get hand data going forward, and the Input System package will be the way to hook into gaze/pinch input when it is ready.
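
For reference, a short sketch of the difference in practice; the legacy query comes back empty on visionOS, while SubsystemManager finds the running XRHandSubsystem:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Hands;

public static class HandsLookup
{
    public static XRHandSubsystem GetHands()
    {
        // Legacy "Feature API": stays empty with the visionOS integration.
        var legacy = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HandTracking, legacy);
        Debug.Log($"Legacy devices found: {legacy.Count}"); // expect 0

        // Supported path: query the running XRHandSubsystem instead.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        return subsystems.Count > 0 ? subsystems[0] : null;
    }
}
```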

Ok, thanks for the clarity. Looking forward to the release then. 👍😀

Ya, I got that warning message too. Nice to know it’s not a problem.

Hello, I am new to Unity. I have an XR Rig camera in my VR scene. This camera does not move at all. I tried the solution (ARSession) as well, but nothing changed. Can you explain in more detail how you did it? I even tried Unity’s VR template, but again no result. My headset is a Varjo XR-3. It would be highly appreciated if you could help me.

This is a Vision Pro forum.