I have a VR application updated to support Apple Vision Pro (AVP), and it runs and renders in the simulator. When rebuilding the same project targeting the device and attempting to run it on a kit, nothing renders. As a test, I downloaded the Unity VR Core project, added AVP support, and found that the same issue occurs. In both cases, the application launches on device and the Loading window is shown, but the user never seems to be placed in "immersive mode" and can still see the passthrough even after closing the floating Loading window.
I am currently at an Apple Developer Lab, where I have been working with an Apple engineer with Unity knowledge to troubleshoot. We experimented further with the Unity VR Core project template and found that adding an ARSession component to the main scene resulted in the user being loaded into a black void with the Loading window displayed, but nothing else renders.
Looking at an Xcode GPU capture I can see that rendering is occurring up to the point that it needs to be presented. You can find the GPU capture here.
I will be at the Apple Developer Lab all week so it would be great to be able to get some help resolving this issue while I still have access to a device.
- Unity 2022.3.9f1
- Xcode 15 Beta 8
- visionOS 1.0 SDK (21N5233e)
- Simulator: visionOS 1.0 (21N5233e)
- Device: visionOS 1.0 (21N5233f)
This is usually due to depth not being written (or being cleared before it reaches the compositor) for some reason. Try disabling post-processing, HDR, or any other effect that may not work with depth. We have a fairly simple test project that is run on each release to make sure the basics still work while we scale up our support. Now that you've mentioned the template project, we've confirmed it renders black on device. We'll look deeper into it tomorrow and get something similar into our release testing once it's resolved.
For the template project, we found that the render pipeline had the depth texture disabled. Changing the camera's Rendering → Depth Texture setting to "On" fixed it:
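If you prefer to control this from script rather than the inspector, a rough equivalent looks like the sketch below. It assumes the project uses the Universal Render Pipeline; the component name EnableDepthTexture is just illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Sketch: mirrors the Camera inspector's Rendering → Depth Texture = "On"
// setting for a URP camera. Attach to the camera in question.
public class EnableDepthTexture : MonoBehaviour
{
    void Awake()
    {
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        // CameraOverrideOption.On corresponds to "On" in the inspector dropdown,
        // overriding whatever the pipeline asset specifies.
        cameraData.requiresDepthOption = CameraOverrideOption.On;
    }
}
```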
Let us know if you have any luck with your project.
Please also note that you'll need to use AR Handheld Device instead of XR HMD for your TrackedPoseDriver on the camera.
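For reference, wiring that up from script might look like the following sketch. It assumes the Input System's TrackedPoseDriver and the standard HandheldARInputDevice binding paths from AR Foundation; verify the paths against your installed package versions.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

// Sketch: point the camera's TrackedPoseDriver at the AR Handheld Device
// controls instead of the <XRHMD> ones. Attach to the camera object that
// already has a TrackedPoseDriver component.
public class ConfigurePoseDriver : MonoBehaviour
{
    void Awake()
    {
        var driver = GetComponent<TrackedPoseDriver>();
        driver.positionInput = new InputActionProperty(
            new InputAction(binding: "<HandheldARInputDevice>/devicePosition"));
        driver.rotationInput = new InputActionProperty(
            new InputAction(binding: "<HandheldARInputDevice>/deviceRotation"));
    }
}
```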
Thanks, @thep3000. That did of course solve the issue for me with the template; however, my VR game is using the built-in render pipeline with shadows turned on.
OK, I got my VR game to render on device. The ARSession component was missing, and adding it resolved the issue. Should this component be required for a VR app? Is there documentation I missed indicating that it's required? It doesn't seem clear, although I suppose it does make sense, since the head pose information comes from the AR device.
Yes. AR Session is required for visionOS VR apps because the concept of a session is shared with AR features. Unfortunately, we're still catching up on documentation, so I'm not sure this is actually written down anywhere. We'll be sure to include this detail in future documentation releases.
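Until that's documented, a defensive sketch like the one below can guard against the component being missing. ARSession comes from AR Foundation (com.unity.xr.arfoundation); the component name EnsureARSession is illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: create an ARSession at startup if the scene doesn't already
// contain one. Without it, visionOS VR apps fail to render.
public class EnsureARSession : MonoBehaviour
{
    void Awake()
    {
        if (FindObjectOfType<ARSession>() == null)
        {
            var go = new GameObject("AR Session");
            go.AddComponent<ARSession>();
        }
    }
}
```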
To be clear, are you still having any issues now that your scene has an AR Session component?
My application now renders, but only in the left eye, even though all of our shaders have Single Pass Instanced (SPI) rendering enabled and our main shader is a surface shader. I am about to start a new thread for this issue.
That sounds like it could be the RenderMode setting. If you try to build a multipass app to device, it will only render correctly in one eye. The other eye will be distorted, but not blank/black.
The application is configured to build targeting the Device so the Render Mode is forced to Single Pass Instanced. I did make a build with the Render Mode forced to Multi Pass and I do see the issue you mentioned. Unfortunately, with the Device Target set to Device and the Render Mode set to Single Pass Instanced I am still seeing the right eye rendering completely black.
OK. Rendering black is probably not the RenderMode issue. The Render Mode setting ultimately boils down to the visionos_config.h header included in the Xcode project. If that has #define VISIONOS_SINGLE_PASS 1 for a device build, then render mode is not the issue.
I think we’ll need a repro project to get to the bottom of this one.
@mtschoen I created a new thread for the left eye only rendering issue and attached a repro project via a Google Drive link there. The repro project is just the standard Unity 3D Core template updated for VR support.
The new thread can be found here.