I am using AR Foundation 5.0.0-pre.12, Unity 2022.1+, and URP 13.1.8, and I am again getting the ‘black screen’ issue on iOS. This has cropped up before with previous combinations of URP and AR Foundation, and in my case I also get a black screen with the AR Foundation 4.2.3 release version.
The camera is running and plane recognition is working, so I know the problem is related to the background renderer. This thread has screenshots for checking all the settings, and I have done that.
Is this a known bug? Are there any ways to work around it and get background rendering working again?
It may be a known issue with URP. Double-check that the active render pipeline has the renderer feature, and check the logs at runtime to make sure the renderer isn’t throwing an IndexOutOfRangeException. If it is, you can resolve it by adding your renderer again in the URP asset and then removing it (this is a workaround). The Editor has been seeing some issues where the default option in serialized lists is at the incorrect index on startup.
If you have run the URP pipeline conversion utility, be aware that it replaces your active render pipeline with default pipelines created by URP to emulate the Built-in Render Pipeline. You would need to add the ARBackgroundRendererFeature to the pipelines that were generated.
Otherwise, please file a bug here and post the tracking link:
The active pipeline does have the renderer feature, but I do not see any console or log messages for index-out-of-range errors. I selected the ‘Balanced’ render pipeline, then removed and re-added the AR Background Renderer a number of times, and it makes no difference.
I have noticed the Editor occasionally losing track of the selected pipeline, and also occasionally randomly unchecking the ARKit system. But even with all of those selected/checked, I still get a black screen.
It is caused by a bug in URP which is being fixed.
As a workaround, AR Foundation 5.0.0-pre.12 added a new feature to set the camera background rendering order to “After Opaques”. Try setting the Rendering Mode of ARCameraManager to “After Opaques” in the Inspector, or use the API: ARCameraManager.requestedBackgroundRenderingMode = CameraBackgroundRenderingMode.AfterOpaques.
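In script form, a minimal sketch of that workaround (assuming an ARCameraManager component exists on the AR camera; the property and enum names are the ones from the AR Foundation 5.0 API quoted above, but the component class itself is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Workaround sketch: request "After Opaques" background rendering at startup.
public class AfterOpaquesBackground : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    void Start()
    {
        if (m_CameraManager != null)
            m_CameraManager.requestedBackgroundRenderingMode =
                CameraBackgroundRenderingMode.AfterOpaques;
    }
}
```

Attach this to any GameObject in the AR scene and assign the ARCameraManager reference in the Inspector; it has the same effect as setting the Rendering Mode dropdown manually.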
I solved this problem by switching the “Render Mode” on ARCameraManager to After Opaques.
I also checked my URP asset and made sure the Rendering Path was Forward and the Depth Texture Mode was After Opaques.
It worked for me.
As I understand from the migration guide (https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/migration-guide-5-x.html), there is a URP version 14.0.2 incompatibility:
When using URP version 14.0.2, there is a bug that causes URP to not respect RendererFeature input requests. This means that even though the ARCameraBackground might request a Camera Depth Texture, URP will not honor the request, and the camera background will overwrite geometry.
To work around this, make sure you are using URP version 14.0.3 or greater.
Editor Version: 2022.2.0b14 AR Foundation: 5.0.2 URP: 14.0.3
Any update on a fix? This is a horrible user experience. I also found that in 2022.2.1f1, with After Opaques set on the AR Camera Manager and in the URP asset, actual geometry in the scene would not render; only the camera background would show in the Editor when using XR Simulation.
Edit: I can confirm this is fixed in 2022.2.3f1 using URP 14.0.4 and AR Foundation 5.0.3. To get it working I had to roll back the workarounds from above:
AR Camera Manager - Render Mode set to ‘Any’
On the URP asset: Rendering > Rendering Path: Forward, Depth Priming: Disabled (the default, I think), and Depth Texture Mode: After Opaques
Yes, sorry, this definitely got lost in the noise. The fix is available in later URP versions; a default URP AR Foundation project should not have any more difficulties.
Here’s a fun one: with URP in 2022.2.3f1, ARF 5.0.3, and URP 14.0.5, if URP Asset > Opaque Layer Mask isn’t set to ‘Everything’, the XR Sim background doesn’t render. In this example I was using a Render Objects renderer feature to make stencils and masks for AR portals.
I am confident that if there were a URP branch, more of these issues would get caught, especially if we assume the Built-in Render Pipeline is going to be deprecated in favor of the upcoming URP/HDRP combined refactor. For now it is a real pain in the ass that we have so many render pipelines, with bugs split across them. That said, URP is a godsend for Shader Graph and renderer features!
I have also found that with Unity 2022.2.4f1, the XR Simulation environments render in the Editor on an M1 (Apple Silicon) Mac, but in the Windows Editor the environment renders black and I get this error:
d3d11: failed to create 2D texture shader resource view id=1928 [D3D error was 80070057]
I will post back the bug report ID for both these bugs.
XR Simulation environments are rendered on Layer 30 by default. If you name Layer 30 you can mask it correctly using Editor UI workflows like this, or change the layer that is used in your XR Simulation Project Settings: XR Simulation project settings | AR Foundation | 5.0.7.
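For anyone who wants to manage that layer from script rather than through the Editor UI, here is a hedged sketch using standard Unity culling-mask bit operations (the layer index 30 is an assumption based on the default mentioned above; adjust it if you changed the layer in your XR Simulation project settings):

```csharp
using UnityEngine;

// Sketch: include or exclude the XR Simulation environment layer
// in a camera's culling mask via bitmask operations.
public static class SimulationLayerUtil
{
    // Assumption: XR Simulation's default environment layer index.
    const int k_SimulationEnvironmentLayer = 30;

    // Make the camera render the simulation environment layer.
    public static void Show(Camera camera) =>
        camera.cullingMask |= 1 << k_SimulationEnvironmentLayer;

    // Stop the camera from rendering the simulation environment layer.
    public static void Hide(Camera camera) =>
        camera.cullingMask &= ~(1 << k_SimulationEnvironmentLayer);
}
```

This is the same mask the Editor UI edits when you name the layer and use the Culling Mask dropdown, just expressed in code.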
There was some debate internally about automatically naming the layer, as it could conflict with your existing settings, but I’ll revive that conversation since our current solution is problematic for the reasons you describe. (And indeed, if there is a layer conflict, not naming the layer doesn’t solve the issue but simply obscures it.)
Thanks for this report. To clarify, this is Windows 2022.2.4f1 with URP?