I’m trying to sample the camera depth texture inside a compute shader for occlusion culling. I’ve been fetching the texture via Shader.GetGlobalTexture("_CameraDepthTexture") and assigning it with computeShader.SetTexture().
My code worked with Unity 2021 and prior, but after updating to Unity 6 it stopped working. Debugging the texture with a RawImage shows that the depth texture doesn’t seem to be generated anymore: the texture is named UnityBlack, is 4x4 px in size and is, as the name suggests, completely black.
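For reference, this is roughly what the old setup looked like (simplified; the kernel name, the _DepthTexture property and the dispatch size are just placeholders from my project):

using UnityEngine;

public class OcclusionCulling : MonoBehaviour
{
    [SerializeField] ComputeShader _cullingShader;

    void Update()
    {
        // Worked up to Unity 2021: grab URP's depth texture from the global shader state...
        Texture depth = Shader.GetGlobalTexture("_CameraDepthTexture");
        if (depth == null)
            return;

        // ...and bind it directly to the compute shader.
        int kernel = _cullingShader.FindKernel("CSMain");
        _cullingShader.SetTexture(kernel, "_DepthTexture", depth);
        _cullingShader.Dispatch(kernel, 64, 1, 1);
    }
}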
What I’ve already set up:
- Set the Depth Texture mode to On on the camera.
- Checked Depth Texture on the render pipeline asset.
- Set _cam.depthTextureMode = DepthTextureMode.Depth; via script.
What else am I missing?
I’ve also used the Frame Debugger, and I can see that the CopyDepth pass seems to have a correct depth texture.
Hi
In Unity 6 we introduced the new Render Graph system to URP.
With that, we have changed how global textures are handled:
“Additionally, we are improving the use of global textures in URP. We’ve discovered a number of bugs where a material in a Render Pass uses a global texture that hasn’t been set yet in the current frame. This error is hidden because the pass picks up the global texture from the previous frame. These issues are very hard to track down, since the content from the previous frame is the same as the current frame in static test scenarios. […] To avoid these hard to debug problems altogether, we will automatically unset any global texture that has been set using the RG API” (source).
This is now the default behavior with Render Graph: any global texture set through the Render Graph API is automatically unset (replaced with a black texture) once the RenderGraph has finished executing, which is why Shader.GetGlobalTexture now returns the small black UnityBlack placeholder.
We have several samples showing how to properly access these resources and work with Render Graph, and the documentation linked above helps you understand the system. Just open the Package Manager and navigate to the URP package; in the Samples tab you can import the “URP Render Graph Samples”.
There you will find the OutputTexture sample, which can help you. It shows how to get the different URP resources (e.g. cameraDepthTexture) and assign them to a texture in a material. See the code here.
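Condensed, the relevant part of that sample’s RecordRenderGraph looks roughly like this (m_Material and m_TextureName are the sample’s renderer feature settings):

public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
{
    // Fetch UniversalResourceData from frameData to get URP's texture handles for this frame.
    var resourceData = frameData.Get<UniversalResourceData>();

    // Pick the resource to inspect, e.g. the camera depth texture.
    TextureHandle source = resourceData.cameraDepthTexture;

    // Blit the selected resource into the camera's color target through the sample's material.
    var para = new RenderGraphUtils.BlitMaterialParameters(source, resourceData.activeColorTexture, m_Material, 0);
    para.sourceTexturePropertyID = Shader.PropertyToID(m_TextureName);
    renderGraph.AddBlitPass(para, passName: "Blit Selected Resource");
}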
You can also use the big Render Graph thread to ask further questions specific to that topic.
Best,
Oliver
Thank you for the answer. I’ve looked at the OutputTexture example, but I can’t quite figure out how to access the depth texture in my compute shader. I’ve condensed the sample down to the parts that are relevant for me, but I only ever get a TextureHandle instead of an actual Texture.
Also, would it be correct to reassign the global shader texture in this pass so that I can set it on my compute shader via script? I’m not sure if that’s the right way to do it (I’ve added a rough sketch of what I’m aiming for below my code).
Would be great if you or someone else could help me out again on this.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.Universal;

public class DepthTextureFeature : ScriptableRendererFeature
{
    // Pass which outputs a texture from rendering so it can be inspected.
    class OutputTexturePass : ScriptableRenderPass
    {
        // Sets up ConfigureInput() and transfers the renderer feature settings to the render pass.
        public void Setup()
        {
            ConfigureInput(ScriptableRenderPassInput.Depth);
        }

        // Records a render graph render pass which blits the selected resource back to the camera's color attachment.
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            // Fetch UniversalResourceData from frameData to retrieve URP's texture handles.
            var resourceData = frameData.Get<UniversalResourceData>();

            // Fetch the depth texture handle from resourceData.
            var source = resourceData.cameraDepthTexture;
            if (!source.IsValid())
            {
                Debug.Log("Input texture is not created. Likely the pass event is before the creation of the resource. Skipping OutputTexturePass.");
                return;
            }

            Debug.Log(resourceData.activeDepthTexture);

            // RenderGraphUtils.BlitMaterialParameters para = new(source, resourceData.activeColorTexture, m_Material, 0);
            // para.sourceTexturePropertyID = Shader.PropertyToID(m_TextureName);
            // renderGraph.AddBlitPass(para, passName: "Blit Selected Resource");
        }
    }

    // Inputs in the inspector to change the settings for the renderer feature.
    [SerializeField]
    RenderPassEvent m_PassEvent = RenderPassEvent.AfterRenderingOpaques;

    OutputTexturePass m_ScriptablePass;

    /// <inheritdoc/>
    public override void Create()
    {
        m_ScriptablePass = new OutputTexturePass();
        // Configures where the render pass should be injected.
        m_ScriptablePass.renderPassEvent = m_PassEvent;
    }

    // Here you can inject one or multiple render passes in the renderer.
    // This method is called when setting up the renderer, once per camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        // Sets up the render pass and transfers the data from the renderer feature to the render pass.
        m_ScriptablePass.Setup();
        renderer.EnqueuePass(m_ScriptablePass);
    }
}
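And this is the rough sketch of where I’d like to end up: a compute pass that binds the depth handle directly (it would be hooked up in the renderer feature the same way as above). The kernel name CSMain, the _DepthTexture property and the dispatch size are placeholders from my project, and I’m not sure whether binding the TextureHandle inside AddComputePass like this is the intended way:

// Same using directives as the feature above.
class OcclusionComputePass : ScriptableRenderPass
{
    class PassData
    {
        public ComputeShader cs;
        public int kernel;
        public TextureHandle depth;
    }

    readonly ComputeShader m_ComputeShader;

    public OcclusionComputePass(ComputeShader cs)
    {
        m_ComputeShader = cs;
    }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var resourceData = frameData.Get<UniversalResourceData>();
        TextureHandle depth = resourceData.cameraDepthTexture;
        if (m_ComputeShader == null || !depth.IsValid())
            return;

        using (var builder = renderGraph.AddComputePass("Occlusion Culling", out PassData passData))
        {
            passData.cs = m_ComputeShader;
            passData.kernel = m_ComputeShader.FindKernel("CSMain");
            passData.depth = depth;

            // Declare the read so Render Graph keeps the depth texture alive for this pass.
            builder.UseTexture(depth, AccessFlags.Read);
            // The pass writes no Render Graph resource, so stop it from being culled.
            builder.AllowPassCulling(false);

            builder.SetRenderFunc((PassData data, ComputeGraphContext ctx) =>
            {
                // Bind the handle to the compute shader and dispatch (placeholder group counts).
                ctx.cmd.SetComputeTextureParam(data.cs, data.kernel, "_DepthTexture", data.depth);
                ctx.cmd.DispatchCompute(data.cs, data.kernel, 64, 1, 1);
            });
        }
    }
}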
Okay, small update. I sort of got it working, but with a different approach.
Inside OnRenderObject, the depth texture is still accessible as a global texture, so I can assign it to my compute shader there and do the occlusion culling. After that is done, I can correctly render all my meshes with Graphics.DrawMeshInstancedIndirect during the Update loop.
The only downside is that the instances now seem to flicker every now and then, and I couldn’t figure out why yet.
Edit: The flicker occurs because Update is called before OnRenderObject runs. I solved it by only fetching the depth texture in OnRenderObject and keeping the logic inside Update; I just need to check that the texture is not null before I start rendering.
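In case someone runs into the same issue, here is roughly what the final workaround looks like (simplified; the kernel/property names, dispatch size and args buffer setup are placeholders, and the actual per-instance culling buffers are omitted):

using UnityEngine;

public class IndirectOcclusionCulling : MonoBehaviour
{
    [SerializeField] ComputeShader _cullingShader;
    [SerializeField] Mesh _mesh;
    [SerializeField] Material _material;
    [SerializeField] int _instanceCount = 1000;

    ComputeBuffer _argsBuffer;
    Bounds _bounds = new Bounds(Vector3.zero, Vector3.one * 1000f);
    Texture _depthTexture;

    void Start()
    {
        // Indirect args: index count, instance count, start index, base vertex, start instance.
        uint[] args = { _mesh.GetIndexCount(0), (uint)_instanceCount, _mesh.GetIndexStart(0), _mesh.GetBaseVertex(0), 0u };
        _argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint), ComputeBufferType.IndirectArguments);
        _argsBuffer.SetData(args);
    }

    void OnRenderObject()
    {
        // At this point URP's depth texture is still bound as a global, so only cache it here.
        _depthTexture = Shader.GetGlobalTexture("_CameraDepthTexture");
    }

    void Update()
    {
        // Skip the first frame(s) until OnRenderObject has cached a valid depth texture.
        if (_depthTexture == null)
            return;

        // Run the occlusion culling against the cached depth texture.
        int kernel = _cullingShader.FindKernel("CSMain");
        _cullingShader.SetTexture(kernel, "_DepthTexture", _depthTexture);
        _cullingShader.Dispatch(kernel, 64, 1, 1);

        // Render the surviving instances.
        Graphics.DrawMeshInstancedIndirect(_mesh, 0, _material, _bounds, _argsBuffer);
    }

    void OnDestroy()
    {
        _argsBuffer?.Release();
    }
}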