Is there a way to bind a RenderTexture mip level as a shader texture?

I'm trying to create an eye-space linear depth RenderTexture A with 5 mip levels.
First I convert the scene depth buffer to linear depth and write it to A's mip level 0.
Then I want to use a custom material to downsample A's mip level 0 into mip level 1.
So I want to bind A's mip level 0 as a shader texture while using A's mip level 1 as the render target.
But it doesn't seem to work; the shader texture is null.
Does anyone know how to set this up?
Thanks.

// Try to bind mip 0 of the linear depth texture as the shader input...
RenderTargetIdentifier mipLevelRT = new RenderTargetIdentifier(LinearDepthTex, 0);
cmd.SetGlobalTexture("_LinearDepth", mipLevelRT);
// ...and render into mip 1 of the same texture.
cmd.SetRenderTarget(LinearDepthTex, 1);

I assume you want to do some custom filtering or something during downsampling? Because otherwise you could just use Unity - Scripting API: RenderTexture.GenerateMips.
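If a plain box filter is enough, the built-in mip generation avoids the custom pass entirely. A minimal sketch, assuming the LinearDepthTex RenderTexture from the question:

```
// These flags must be set before the RenderTexture is created.
LinearDepthTex.useMipMap = true;
LinearDepthTex.autoGenerateMips = false; // we trigger generation manually

// ...render the linear depth into mip level 0 as usual...

// Fill mip levels 1..N from mip 0 with the driver's built-in filter.
LinearDepthTex.GenerateMips();
```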

If you do want to do it with your own shader, I'd get 5 RenderTextures via Unity - Scripting API: RenderTexture.GetTemporary, use Unity - Scripting API: Graphics.Blit in a loop with your downsampling shader to fill in the smaller versions, and finally create a new Texture2D and load the RenderTexture data into the individual mip levels.
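The loop above could look roughly like this. A sketch, not a drop-in implementation: `source` and the `downsampleMat` material (whose shader reads `_MainTex` and writes the half-resolution result) are assumed names.

```
const int mipCount = 5;
var mips = new RenderTexture[mipCount];
int w = source.width, h = source.height;
for (int i = 0; i < mipCount; i++)
{
    mips[i] = RenderTexture.GetTemporary(w, h, 0, RenderTextureFormat.RHalf);
    if (i == 0)
        Graphics.Blit(source, mips[0]);                     // full-res linear depth
    else
        Graphics.Blit(mips[i - 1], mips[i], downsampleMat); // custom downsample pass
    w = Mathf.Max(1, w / 2);
    h = Mathf.Max(1, h / 2);
}

// ...copy each mips[i] into mip level i of the final texture, then release:
for (int i = 0; i < mipCount; i++)
    RenderTexture.ReleaseTemporary(mips[i]);
```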

Thanks, this method works, but it costs extra time to copy from the 5 temporary RTs into the final RT.
I'm sure that with D3D or OpenGL directly we can render to mip 1 while binding mip 0 as a shader texture, with no copy pass needed.
The Unity API seems to support this as well, but it doesn't work.
I'd like to know why.

What exactly do you mean by [quote=“chena_cpp, post:1, topic: 874418, username:chena_cpp”]
… the shader texture is null.
[/quote]
Is your mipLevelRT variable null, or do you just get garbage or black when sampling from sampler2D _LinearDepth; in your shader?

How exactly are you filling that RenderTexture with the linear depth, and what format does it use? Also, does this base RenderTexture have an antiAliasing value larger than 1? If so, you might need to call Unity - Scripting API: RenderTexture.ResolveAntiAliasedSurface before reading from the sampler2D.

I would just bind the whole texture and sample the required LOD in the shader.

Can't remember how to do it in Unity, but I would use tex2Dlod in HLSL.

I used RenderDoc to debug, and the texture in the draw call is empty.
cmd.SetGlobalTexture("_LinearDepth", mipLevelRT); should bind the texture.
In RenderDoc I found that the corresponding D3D API call is PSSetShaderResources(0, { No Resource }).
So it failed to bind the texture.

I just render into _LinearDepth to fill it; the format is RenderTextureFormat.RHalf and antiAliasing is 1.

I think Unity doesn't create a dedicated D3D shader resource view for each mip level. Did you try implementing it in a compute shader with RWTexture2D?
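With a compute shader you can bind individual mip levels explicitly, since CommandBuffer.SetComputeTextureParam takes a mip level. A sketch, assuming a downsampleCS compute shader with a "CSDownsample" kernel (8x8 thread groups) that reads _SrcMip and writes _DstMip; the RenderTexture would need enableRandomWrite = true, useMipMap = true, and autoGenerateMips = false:

```
int kernel = downsampleCS.FindKernel("CSDownsample");

// Bind mip 0 as the read texture and mip 1 as the UAV write target.
cmd.SetComputeTextureParam(downsampleCS, kernel, "_SrcMip", LinearDepthTex, 0);
cmd.SetComputeTextureParam(downsampleCS, kernel, "_DstMip", LinearDepthTex, 1);

// Dispatch one thread per texel of mip 1 (half resolution of mip 0).
int w = LinearDepthTex.width  >> 1;
int h = LinearDepthTex.height >> 1;
cmd.DispatchCompute(downsampleCS, kernel, (w + 7) / 8, (h + 7) / 8, 1);
```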

A compute shader certainly provides more flexibility, but it's worth noting that render texture compression is disabled in the compute pipeline, so mipmaps generated by a compute shader may cost more GPU memory bandwidth.

Thanks. I also want this to run on mobile (OpenGL ES 3.0), so I can't use compute shaders.