Hi guys! I originally posted this on the shader lab forums but didn’t get a response, so I’ll try here now. I’ve been struggling with this since Unity 4.0 came out, and as far as I can tell, it hasn’t been resolved yet.
How does one write to an RWTexture in a pixel shader during regular camera rendering? Has anyone ever gotten this to work? Looking at Aras’s examples, he only ever uses Graphics.Blit. Searching the forums leads me to believe that Graphics.SetRandomWriteTargets doesn’t work with camera rendering, at least not for RWTextures.
I can post my code if you’d like, but I’d really like some confirmation of whether this works or not. I’ve been able to fudge it somewhat by using an RWStructuredBuffer (which does work with UAV writes in a pixel shader), but I would really enjoy the convenience of using an RWTexture.
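For what it’s worth, the structured-buffer fudge looks roughly like this on the shader side (a sketch only; the buffer name, the u1 register, and the ComputeVoxelIndex helper are placeholders I made up, not code from this thread):

```hlsl
// Pixel shader writing into a UAV buffer instead of an RWTexture.
// The buffer is bound from C# with Graphics.SetRandomWriteTargets;
// the u1 register must match the index passed on the C# side.
RWStructuredBuffer<float4> _OutputBuffer : register(u1);

float4 frag(VertexToFragment input) : SV_Target
{
    float4 color = tex2D(_MainTex, input.uv);
    // hypothetical helper that flattens a world position to a buffer index
    uint index = ComputeVoxelIndex(input.wPos.xyz);
    _OutputBuffer[index] = color;
    return color;
}
```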
Thanks!
I am also wondering about this. Writing to an RWTexture2D in a pixel shader is possible; I have succeeded in doing that. But writing to an RWTexture3D doesn’t seem to work at all.
Shader "Custom/RenderToVolume"
{
    Properties
    {
        _MainTex ("Diffuse (RGBA)", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            Tags { "RenderType"="Opaque" }
            Cull Off ZWrite Off ZTest Always Fog { Mode Off }

            CGPROGRAM
            #pragma target 5.0
            #pragma vertex vert
            #pragma fragment frag
            #pragma exclude_renderers flash gles opengl
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            RWTexture3D<float4> volumeTex;  // intended to be bound via Graphics.SetRandomWriteTargets
            float volumeResolution;         // voxels per axis
            float4 volumeParams;            // xyz = volume origin, w = volume extent

            struct ApplicationToVertex
            {
                float4 vertex : POSITION;
                float4 texcoord : TEXCOORD0;
            };

            struct VertexToFragment
            {
                float4 pos : SV_POSITION;
                float4 wPos : TEXCOORD0;
                float2 uv : TEXCOORD1;
            };

            void vert(ApplicationToVertex input, out VertexToFragment output)
            {
                output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
                output.wPos = mul(_Object2World, input.vertex);
                output.uv = input.texcoord.xy;
            }

            void frag(VertexToFragment input)
            {
                float4 color = tex2D(_MainTex, input.uv);
                // map world position into [0, volumeResolution) voxel space
                float3 volumePos = ((input.wPos.xyz - volumeParams.xyz) / volumeParams.w) * volumeResolution;
                int3 volumeCoords = int3(volumePos);
                volumeTex[volumeCoords] = color;
            }
            ENDCG
        }
    }
    Fallback Off
}
This is what I am trying to do, and it doesn’t work, but if I make it a 2D texture, it does. Not sure where I’m messing up.
So, I was able to contact Aras (Unity graphics dev), and he believes that writing to an RWTexture3D from a regular (non-compute) shader is not currently supported, but you CAN write to them from Compute Shaders. You can also write to ComputeBuffers from regular shaders. So I will write to a ComputeBuffer in my pixel shader, then use a Compute Shader to copy its contents into a Texture3D via RWTexture3D.
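The copy step of that workaround could look something like this (a sketch only; the kernel name, buffer layout, and the flattening formula are my own assumptions and must match however the pixel shader indexed the buffer):

```hlsl
// CopyToVolume.compute -- hypothetical helper, not from this thread.
// Copies colors accumulated in a StructuredBuffer (written by the
// pixel shader via a UAV) into a 3D texture.
#pragma kernel CopyToVolume

StructuredBuffer<float4> _VoxelBuffer;  // filled by the pixel shader
RWTexture3D<float4> _VolumeTex;         // destination volume
int _Resolution;                        // voxels per axis

[numthreads(8, 8, 8)]
void CopyToVolume(uint3 id : SV_DispatchThreadID)
{
    // flatten the 3D coordinate into the index scheme the writer used
    uint index = id.x + id.y * _Resolution + id.z * _Resolution * _Resolution;
    _VolumeTex[id] = _VoxelBuffer[index];
}
```

Dispatched from C# with ComputeShader.Dispatch(kernel, _Resolution / 8, _Resolution / 8, _Resolution / 8).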
Is there any new way of doing this by now, or do we still have to use the ComputeBuffer workaround?
EDIT: Happy to report that this now works, using Graphics.SetRandomWriteTarget(1, yourTexture); on the C# side and declaring RWTexture3D<float4> _YourTexture; in the pixel shader (I’m using #pragma target 5.0 under DirectX 11).
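For anyone finding this later, the C# setup can be sketched like so (assuming a recent Unity version; class and field names are placeholders, and the index 1 must match whatever UAV slot the shader uses):

```csharp
using UnityEngine;

public class VolumeWriteSetup : MonoBehaviour
{
    public Material volumeMaterial; // material using the volume-writing shader
    RenderTexture volumeTex;

    void Start()
    {
        volumeTex = new RenderTexture(64, 64, 0, RenderTextureFormat.ARGBFloat);
        volumeTex.dimension = UnityEngine.Rendering.TextureDimension.Tex3D;
        volumeTex.volumeDepth = 64;
        volumeTex.enableRandomWrite = true; // required for UAV access
        volumeTex.Create();

        // The index here must match the UAV the pixel shader writes to.
        Graphics.SetRandomWriteTarget(1, volumeTex);
    }

    void OnDisable()
    {
        Graphics.ClearRandomWriteTargets();
    }
}
```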