Hi guys, I’m trying to read from a render texture in a fragment shader but I’m having trouble getting the right texture coordinates from the fragment’s coordinates. I did a search but I couldn’t make much of the results I found. Thanks in advance.
This is my shader file:
Shader "Custom/FrontShader" {
Properties {
_VolTex ("Texture", 3D) = "" {}
_BackTex ("Texture", 2D) = "white" {}
}
SubShader {
Pass
{
Cull Back
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct vs_input
{
float4 vertex : POSITION;
};
struct ps_input
{
float4 pos : SV_POSITION;
float3 uv: TEXCOORD0;
float4 screenCoord : TEXCOORD1;
};
ps_input vert(vs_input v)
{
ps_input o;
o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
o.uv = v.vertex.xyz * 0.5 + 0.5;
o.screenCoord = ComputeScreenPos(o.pos);
return o;
}
sampler3D _VolTex;
sampler2D _BackTex;
float4 frag(ps_input i) : COLOR
{
float4 backCol = tex2D(_BackTex, i.screenCoord.xy);
return backCol;
}
ENDCG
}
}
FallBack "Diffuse"
}
Not sure if this helps, but screen/fragment coordinates are pixel coordinates, so they range from 0 up to the resolution's width and height. If you use them directly as UV coordinates, the texture will wrap hundreds or thousands of times, as if it were 1x1 pixel, so you need to scale them down into the 0..1 range. I'm not sure what the ComputeScreenPos function is doing, though. Also, you're sampling _BackTex with the screenCoord data and never using the uv coordinates you output from the vertex shader.
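Something along these lines is what I mean by scaling (just a rough sketch, not tested against your setup; I believe reading the pixel position through the VPOS semantic needs #pragma target 3.0 in Unity):

    float4 frag(float4 screenPos : VPOS) : COLOR
    {
        // _ScreenParams.xy is the render target size in pixels, so dividing
        // by it brings the raw pixel coordinates into the 0..1 range tex2D expects.
        float2 uv = screenPos.xy / _ScreenParams.xy;
        return tex2D(_BackTex, uv);
    }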
inline float4 ComputeScreenPos (float4 pos) {
    float4 o = pos * 0.5f;
    #if defined(UNITY_HALF_TEXEL_OFFSET)
    o.xy = float2(o.x, o.y*_ProjectionParams.x) + o.w * _ScreenParams.zw;
    #else
    o.xy = float2(o.x, o.y*_ProjectionParams.x) + o.w;
    #endif
    #if defined(SHADER_API_FLASH)
    o.xy *= unity_NPOTScale.xy;
    #endif
    o.zw = pos.zw;
    return o;
}
This is what ComputeScreenPos does; it's from UnityCG.cginc. I'm not sure what its output is supposed to be either, whether it's in clip coordinates or not, but I read that I should use it because Unity's matrices can be column-major or row-major depending on whether DirectX or OpenGL is used. I couldn't find any proper documentation for it. I tried converting the coordinates to texture space myself, but they were still wrong even when I forced the screen resolution to match the render texture's resolution.
I know, I need two sets of texture coordinates.
EDIT:
http://ru.unity3d-docs.com/Documentation/Components/SL-VertexFragmentShaderExamples.html
I finally managed to find this. It seems I just had to divide screenCoord.xy by screenCoord.w. It still doesn't work perfectly for screen resolutions that don't match the render texture resolution, but it will do for now.
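In case it helps anyone else, this is roughly what my fragment shader looks like now (just a sketch of the change; as far as I can tell, tex2Dproj should be equivalent to doing the divide manually):

    float4 frag(ps_input i) : COLOR
    {
        // ComputeScreenPos leaves xy in the 0..w range, so the perspective
        // divide by w has to happen here, per fragment.
        float2 uv = i.screenCoord.xy / i.screenCoord.w;
        float4 backCol = tex2D(_BackTex, uv);
        return backCol;

        // Alternatively, tex2Dproj does the divide internally:
        // float4 backCol = tex2Dproj(_BackTex, UNITY_PROJ_COORD(i.screenCoord));
    }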