Hi there,
I'm trying to write a CG shader for Unity, and I get strange results when trying to pass
the texture coordinates to the fragment shader.
This is my code:
Shader "TestCG_01" {
    Properties {
    }
    SubShader {
        Pass {
            Blend One Zero
            CGPROGRAM
            // Upgrade NOTE: excluded shader from Xbox360; has structs without semantics (struct v2f members texcoord,texcoord1,wnormal,wrefl,view,color)
            #pragma exclude_renderers xbox360
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 texcoord : TEXCOORD0;
                float3 normal : NORMAL;
                float4 vertex : POSITION;
            };

            struct v2f {
                float2 uvcoord1;
                float4 pos : SV_POSITION;
                float3 wnormal;
            };

            v2f vert (appdata v)
            {
                v2f o; // create the output struct
                o.wnormal = mul(_Object2World, float4(normalize(v.normal), 0)).xyz; // put the normal in world space and pass it to the pixel shader
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // put the position in clip space and pass it to the pixel shader
                //o.uvcoord1 = v.texcoord.xy; // weird!!! changes the value of wnormal passed along ???
                return o;
            }

            float4 frag (v2f i) : COLOR
            {
                float4 col = float4(1.0, 0, 0, 0);
                col.xyz = i.wnormal * 0.5 + 0.5;
                col.w = 1.0;
                return col;
            }
            ENDCG
        }
    }
}
Attached to the post are the two different results I get with the line "o.uvcoord1 = v.texcoord.xy;".
With the line commented out ("without_uv_passing.jpg"):

This is correct: I see the color of the normal in world space. Good.
And with the line active in the code ("with_uv.jpg"):

And this happens without changing the code of the fragment shader,
which leaves me a bit confused and disappointed.
The fragment shader is still setting the final color (based on the Blend One Zero)
from wnormal.
Are the UVs getting in the way of the color?
Are they changing my wnormal?
There is probably something major that I'm missing.
Any clue, anyone?
Please help me.
ciaoooooo
cp
Firstly, that doesn't look like a world-space normal to me. A world-space normal should be red/green/blue on the +X/Y/Z sides of the model and black on the negative-facing sides (or, since you've remapped yours to the 0-1 range, grey rather than fully black).
So that suggests to me that there’s something more broken there.
Is there a reason you're not using "appdata_base"? It does the same things you're trying to do, but it's known to work, so it might be a better idea to use it - at least at first, while you eliminate possible errors.
I'm not sure what difference it makes, but try putting the o.pos assignment on the first line below "v2f o;" - all shaders seem to have it as the first line. I'm not sure if any of Unity's matrices derive things from it.
The issue is probably that you are not using all of your v2f members in both vertex and fragment functions. Each function is compiled as a separate program, and optimized heavily. Everything that is not used gets taken out, but this process only looks at each function separately. The result is that if you ignore a v2f member in one function, and use it in the other, the v2f struct will not match in the compiled programs. This leads to data being misaligned, with seemingly innocuous changes in one function having drastic results in your final output.
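Roughly, the idea is this (a hypothetical sketch - the actual register allocation is up to the compiler):

```hlsl
// What the vertex program may compile against (uvcoord1 is written, so it is kept):
struct v2f {
    float4 pos : SV_POSITION;
    float2 uvcoord1;   // no semantic: the compiler assigns a register itself
    float3 wnormal;
};

// What the fragment program may compile against (uvcoord1 is never read, so it is stripped):
struct v2f {
    float4 pos : SV_POSITION;
    float3 wnormal;    // may now land in the register uvcoord1 occupied on the vertex side
};
```

Giving every member an explicit semantic (TEXCOORD0, TEXCOORD1, ...) pins each one to a fixed register, so both programs agree on the layout regardless of what gets optimized away.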
Hi,
Thanks for the help.
That's a world normal - it's just that the normals aren't interpolated if the scaling of the primitive
is an integer, as you can see here:

compared with this one, which looks more like a world normal to us:

(Of course the scaling is related to the sphere.)
And I'm not using "appdata_base" as the input structure because this is a simplification of my shader, which
in reality uses more variables and two sets of UV coordinates.
Of course I tried moving o.pos to every possible position, but nothing happened.
Many thanks anyway.
ciaoooooo
Hi, thanks for the help.
I tried using ALL the variables in the structs, and nothing changed:
Shader "TestCG_01" {
    Properties {
    }
    SubShader {
        Pass {
            Blend One Zero
            CGPROGRAM
            // Upgrade NOTE: excluded shader from Xbox360; has structs without semantics (struct v2f members texcoord,texcoord1,wnormal,wrefl,view,color)
            #pragma exclude_renderers xbox360
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 texcoord : TEXCOORD0;
                float3 normal : NORMAL;
                float4 vertex : POSITION;
            };

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uvcoord1;
                float3 wnormal;
            };

            v2f vert (appdata v)
            {
                v2f o; // create the output struct
                o.wnormal = mul(_Object2World, float4(v.normal, 0)).xyz; // put the normal in world space and pass it to the pixel shader
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // put the position in clip space and pass it to the pixel shader
                o.uvcoord1 = v.texcoord.xy; // weird!!! changes the value of wnormal passed along ???
                return o;
            }

            float4 frag (v2f i) : COLOR
            {
                float4 col = float4(1.0, 0, 0, 0);
                col.xyz = i.wnormal * 0.5 + 0.5;
                float2 tempuv = i.uvcoord1;
                float4 temppos = i.pos;
                col.w = 1.0;
                return col;
            }
            ENDCG
        }
    }
}
Any other ideas?
thanks.
ciaooooo
cb
SOLVED!
Despite what is stated here:
http://docs.unity3d.ru/Components/SL-V2Conversion.html
especially where it says
"…and remove bindings to individual TEXCOORDs as well…"
which I assumed was still valid nowadays, the problem was in the SEMANTICS of
the v2f structure.
I changed it as follows:
struct v2f {
    float4 pos : SV_POSITION;
    float4 uvcoord1 : TEXCOORD0;
    float3 wnormal : TEXCOORD1;
};
and everything worked perfectly.
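For reference, here is the full working shader - the listing above with the corrected v2f (I've kept uvcoord1 as float2 here so it matches the `v.texcoord.xy` assignment):

```hlsl
Shader "TestCG_01" {
    Properties {
    }
    SubShader {
        Pass {
            Blend One Zero
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 texcoord : TEXCOORD0;
                float3 normal : NORMAL;
                float4 vertex : POSITION;
            };

            // every interpolated member now carries an explicit semantic
            struct v2f {
                float4 pos : SV_POSITION;
                float2 uvcoord1 : TEXCOORD0;
                float3 wnormal : TEXCOORD1;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);                 // clip-space position
                o.wnormal = mul(_Object2World, float4(v.normal, 0)).xyz; // world-space normal
                o.uvcoord1 = v.texcoord.xy;                              // no longer corrupts wnormal
                return o;
            }

            float4 frag (v2f i) : COLOR
            {
                float4 col;
                col.xyz = i.wnormal * 0.5 + 0.5; // remap [-1,1] to [0,1] for display
                col.w = 1.0;
                return col;
            }
            ENDCG
        }
    }
}
```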
Thanks to everyone -
I hope this post can help others.
ciaoooooo
cp