I’m trying to write a shader where the screen-space position of the fragment affects the output. To test, I output the coordinates directly. In the editor and in standalone builds this renders yellow (1, 1, 0, 1), great, that’s expected.
However, in WebGL builds it looks like the coordinates are always zero (it always renders flat black). What’s going on? Do I need to do some additional steps in the fragment shader for WebGL?
My standalone build is using Direct3D 11, and the WebGL build is WebGL 1.0, which I believe corresponds to OpenGL ES 2.0.
Here is the shader (some commented-out code was removed, so there are some leftover properties):
Shader "WaterLAB/UI/Selection Box"
{
    Properties
    {
        _MainTex("Main Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Overlay" }
        LOD 100

        ZTest Always // Draw no matter what
        ZWrite Off
        Blend SrcAlpha OneMinusSrcAlpha // Alpha blending

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Blue channel is multiplied by -1 to prove the x input is
                // always 0; otherwise we would get a non-black color.
                return fixed4(i.vertex.xy, i.vertex.x * -1, 1);
            }
            ENDCG
        }
    }
    FallBack "Diffuse"
}
Edit: While typing this out I had a thought and set the Windows graphics API to OpenGL ES 2.0 so I could more closely “emulate” WebGL, and indeed it also renders black there too. So at least testing should be faster than doing full WebGL builds.
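Edit 2: For what it’s worth, one variant I may try is to not read the position semantic in the fragment shader at all, and instead pass the screen position through explicitly from the vertex shader via ComputeScreenPos (from UnityCG.cginc), doing the perspective divide in the fragment shader. Untested sketch of just the CGPROGRAM changes (note it also declares the clip-space output as SV_POSITION, which is what Unity’s shader templates use):

```hlsl
struct v2f
{
    float2 uv : TEXCOORD0;
    float4 screenPos : TEXCOORD1; // explicit screen position, interpolated
    float4 vertex : SV_POSITION;
};

v2f vert (appdata v)
{
    v2f o;
    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    // ComputeScreenPos returns a position to divide by w per-fragment
    o.screenPos = ComputeScreenPos(o.vertex);
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    // Perspective divide gives 0..1 screen-space UV; no reliance on
    // reading the position semantic in the fragment shader
    float2 screenUV = i.screenPos.xy / i.screenPos.w;
    return fixed4(screenUV, 0, 1);
}
```

If the problem is that GLES2/WebGL 1.0 doesn’t give me usable fragment coordinates through the position semantic, this should sidestep it entirely; the trade-off is the values come out as 0–1 UVs rather than pixel coordinates.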