I'm working on an ocean shader: vertices are currently displaced in a compute shader with a simple sine wave. I pass the resulting vertexPositions and triangleIndices buffers to an unlit shader. However, as soon as I add a terrain anywhere in my project, the color from the ocean's fragment shader also gets applied to the terrain. The shader code is really simple and I don't know what's wrong. Here is my code:
Shader "Unlit/OceanShader"
{
    Properties
    {
        // No properties needed yet
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma target 4.5 // StructuredBuffer access requires shader model 4.5
            #include "UnityCG.cginc"

            // Buffers filled by the compute shader
            StructuredBuffer<float3> vertexPositions;
            StructuredBuffer<int> triangleIndices;

            struct appdata
            {
                uint index : SV_VertexID; // index into triangleIndices
            };

            struct v2f
            {
                float4 vertex : SV_POSITION;
            };

            v2f vert(appdata v)
            {
                v2f o;
                uint vertexIndex = triangleIndices[v.index];
                float4 vertexPosition = float4(vertexPositions[vertexIndex], 1);
                o.vertex = UnityObjectToClipPos(vertexPosition);
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                return fixed4(0, 0, 0.5, 0.5);
            }
            ENDCG
        }
    }
}
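The compute shader itself is trivial; it just offsets each vertex with a sine wave, roughly like this (simplified sketch; the kernel and property names may differ slightly from my actual file):

```hlsl
// Simplified displacement kernel: a single sine wave along x.
// Assumes a flat base grid, so y can simply be overwritten each frame.
#pragma kernel Displace

RWStructuredBuffer<float3> vertexPositions;
float _WaveTime; // set from C# every frame

[numthreads(64, 1, 1)]
void Displace(uint3 id : SV_DispatchThreadID)
{
    float3 p = vertexPositions[id.x];
    p.y = sin(p.x + _WaveTime);
    vertexPositions[id.x] = p;
}
```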
Since the ocean doesn't have normals or lighting applied you can't see the waves on it, but they are there, just like the ones you can see on the terrain. Very important to note: this only happens in URP, not in a Built-in Render Pipeline project. To summarize: the waves are duplicated on the terrain at the same size as the ocean. Does anyone know why this happens? It doesn't happen to other objects, and I haven't applied this shader to the terrain. Could this have to do with me calling 'DrawProceduralNow' in OnRenderObject instead of using a custom Renderer Feature?
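For reference, the draw call looks roughly like this (simplified; the field names differ slightly in my project, and the compute kernel is dispatched earlier in Update):

```csharp
// Simplified version of my draw call: bind the buffers the unlit shader
// reads, select its pass, and issue an immediate-mode procedural draw.
void OnRenderObject()
{
    oceanMaterial.SetBuffer("vertexPositions", vertexPositionsBuffer);
    oceanMaterial.SetBuffer("triangleIndices", triangleIndicesBuffer);
    oceanMaterial.SetPass(0); // required before any DrawProceduralNow call
    Graphics.DrawProceduralNow(MeshTopology.Triangles, triangleIndexCount);
}
```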