So I wrote a surface shader system. It currently compiles shaders for Standard and URP, and HDRP should be working soon. I think it’s much cleaner than the old surface shader system, which has a lot of kookiness around screenPos, viewDir, normals, etc. It’s also modular, letting you do something like #include’s, but where an include can bring properties, cbuffer entries, and other stuff along with it.
As an example, here’s a basic shader with tessellation support:
BEGIN_OPTIONS
Tessellation "distance"
END_OPTIONS
BEGIN_PROPERTIES
_Albedo ("Albedo", 2D) = "white" {}
_Normal ("Normal", 2D) = "bump" {}
_Height ("Height Map", 2D) = "black" {}
_DisplacementAmount("Displacement Amount", Range(0,2)) = 0.5
_DisplacementMipBias("Displacement Mip Bias", Range(0,6)) = 2
_TessSubdiv("Tessellation Subdivisions", Range(2, 24)) = 8
_TessMinDistance("Tessellation Min Distance", Float) = 0
_TessMaxDistance("Tessellation Max Distance", Float) = 35
END_PROPERTIES
BEGIN_CBUFFER
float _DisplacementAmount;
float _DisplacementMipBias;
float _TessSubdiv;
float _TessMinDistance;
float _TessMaxDistance;
END_CBUFFER
BEGIN_CODE
sampler2D _Albedo;
sampler2D _Normal;
sampler2D _Height;
// (optional) modify the vertex post tessellation
void DisplaceVertex(inout VertexData v)
{
    v.vertex.xyz += v.normal * tex2Dlod(_Height, float4(v.texcoord0.xy, 0, _DisplacementMipBias)).g * _DisplacementAmount;
}
// (optional) if you are using tessellation and displacement, you can return
// the tessellation distance and subdivision here
float3 GetTessDistanceFactors()
{
    return float3(_TessMinDistance, _TessMaxDistance, _TessSubdiv);
}
void SurfaceFunction(inout LightingInputs o, ShaderData d)
{
    half4 c = tex2D(_Albedo, d.texcoord0.xy);
    o.Albedo = c.rgb;
    o.Alpha = c.a;
    o.Normal = UnpackNormal(tex2D(_Normal, d.texcoord0.xy));
}
END_CODE
This will compile to all three render pipelines and acts just like any other shader in your project. Note that you don’t write the v2f and other traditional structures; instead, the system uses a naming convention and constructs them for you based on whether you access that data. So, for instance, if you read d.TangentSpaceViewDir, it will be provided for you.
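For instance, a simple view-dependent effect might look like this (a sketch using the ShaderData fields documented below; the rim term here is my own illustration, not part of the system):

void SurfaceFunction(inout LightingInputs o, ShaderData d)
{
    // Merely reading d.TangentSpaceViewDir is what causes the system to
    // generate and fill that field; fields you never read are stripped.
    half rim = 1 - saturate(dot(d.TangentSpaceViewDir, half3(0, 0, 1)));
    o.Albedo = rim.xxx;
}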
Here is an example shader documenting the current data and options that are available (this will grow):
BEGIN_OPTIONS
// ShaderName "Path/ShaderName" // The default will just use the filename, but if you want to path/name your shader
// Tessellation "Distance" // automatic tessellation, distance, edge, phong
// Alpha "Blend" // use alpha blending?
// Fallback "Diffuse" // fallback shader
// CustomEditor "MyCustomEditor" // Custom Editor
// RenderType "Opaque" // render type
// Queue "Geometry+100" // forward rendering order
// Workflow "Metallic" // Specular or Metallic workflow, metallic is default
END_OPTIONS
// Put any properties you have between the begin/end property blocks
BEGIN_PROPERTIES
_Color ("Main Color", Color) = (0, 1, 0, 1)
END_PROPERTIES
// Any variables you want to have in the per material CBuffer go here.
BEGIN_CBUFFER
half4 _Color;
END_CBUFFER
// if you are writing a subshader, any defines that should be set on the main
// shader are defined here
BEGIN_DEFINES
END_DEFINES
// All code goes here
BEGIN_CODE
// (optional) if you want to modify any vertex data before it's processed,
// put it in the ModifyVertex function. The struct is:
// struct VertexData
// {
// float4 vertex : POSITION;
// float3 normal : NORMAL;
// float4 tangent : TANGENT;
// float4 texcoord0 : TEXCOORD0;
// float4 texcoord1 : TEXCOORD1;
// float4 texcoord2 : TEXCOORD2;
// float4 texcoord3 : TEXCOORD3;
// float4 vertexColor : COLOR;
// };
// (optional) modify the vertex
void ModifyVertex(inout VertexData v)
{
}
// (optional) modify the vertex post tessellation
void DisplaceVertex(inout VertexData v)
{
}
// (optional) if you are using automatic tessellation and displacement, you can return
// the tessellation distance and subdivision here
float3 GetTessDistanceFactors()
{
    float minDistance = 0;
    float maxDistance = 35;
    float subDiv = 12;
    return float3(minDistance, maxDistance, subDiv);
}
// (required) Write your surface function, filling out the inputs to the
// lighting equation. LightingInputs contains:
// struct LightingInputs
// {
// half3 Albedo;
// half3 Normal;
// half Smoothness;
// half Metallic; // only used in metallic workflow
// half3 Specular; // only used in specular workflow
// half Occlusion;
// half3 Emission;
// half Alpha;
// };
// The ShaderData struct contains common data you might want, precomputed
// for you. Note the system strips unused elements from the structures automatically,
// so there is no cost to unused stuff.
// struct ShaderData
// {
// float3 LocalSpacePosition;
// float3 LocalSpaceNormal;
// float3 LocalSpaceTangent;
// float3 WorldSpacePosition;
// float3 WorldSpaceNormal;
// float3 WorldSpaceTangent;
// float3 WorldSpaceViewDir;
// float3 TangentSpaceViewDir;
// float4 texcoord0;
// float4 texcoord1;
// float4 texcoord2;
// float4 texcoord3;
// float2 screenUV;
// float4 screenPos;
// float3x3 TBNMatrix;
// };
void SurfaceFunction(inout LightingInputs o, ShaderData d)
{
    o.Albedo = _Color.rgb;
    o.Alpha = _Color.a;
}
END_CODE
Note that even though I have automatic ways to do things like tessellation, you can still write your own versions of those functions, or add geometry shaders, compute buffer data, etc. There are far fewer restraints and assumptions than the old surface shader system had, and the naming conventions are clear about things like “what space is this in?”. There’s no funky “this will be in this space on Tuesdays, but on Wednesdays it will return NaN, and on Friday it will be whatever the value in o.pos is”, and you don’t have to do funky stuff to get at the TBN matrix; it’s just there, where you can access it. Crazy, right?
Shaders are also modular. One shader can include another, and it will bring in its properties, cbuffer entries, etc. Right now this is only handled via code, but it would be possible to expose this via a scriptable object, such that users could add “Snow” to an existing shader. Obviously there are limits to how far you can push this, but adding weather effects to existing shaders is a perfect example of something that has been an issue for a lot of games in the past.
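As a sketch of what that composition could look like (the directive name here is purely hypothetical, not actual syntax):

BEGIN_OPTIONS
    // hypothetical: pull in the snow shader's properties, cbuffer
    // entries, and code, layered on top of this shader's output
    Stackable "Snow"
END_OPTIONS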
So, the main benefits:
- Simple, consistent way to write shaders
- Write once, run on standard, URP, or HDRP
- Shaders automatically upgrade between SRP versions (*assuming I have done the support for them)
- Can write features as separate shaders, then plug them together.
So, all that said, it would be interesting to know some of the following, assuming this is something that interests you:
- Which SRP do you use, if any?
- What use cases do you write custom or surface shaders for?
- What features did you find limiting in Surface Shaders that you would want to have control over?
- Which unique shading features do you use in your SRP (e.g. bent normals, SSS), and how would you like fallbacks handled in other pipelines (approximate the SSS in URP, don’t bother, etc.)?
I also have a lot of different thoughts about how to sell this. I will most likely move MicroSplat over to this system instead of its current render adapters and provide an upgrade path, since maintaining systems like these is a huge potential cost. Last year, supporting SRPs took about half of my development time, so having only one system to abstract these issues makes a ton of sense.
Anyway, thoughts welcome.