Deferred shader throws out needed UVs

Ok, I hope this isn’t a double post, because my last one seemed to disappear into the ether.

I am writing a shader that uses 4 pairs of diffuse/normal maps. In an effort to reuse UV coordinates and reduce the number of TEXCOORD interpolators in my shader, I have each normal map use the same UV as its paired diffuse map. For example:

struct Input
{
float2 uv_MainTex;
float2 uv_Diffuse2;
float2 uv_Diffuse3;
float2 uv_Diffuse4;
};

half4 tex2 = tex2D(_Diffuse2, IN.uv_Diffuse2);
half4 normal2 = UnpackNormal(tex2D(_Normal2, IN.uv_Diffuse2));

This trick works fine in forward rendering, but breaks under deferred rendering. I think I know why. When the deferred renderer generates the “Normal/Depth” pass, it tries to optimize by stripping out unused inputs. It (erroneously, in my case) assumes that because the _Diffuse2 texture is unused in that pass, the IN.uv_Diffuse2 UV is also not needed. The result is that (under deferred) my diffuse textures all tile as expected, but the normal maps do not.

To test my theory, I reversed it so the diffuse textures used the normal map UVs. As expected, the diffuse textures no longer tiled, but the normal maps did.

Ideally, the compiler would recognize which UV interpolators are actually used, rather than (apparently) discarding them based on some naming convention.

Until then, are there any workarounds for this? One possibility might be a #define that indicates which pass is currently being compiled. I could then modify my code to do something like this:

struct Input
{
#ifdef MATERIAL_PASS
float2 uv_MainTex;
float2 uv_Diffuse2;
float2 uv_Diffuse3;
float2 uv_Diffuse4;
#endif
#ifdef NORMAL_DEPTH_PASS
float2 uv_NormalMap;
float2 uv_Normal2;
float2 uv_Normal3;
float2 uv_Normal4;
#endif
};

#ifdef MATERIAL_PASS
half4 tex2 = tex2D(_Diffuse2, IN.uv_Diffuse2);
half4 normal2 = UnpackNormal(tex2D(_Normal2, IN.uv_Diffuse2));
#endif
#ifdef NORMAL_DEPTH_PASS
half4 tex2 = tex2D(_Diffuse2, IN.uv_Normal2);
half4 normal2 = UnpackNormal(tex2D(_Normal2, IN.uv_Normal2));
#endif

Thoughts?

Just pinging this again in hopes someone has a workaround. The only other option I can think of is to reduce the number of textures to match the number of interpolators Unity allows. This seems silly, because my current version eliminates duplicate interpolators, thus improving performance and expanding functionality. :frowning:

Ping

Looks like this is not going to be fixed, so I have a workaround:

Create your own tile and offset values as custom shader parameters, leave the built-in tile/offset values (the ones that appear next to your texture slots) at their defaults of 1 and 0, and do the tile and/or offset computation per pixel in the pixel shader. This way, you only need to pass in a single UV coordinate for all of your diffuse/normal pairs (or two if you also use the 2nd UV set).

I hope that description is clear. It’s not ideal, but it works.
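Here is a rough sketch of the idea (the _Diffuse2, _Normal2, and _Tile2 names are just placeholders I made up; xy holds the scale and zw the offset, and a real shader would blend all four pairs):

  Shader "Example/PerPixelTiling" {
    Properties {
      _Diffuse2 ("Diffuse 2", 2D) = "white" {}
      _Normal2 ("Normal 2", 2D) = "bump" {}
      // Custom tile/offset; leave the built-in values on the
      // texture slots at their defaults of 1 and 0.
      _Tile2 ("Tile/Offset 2", Vector) = (1,1,0,0)
    }
    SubShader {
      Tags { "RenderType" = "Opaque" }
      CGPROGRAM
      #pragma surface surf Lambert

      sampler2D _Diffuse2;
      sampler2D _Normal2;
      float4 _Tile2; // xy = tile, zw = offset

      struct Input {
        float2 uv_Diffuse2; // the single interpolator every pair shares
      };

      void surf (Input IN, inout SurfaceOutput o) {
        // Apply tiling/offset per pixel, so every pass that samples
        // these textures sees the same (correctly tiled) UV.
        float2 uv2 = IN.uv_Diffuse2 * _Tile2.xy + _Tile2.zw;
        o.Albedo = tex2D (_Diffuse2, uv2).rgb;
        o.Normal = UnpackNormal (tex2D (_Normal2, uv2));
      }
      ENDCG
    }
    Fallback "Diffuse"
  }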

For me it is even stranger… Multiple UV sets in my shader are not working even in forward mode.
I am trying the shader below, and it behaves as if all the samplers use the lightmap UV set, even though I assigned 3 different UV sets in 3DSMAX.

  Shader "MyStuffs/BumpDiffuseLightMap" {
    Properties {
      _MainTex ("Texture", 2D) = "white" {}
      _LightMap ("LightMap", 2D) = "gray" {}
      _BumpMap ("Bumpmap", 2D) = "bump" {}      
    }
    SubShader {
      Tags { "RenderType" = "Opaque" }
      CGPROGRAM
      #pragma surface surf Lambert
	  
      struct Input {
          float2 uv_MainTex;
	  float2 uv_LightMap;
          float2 uv_BumpMap;		  
      };
      sampler2D _MainTex;      
      sampler2D _LightMap;
      sampler2D _BumpMap;
	  
      void surf (Input IN, inout SurfaceOutput o) {
          o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;      
	  o.Albedo *= tex2D (_LightMap, IN.uv_LightMap).rgb * 4;
          o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
      }
      ENDCG
    } 
    Fallback "Diffuse"
  }

I am confused…
Am I doing something wrong?

First, the bad news: Unity only allows you to use 2 UV sets total. You can then use the texture scale and offset values to modify the values given in the two sets.

Your Input struct is almost right, but all of your textures will be driven by UV set 1. To specify UV set 2, replace uv_ with uv2_ for whichever inputs need to use the second UV set. For example, “float2 uv_LightMap;” would become “float2 uv2_LightMap;”

That should get you using UV set 2, but sadly there is no support for a third set.
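Applied to the shader you posted, only the lightmap lines change (this assumes the lightmap is the texture you authored on the second UV channel):

      struct Input {
          float2 uv_MainTex;    // UV set 1
          float2 uv2_LightMap;  // uv2_ prefix selects UV set 2
          float2 uv_BumpMap;    // also UV set 1
      };

      void surf (Input IN, inout SurfaceOutput o) {
          o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
          o.Albedo *= tex2D (_LightMap, IN.uv2_LightMap).rgb * 4;
          o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
      }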

Ok, great.
Thank you, BStringham