Combining textures on iPhone?

I’m rendering a player-customizable object with three texture “layers”, à la Photoshop, with each layer lerp-blended based on its alpha value. Each layer has a tint color.

I’m seeing serious performance problems and am looking for solutions. One obvious one would be to have a single texture and combine all the layers into it each time the player changes something. Unfortunately, Unity iPhone doesn’t support render-to-texture (RTT).

So the options I can see are:

  • Write a CPU-based texture combiner as a native function
  • Write an RTT, GPU-based texture combiner. (Is it possible to pass texture IDs between Unity iPhone and a native plugin?)
  • Use GetPixels/SetPixels and code a C# texture combiner, as in the sketch below. (Probably waaaay too slow)
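
For reference, a minimal sketch of what option 3 could look like (all names here are illustrative, and it assumes readable textures of equal size):

using UnityEngine;

public static class TextureCombiner
{
    // Lerp-blends each tinted layer over the base texture, using the layer's
    // alpha as the blend factor, then uploads the result.
    public static Texture2D Combine(Texture2D baseTex, Texture2D[] layers, Color[] tints)
    {
        Color[] result = baseTex.GetPixels();

        for (int i = 0; i < layers.Length; i++)
        {
            Color[] layer = layers[i].GetPixels();
            for (int p = 0; p < result.Length; p++)
            {
                Color tinted = layer[p] * tints[i];
                result[p] = Color.Lerp(result[p], tinted, layer[p].a);
            }
        }

        Texture2D combined = new Texture2D(baseTex.width, baseTex.height);
        combined.SetPixels(result);
        combined.Apply(); // the re-upload here is the expensive part on device
        return combined;
    }
}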

What say you, sirs and madams?

Unity iPhone 1.x has no RTT support at all, as you realized, and no, you can’t pass texture IDs out to C code to handle it externally.

All the options will internally end up equivalent to option 3, where the problem is not the set and get but Apply(), which re-uploads the texture to the GPU, etc.

OK, thanks, dreamora. So I guess I’m left with optimizing the shader. I’ve got it down to two passes, with a method that works great under iPhone (MBXlite) emulation in the Editor but fails when deployed:

Shader "LayerShader" 
{
	Properties 
	{
		_Layer1 ("Layer 1", 2D) = "white" {}
		_Layer2 ("Layer 2", 2D) = "white" {}
	}
	SubShader 
	{
		ZWrite On

		Pass {
			Material { Diffuse (0, 0, 0, 0) Ambient (0, 0, 0, 0) Emission (1, 0, 0, 1) }

			ZTest Less
			Lighting On
			Blend Off
			SetTexture[_Layer1] { constantColor (0, 1, 0, 1) combine constant lerp (texture) primary }
			SetTexture[_Layer2] { constantColor (0, 0, 1, 1) combine constant lerp (texture) previous }
		}

		// Second pass that handles lighting etc
	}
}

This is a cleaned-up version to illustrate the point. What seems to happen when the shader runs on the iPhone is that it can’t handle a different constantColor for each SetTexture, so I get the first constantColor (bright blue in this case) for both texture stages. Does anyone know whether this is supposed to work?

From my experience with the fixed-function pipeline, I would assume that setting the color for the pass actually pushes it into the geometry, as you cannot recolor the pixels as they go through the pipeline.

So if you need different ones, you would, to my understanding, have to handle them in distinct passes so that you get distinct vertex colors. (That’s what passes technically are anyway: re-rendering the mesh with the new state.)

Evidently it’s supposed to work, since you’re allowed to set a constantColor per SetTexture, and it renders correctly in the editor. It compiles without complaint when deploying.

What I’m asking is whether the fact that it doesn’t work on iPhone is a bug or a feature.

I don’t know what your trouble is. I just tested and two ConstantColors are combining just fine. (Mind you, I’m using Unity 3 beta 3 and I only have my iPhone 3GS here to test with.)

The full shader would be appreciated; I can’t even tell what you’re doing from what’s above. (I assume you’re multiplying the constants in later? And you only want lighting on part of the model?)

Also, are you sure there’s no way to use vertex colors to do your multiplication? You won’t be able to blend three textures with vertex alpha - only two - but if your transitions can be modeled instead of textured, I can see this happening in one pass on the newer devices and two on the old.

I’m targeting 3G. It may not be a problem on newer phones.

I guess the shader I posted was a bit obscure. Here’s what I’m doing: I’m blending three different solid colors in one pass by lerping the first two in the first SetTexture, then lerping the third with the resulting color. There’s also a second pass which handles shading, which I didn’t post because it’s not relevant to the problem.

So, two of the colors are represented by constantColors, and the third one is “primary”, i.e. the diffuse shading result, which in this case equals _Color0 because I set Emission to _Color0 and Diffuse and Ambient to zero.
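
In C# terms, those two stages are just a pair of lerps. (This is purely illustrative; the class and parameter names are made up to mirror the shader properties.)

using UnityEngine;

public static class LayerBlendMath
{
    // "constant lerp (texture) primary" means Color.Lerp(primary, constant, texture.a)
    public static Color Blend(Color color0, Color color1, Color color2,
                              float layer1Alpha, float layer2Alpha)
    {
        Color stage1 = Color.Lerp(color0, color1, layer1Alpha); // first SetTexture
        return Color.Lerp(stage1, color2, layer2Alpha);         // second SetTexture
    }
}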

Shader "LayerShader"
{
   Properties
   {
      _Color0 ("Color 0", Color) = (1,0,0,1)
      _Color1 ("Color 1", Color) = (0,1,0,1)
      _Color2 ("Color 2", Color) = (0,0,1,1)
      _Layer1 ("Layer 1", 2D) = "white" {}
      _Layer2 ("Layer 2", 2D) = "white" {}
   }
   SubShader
   {
      ZWrite On

      Pass 
      {
         Material 
         { 
            Diffuse (0, 0, 0, 0) 
            Ambient (0, 0, 0, 0) 
            Emission [_Color0] 
         }
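         // With Diffuse and Ambient zeroed, the lit "primary" color is just Emission = _Color0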

         ZTest Less
         Lighting On
         Blend Off

         SetTexture[_Layer1] 
         { 
            constantColor (0, 1, 0, 1) 
            combine constant lerp (texture) primary 
         }

         SetTexture[_Layer2] 
         { 
            constantColor (0, 0, 1, 1) 
            combine constant lerp (texture) previous 
         }
      }

      // Second pass that handles lighting etc
   }
}

Yep, you’re right. I tested it on my iPod touch 2G, and only one of the constantColors is respected. It’s not the first one, though; it’s the last one. You can leave off one of the constantColor commands entirely and just use “constant” by itself, and you’ll get the same result. (This is not currently emulated in the 3.0 Editor.)

Do you want to log a bug, or should I? I can’t do it until this afternoon. You can submit the .unitypackage below yourself if you want. Otherwise I’ll do it in about 8 hours. I doubt that this is fixable on the device, but it should be stated in the documentation, and the graphics emulation should take it into account.

That’s not efficient, though. You could just use Color [_Color0] (with Lighting Off) instead of a Material block.

Regardless, this shader is not good for you on your target devices. Can you answer the last paragraph from my previous post? (You also didn’t answer the question about multiplying; you described what your pass does, but that was clear from the code.) And if that’s an option, actually post your shader? We may be able to come to a solution.

345028–12057–$constantcolor_bug_185.unitypackage (8.78 KB)

Reported as Case 365290.

Jessy: thanks for reporting the bug, and thanks for the heads-up on the Color statement! Using vertex alpha for blending is unfortunately not an option, because the mesh is quite low-poly; all the detail is in the textures.

I also think this is just an undocumented limitation of the older iPhones, so I ended up going “well, if I’m doing three passes, I may as well blend four layers instead of three and get more bang.”

So this is the final (complete) shader, which blends four solid colors using three alpha-maps, and then applies a grayscale diffuse map and shading:

Shader "LayerShader" 
{
    Properties 
    {
        _DiffuseMap ("DiffuseMap", 2D) = "white" {}
        _Layer1 ("Layer 1", 2D) = "white" {}
        _Layer2 ("Layer 2", 2D) = "white" {}
        _Layer3 ("Layer 3", 2D) = "white" {}
        _Color0 ("Color 0", Color) = (1,0,0,1)
        _Color1 ("Color 1", Color) = (0,1,0,1)
        _Color2 ("Color 2", Color) = (0,0,1,1)
        _Color3 ("Color 3", Color) = (1,1,0,1)
    }
    SubShader 
    {
        ZWrite On

        Pass {
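            // Pass 1: _Color0 lerped toward _Color1 by _Layer1's alpha (result doubled)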
            Color [_Color0]

            ZTest Less
            Lighting Off
            Blend Off
            SetTexture[_Layer1] { combine primary }
            SetTexture[_Layer1] { constantColor [_Color1] combine constant lerp (texture) previous double }
        }

        Pass {
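            // Pass 2: _Color2 lerped toward _Color3 by _Layer3's alpha, then alpha-blended over pass 1 using _Layer2's alpha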
            Color [_Color2]

            ZTest Equal
            Lighting Off
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture[_Layer2] { combine primary, texture alpha }
            SetTexture[_Layer3] { constantColor [_Color3] combine constant lerp (texture) previous double }
        }

        Pass {
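            // Pass 3: multiplies lighting by the diffuse map's alpha channel; Blend DstColor SrcColor doubles the product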
            Material { Diffuse (1.6, 1.6, 1.6, 1) Ambient (1, 1, 1, 1) Emission (0, 0, 0, 0) }

            ZTest Equal
            Lighting On
            Blend DstColor SrcColor
            SetTexture[_DiffuseMap] { combine primary * texture alpha }
        }
    } 
    FallBack "Diffuse", 1
}

That looks like the blending would be unpredictable (in particular the whole second pass looks complicated); do you have some logic for how you’d actually paint the masks for that?

Aside from that, your colors don’t need an A component if the alpha isn’t going to be used; the alpha values in the Material block are especially useless here, because they have no effect - alpha there is always 1. Emission can be taken out if you’re actually going to leave it hard-coded black. Your first texture stage is unnecessary, also. The file’s extension is .shader, and it has a shader icon in the Editor, so I don’t see a need to put “Shader” into the name.

There’s no way I can see to avoid three passes for what you want to do. However, you could bring the number of textures down to two. I don’t know what you’re doing elsewhere; I assume you’re using the RGB components of these textures on other objects. (Otherwise, why would you be using a greyscale lightmap?) I’d probably recommend using three textures, not two, so that you could do this all in one pass on the newer GPUs. But without further knowledge of what you’re doing with the textures, I’d go this route, using the same SubShader for both 3GS+ and older, because memory is an issue on the old hardware.

346127–12068–$4_color_layers_360.shader (1001 Bytes)

Thanks for the heads-up on the superfluous texture stage.