Blending Textures on iPhone

I want to blend two or more textures in Unity, but since I found out that the Unity terrain system does not work on the iPhone, I have to do my texture blending some other way.

Here is a picture of my level in Unity with Unity terrain. I want to blend the debris texture over the road and sidewalk textures.

I’m not sure what’s the best way to do this, and I really want to make this level optimized. I could just make static tiles with different ground texture patterns on them, but then I would be using a lot more textures. Is there any way to get some kind of texture blending working on the iPhone using just simple 3D planes of some kind? Maybe some custom shader or something?

I’m really new to Unity and any info would help.

Thanks.

If you want to make it optimized, you can bake the texture in the modeller and apply it, write a pixel shader (and as such support 3GS+ devices only), or create a model for the debris which you overlay and where the debris itself fades out through vertex alpha.

I fear those are the options you have.

There is “simple texture blending” through ShaderLab, but it won’t help in a case where you require 3+ textures (base map, debris, and an alpha map that contains the mixing data), as that requires 2 passes, i.e. 2 draw calls of the mesh, making it totally unoptimized.
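To illustrate why the 3-texture case costs 2 draw calls: pre-3GS hardware only exposes 2 fixed-function texture stages, so the mix map and the debris have to go in a second pass. This is only a minimal sketch of that two-pass variant; the shader name and property names are made up for the example.

```shaderlab
Shader "Example/TwoPassSplat" {
	Properties {
		_Base ("Base (road/sidewalk)", 2D) = "white" {}
		_Debris ("Debris", 2D) = "black" {}
		_Mask ("Mix Map (alpha)", 2D) = "white" {}
	}
	SubShader {
		// Pass 1 = draw call 1: lay down the base texture
		Pass {
			SetTexture [_Base] { combine texture }
		}
		// Pass 2 = draw call 2: debris on top, masked by the mix map's alpha
		Pass {
			Blend SrcAlpha OneMinusSrcAlpha
			SetTexture [_Mask] { combine texture }
			// RGB from the debris texture, alpha carried over from the mask stage
			SetTexture [_Debris] { combine texture, previous }
		}
	}
}
```

The same mesh is rendered twice, which is exactly the overhead being warned about here.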

I think your best bet for optimization is to do all of the blending of textures in the modeling program and then bake it all in.

Thanks for the info guys.

With the alpha, are you talking about making a transparent fade around the edges of the model? I read that transparent stuff takes a lot of power to run on the iPhone and should be avoided if possible.

With alpha I indeed mean a fade to alpha 0 at the outer borders of that “splat overlay model”.

As for costing a lot of power: debatable. Alpha blending is faster than alpha cutout, for example.
The problem comes from fillrate limitation on the 4th gen devices and iPad, where overdraw can easily kill your whole performance. For that reason it might be best to have 2 meshes here: one for the completely filled area, which uses neither alpha nor a shader with alpha support, and a much more limited mesh that just covers the outer area and uses alpha there, so that the overlay part is as small as possible.

Or you focus on 3GS+ and use a pixel shader, in which case you don’t have to worry about this (but potentially have to create a shader).

I’m making some progress here with layered textures in Maya. I exported it as FBX, and you can see in this image that it transfers over to Unity. But I can’t find a shader that works with “dual UV diffuse”. I guess that’s all I need to make this work?

I used a particle shader called Alpha Blend and you can see that the two UV Sets are indeed there.

What you will need is either a second model below it with the base texture, or, to prevent z-fighting, to take the shader you used there (or better, its source) and add another pass in front of the one that does the blending, which does nothing but render the base texture “below that alpha layer”.

If you post shader questions in the Shaderlab forum I am more likely to see them.

Shader "Sprite/Decal" {
	Properties {
		_MainTex ("Background", 2D) = "white" {}
		_Decal ("Decal", 2D) = "black" {}
	}
	SubShader {
		Tags { "RenderType"="Opaque" }
		ColorMask RGB
		Pass {
			SetTexture [_MainTex] {
				combine texture
			}
			SetTexture [_Decal] {
				combine texture lerp (texture) previous
			}
		}
	}
}

Thanks, but that shader did not work with the setup I did.

In this image I made 2 UV sets, then painted the vertices in the alpha channel using the paint vertex color tool.
http://www.billyhallden.com/images/layered_texture01.jpg

If anyone is interested here is the tutorial I followed to create the layered texture in Maya
http://www.youtube.com/watch?v=sdI3ddGFYJg

I got it working in Unity using this shader by Jessy that I found in another post, but I have no idea if this works on the iPhone.

Shader "2-Part Vertex Blend" { 

Properties 
{ 
   _Color ("Main Color", Color) = (1,1,1,1) 
   _SpecColor ("Specular Color", Color) = (1,1,1,1) 
   _Shininess ("Shininess", Range (0,1) ) = 0.7 
   _Emission ("Emissive Color", Color) = (0,0,0,1) 
   _Texture1 ("Texture 1 (white Alpha)", 2D) = "" {} 
   _Texture2 ("Texture 2 (black Alpha)", 2D) = "" {} 
} 

SubShader 
{ 
   BindChannels 
   { 
      Bind "vertex", vertex 
      Bind "texcoord", texcoord 
      Bind "color", color 
      Bind "normal", normal 
   } 
    
   Pass{SetTexture[_Texture1] {Combine texture * primary} } 
    
   Pass 
   { 
      Blend One One 
       
      SetTexture[_Texture2] {Combine previous Lerp(primary) texture} 
      SetTexture[_] {Combine previous * one - primary} 
   } 
    
   // Same as using Double, with SeparateSpecular Off. 
   Pass 
   { 
      Blend DstColor SrcColor 
       
      Material 
      { 
         Ambient [_Color] 
         Diffuse [_Color] 
         Specular [_SpecColor] 
         Shininess [_Shininess] 
         Emission [_Emission] 
      } 
      Lighting On 
   } 
} 

}

The shader should work on iOS, but it’s actually very inefficient, as it will draw the mesh 3 times.

If you can, cut the colors used in there so you can get rid of the 3rd pass by combining it down into the first pass.

What’s interesting is that you used 2 UV sets, because in that case you would need to bind them for the second texture coordinate as well.

Here, I wrote this for an iPhone game I’m working on. It works on iOS perfectly, and from what I’ve gathered from testing it’s better than transparency.

Shader "BergerBytes/Two Texture Blend" {
Properties {
    _Blend ("Blend", Range (0, 1) ) = 0.5 
    _T1 ( "Texture 1", 2D ) = ""
    _T2 ( "Texture 2", 2D ) = ""
}
 
SubShader {
    Pass {
     
        SetTexture[_T1]
        SetTexture[_T2] { 
            ConstantColor (0.5,0.5,0.5, [_Blend]) 
            Combine texture Lerp(constant) previous
        } 
        
    }
}
 
Fallback "VertexLit"
}

-Mike Berger
-BergerBytes.net

Interesting approach, BergerBytes.
Actually, that one could be modified (getting rid of the constant color and adding an alpha equation to the combine).
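One way to read that suggestion (this is only a hedged sketch, not anyone’s posted shader; the name and properties are made up): drop `ConstantColor` and let the second texture’s own alpha channel drive the lerp instead of a material-wide slider.

```shaderlab
Shader "Example/TextureAlphaBlend" {
	Properties {
		_T1 ("Texture 1", 2D) = "white" {}
		_T2 ("Texture 2 (blend weight in its alpha)", 2D) = "black" {}
	}
	SubShader {
		Pass {
			SetTexture [_T1]
			// no ConstantColor: the second texture's alpha drives the lerp per-pixel
			SetTexture [_T2] { Combine texture Lerp(texture) previous }
		}
	}
}
```

The trade-off is that the blend becomes per-pixel (painted into the texture) rather than adjustable per-material.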

Don’t use ColorMask on iOS unless necessary; it’s actually slower than not using it.

Really? :face_with_spiral_eyes: We sure do have a similar style, then. There’s no point in a Fallback when you have a shader like that on the iPhone.

I don’t think you need all of that. It looks like you just need what Daniel Brauer gave you, except using verts instead of a texture to blend. This was written for 100% separation between textures, but works just fine for gradated blending too.

So which of these two would be best for the iPhone?

Daniel’s

Shader "Sprite/Decal" {
	Properties {
		_MainTex ("Background", 2D) = "white" {}
		_Decal ("Decal", 2D) = "black" {}
	}
	SubShader {
		Tags { "RenderType"="Opaque" }
		ColorMask RGB
		Pass {
			SetTexture [_MainTex] {
				combine texture
			}
			SetTexture [_Decal] {
				combine texture lerp (texture) previous
			}
		}
	}
}

Jessy’s

Shader "Separated by Vertex Alpha" {

Properties {
	_MainTex ("Texture 1  (white vertex A)", 2D) = "" {}
	_Texture2 ("Texture 2  (black vertex A)", 2D) = "" {}
}

SubShader {
	BindChannels {
		Bind "vertex", vertex
		Bind "color", color
		Bind "texcoord", texcoord
	}
	
	Pass {
		SetTexture [_MainTex]		
		SetTexture [_Texture2] {Combine previous Lerp(primary) texture}
	}
}

}

The one Daniel posted does not work with my model at the moment, but the one Jessy posted does.

So I am still learning the finer points of shaders and performance optimization, so I may be way off here (or just missing something altogether), but what is the purpose of using the shader in the first place? Why not blend, then bake the texture in Maya, and just use the one single texture instead of trying to blend two with a shader? I have used this technique before in some testing I have done on the iPhone, and it worked pretty well. And it seems to me that one texture with no shaders would be more performant than two with a blending shader. I am just curious if you are trying to accomplish some other effect that I am not seeing, or whether my assumptions about shader performance on the iPhone are just way off.

I could do static texture tiles for street and terrain, but then I would have to do a whole system of terrain tiles, and that would be a lot more textures.

Then again, blending like we are talking about here means more tile models with layered textures, or something else. That was one of the questions in my original post.

That’s because you’re blending with vertex alpha and not texture alpha.

Vertex colors have benefits and drawbacks. Vertex colors are resolution-independent, so they won’t get fuzzier when you get close to them, like textures do. They also won’t make the RGB of a compressed texture look worse, like using alpha in a texture will. However, they’re more of a pain to deal with; good painting tools like Photoshop don’t exist, to my knowledge, for vertex coloring. Also, in your case, you’re not using a lot of polys, but if you needed more intricate shapes, a grid would be a very bad idea, increasing the poly count dramatically in order to achieve the necessary detail. So then you have to model shapes into a plane, which is a nightmare in terms of dealing with topology, again, using the tools I know of.

The iPhone is limited in both its ability to process polygons and its memory. In particular, the old models are very limited in memory. So there’s a balance to be struck. Your solution seems good for what you’ve shown already, but I don’t know where you’re going with it.

I just wanted to take the opportunity to thank Jessy for all the great work he puts in to the Unity dev community. If you want to learn anything about shaders then you MUST check out Jessy’s excellent ShaderLab video tutorials here: http://www.youtube.com/user/JessyUV#p/c/31F6A116DCCC9F61

So, thanks Jessy, you’ve saved me countless hours of frustration. Cheers!

A bit of a thread hijack there :p, but you’re welcome! :slight_smile: