shader problem - affecting other materials..?!

So, a while back I got some help with a transparent water shader (without refraction/reflection - indie license for now).
It works well, and since it uses a gradient for the alpha it looks pretty good for what it is. It also lets me use some old-school tricks, like placing objects below the water to fake reflectivity and so on.

Anyway, the weird part is that the shader affects objects other than the one it’s applied to. If I have an object with an alpha material (diffuse+alpha) above the water plane, it too becomes fully transparent at certain angles. It doesn’t follow the fade-in/out caused by the fresnel gradient in the water material either; it just switches on/off.

If I disable the water shader or switch it to the standard (indie) water material, the problem goes away. So it seems this shader is affecting more of the drawing than just the water plane.

I have no experience with writing shaders, so I can’t really pinpoint the problem myself.
Any help would be much appreciated!

Here’s part of the code; I can post all of it if need be:

// -----------------------------------------------------------
// ARB fragment program

Subshader {
	Tags { "Queue" = "Transparent" }
	Blend SrcAlpha OneMinusSrcAlpha
	ColorMask RGB
	
	Pass {

CGPROGRAM
// profiles arbfp1
// vertex vert
// fragment frag
// fragmentoption ARB_precision_hint_fastest 
// fragmentoption ARB_fog_exp2

sampler2D _BumpMap : register(s0);
sampler2D _ColorControl : register(s1);

half4 frag( v2f i ) : COLOR
{
	half3 bump1 = tex2D( _BumpMap, i.bumpuv[0] ).rgb;
	half3 bump2 = tex2D( _BumpMap, i.bumpuv[1] ).rgb;
	half3 bump = bump1 + bump2 - 1;
	
	half fresnel = dot( i.viewDir, bump );
	half4 water = tex2D( _ColorControl, float2(fresnel,fresnel) );
	
	half4 col;
	col.rgb = lerp( water.rgb, _horizonColor.rgb, water.a);
	//col.a = water.a*2+0.2;
	return col;
}
ENDCG
		SetTexture [_BumpMap] {}
		SetTexture [_ColorControl] {}
	}
}

// -----------------------------------------------------------
// Radeon 9000

Subshader {
	Tags { "Queue" = "Transparent" }
	Blend SrcAlpha OneMinusSrcAlpha
	ColorMask RGB
	
	Pass {

CGPROGRAM
// vertex vert
// just define 'vert' as a vertex shader, the code is included
// from the section on top
ENDCG

		Program "" {
			SubProgram {
				Local 0, [_horizonColor]

"!!ATIfs1.0
StartConstants;
	CONSTANT c0 = program.local[0];
EndConstants;

StartPrelimPass;
	SampleMap r0, t0.str;
	SampleMap r1, t1.str;
	PassTexCoord r2, t2.str;
	
	ADD r1, r0.bias, r1.bias;	# bump = bump1 + bump2 - 1
	DOT3 r2, r1, r2;			# fresnel: dot (bump, viewer-pos)
EndPass;

StartOutputPass;
 	SampleMap r2, r2.str;

	LERP r0.rgb, r2.a, c0, r2;	# fade in reflection
	MOV r0.a, r2.a;
EndPass;
" 
}
}
		SetTexture [_BumpMap] {}
		SetTexture [_BumpMap] {}
		SetTexture [_ColorControl] {}
	}
}

Wild guess: I think what you’re seeing is not the water affecting the objects; it’s the water “occluding” the objects. The shader you’re using does write into the depth buffer, so it will occlude all objects that are “behind” the water (and that are drawn after the water is drawn).

I’d suggest adding

ZWrite Off

to the subshaders, for example right after the ColorMask RGB lines.
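Something like this - only the state setup at the top of each subshader changes, the pass itself stays as it is:

Subshader {
	Tags { "Queue" = "Transparent" }
	Blend SrcAlpha OneMinusSrcAlpha
	ColorMask RGB
	ZWrite Off	// do not write the water into the depth buffer

	Pass {
		// ... pass contents exactly as before ...
	}
}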

Thanks for the reply!
After reading a bit I did start to suspect that it had something to do with queuing.

However, is it really possible to solve the problem if it is related to Z-buffering? I do want the water to be able to occlude other objects - just in a correct manner. The problem I have is solely with objects using an alpha shader; ordinary diffuse materials aren’t affected.

The shader in question is actually the one posted by… I think it was you… in response to a question about transparent indie water. Since I have no experience whatsoever with shaders, I would be very thankful if you could give an example of how to insert that code, if it will in fact solve the problem.

I’ll give it a whirl, but I fear my computer will melt once I start tampering with the shader code… :slight_smile:

TIA!
Dan

Ok, so you want the shader to be transparent, but also to occlude objects that are below it? Those are somewhat conflicting goals :slight_smile:

Hehe - I want it to be partially transparent, which it is. :slight_smile: However, I don’t want it to affect the transparency of transparent objects that are in front of it - which it does. :wink:

(I have the waterplane using the shader, and a mesh plane above it with a diffuse+alpha material - that mesh/material is being affected by the water shader in some way).

I’m guessing it’s related to which order the objects are being drawn in, but I have no idea how I can affect that…?

Oh, forgot to mention:
I think I added ZWrite Off in the right places, but it made no difference… :cry:

Ok, here’s the hard part (and the reason why the built-in shaders are not partially transparent by default): it’s hard to get transparent objects “right”. Ever noticed how games generally avoid transparent stuff, except for special effects? Yes, it has to do with sorting.

Transparent objects (the ones in the “transparent” render queue) are drawn starting from the furthest one. Currently the sorting is based on each object’s bounding box center; so if you have a small transparent object and a big transparent water plane, their center points could end up in any configuration, even if the small object is actually always “in front of” the water.

Now, “what to do?” is the hard question. Several possible ways:

  • Maybe you don’t need transparent objects besides the water? :roll: In some cases objects don’t need to be partly transparent and only need “cutout” transparency (think fences or tree leaves) - that case can be solved nicely.

  • Put the water in the transparent render queue and turn off ZWrite. It should not make other transparent objects invisible, but it can appear that other objects are “behind” the water when in fact they are in front of it (and vice versa). Position the objects/water so that these cases are minimized and just live with it. Quite a lot of even big AAA games do that. Most players can’t properly tell 2D from 3D anyway - would they really notice sorting issues?

  • Don’t put the water in the transparent render queue at all. This however can cause other artifacts - for example, if the water happens to be drawn first, you would see the background color or skybox through it instead of the ground underneath (a sketch of that tag change follows this list).
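For the last option, the only change would be the Queue tag at the top of both subshaders - something like this (which queue to use instead is really up to your scene; this is just an example):

	Tags { "Queue" = "Geometry" }	// drawn with the opaque objects, not sorted against other transparent stuff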

And of course, if you have a small scene that shows these problems, with instructions on when they appear - just send it via Report Bug.app and I’ll take a look.

Exactly the case! A large waterplane with transparent objects in front. This would also explain why the effect changes with the viewing angle around the global Y-axis.

I want the partially transparent objects in front. Mostly because they look nice. :wink:
The effect isn’t that annoying if the water shader is set right, so I might get away with it.

A final question - will I get around this problem when I upgrade to Pro, or does the Pro water introduce the same problem?

Thanks for your help - especially on a Sunday! :slight_smile:

After a bit of fiddling around, I solved another problem I had been having with the transparent water!

The shader as it was seemed to change the point used in the fresnel gradient based not only on the angle to the surface, but also on the distance. Zooming out would give the water a different look from the one up close. Since I’ll be changing the camera altitude a fair bit, this didn’t look pleasant.

By some guessing and trial and error, I changed the following line

half fresnel = dot(i.viewDir, bump );

to:

half fresnel = dot(normalize(i.viewDir), bump );

It seemed i.viewDir was an unnormalized vector, which could be the reason the fresnel value changed along with the camera position. Normalizing it did the trick - although I’m not certain why; I would have thought it was normalized to start with, but I guess not.

Anyway, I now have a great-looking shader that behaves the same way no matter where the camera is (or how large the water plane is). :slight_smile:

That makes perfect sense - looks like a bug we missed.

I would recommend trying the normalization in the vertex program - so you normalize before setting the viewDir. That will perform a lot better!
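In the vert function it would be something along these lines (just a sketch - assuming viewDir is filled in from the ObjSpaceViewDir helper, like in the built-in water shader):

	// normalize once per vertex; the interpolated result is then used as-is in the fragment program
	o.viewDir.xzy = normalize( ObjSpaceViewDir(v.vertex) );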

I’m in deep water here (ha…ha… :wink: ).
I can’t seem to find where in the vertex program to do the operation to get a similar result:

v2f vert(appdata v)
{
	v2f o;
	float4 s;

	PositionFog( v.vertex, o.pos, o.fog );

	// scroll bump waves
	float4 temp;
	temp.xyzw = (v.vertex.xzxz + _Time.x * WaveSpeed.xyzw) * _WaveScale;
	o.bumpuv[0] = temp.xy * float2(.4, .45);
	o.bumpuv[1] = temp.wz;

	// object space view direction
	o.viewDir.xzy = normalize( ObjSpaceViewDir(v.vertex) );
	//o.viewDir = normalize(o.viewDir); // Tried this - didn't help.
	return o;
}

ENDCG

// -----------------------------------------------------------
// ARB fragment program

Subshader {
	Tags { "Queue" = "Transparent" }
	Blend SrcAlpha OneMinusSrcAlpha
	ColorMask RGB

	Pass {

CGPROGRAM
// profiles arbfp1
// vertex vert
// fragment frag
// fragmentoption ARB_precision_hint_fastest 
// fragmentoption ARB_fog_exp2

sampler2D _BumpMap : register(s0);
sampler2D _ColorControl : register(s1);

half4 frag( v2f i ) : COLOR
{
	half3 bump1 = tex2D( _BumpMap, i.bumpuv[0] ).rgb;
	half3 bump2 = tex2D( _BumpMap, i.bumpuv[1] ).rgb;
	half3 bump = bump1 + bump2 - 1;
	
	half fresnel = dot(normalize(i.viewDir), bump ); // new way
	//half fresnel = dot(i.viewDir, bump ); //old way
	half4 water = tex2D( _ColorControl, float2(fresnel,fresnel) ); 
	
	half4 col;
	col.rgb = lerp( water.rgb, _horizonColor.rgb, water.a );
	col.a = water.a;
	return col;
}
ENDCG
		SetTexture [_BumpMap] {}
		SetTexture [_ColorControl] {}
	}
}

Any hints? :slight_smile:
When I’m done with this I’ll probably understand shader programming as well… :smile:

It is normalized in the vertex shader, but not normalized in the fragment shader - and yes, it does change things a bit. Doing it in the fragment shader (like you just did) looks nicer but is slower, so there’s a tradeoff.

Hmm - it appears to already have been done in the vertex program.

This means that the sole difference between your version and the official one is that your version has slightly higher precision (at a performance penalty).

How’s your water mesh tessellated? You should not see a huge difference if the tessellation is “ok-ish”.

I’ve used the standard water mesh and scaled it up quite a bit, so I guess that could throw the shader off a bit. I didn’t actually think of that, as it’s been many moons since mesh density was last a factor for me in a 3D animation/rendering package (which is my native home :slight_smile: ).

So basically I should be able to switch back to the faster/rougher way - without the extra normalization in the fragment program - if I use a finer mesh?

Indeed - the better the mesh is, the smaller the difference.

Which brings us to the final, final question:
What incurs the bigger performance penalty?

  • a LARGE, rough mesh with the extra fragment program normalization
  • the same size but a much finer mesh (1-2k polys), with the official shader and no fragment viewDir normalization

Again, many thanks for your help!

That depends on what hardware you’re targeting. Normalizing only in the vertex shader makes the fresnel depend a bit on the viewer position, just like you noticed; and the larger the triangles in the mesh, the larger the difference.

A finer mesh means less difference, but more vertices to process (which is almost always a non-issue… except on some integrated graphics cards).

Normalizing in the fragment shader makes everything “correct” and is not an issue on most graphics cards (but then you have things like GeForce FX5200 or Intel GMA950… these are slow).

Edit: …a 1-2k triangle mesh is not very fine, I think, so go for that. I think the standard water mesh is around 700 triangles. If you had 20-30k triangles in the mesh, then it would be worth thinking about.

OK - so if I shoot for slower hardware (like Mac minis and MacBooks, which use the GMA950 IIRC) I would be better off with a slightly denser mesh and the shader as is, since those cards deal better with more triangles than with a more detailed shader?

I think I’ll have to make this an option in-game, as I still think the variant with normalization is slightly better… :slight_smile:

Resurrecting this old thread. :slight_smile:

Regarding the transparency sorting problems:

  • What if I know the exact order I want the semi-transparent objects to be rendered in? Can I force the sort order through code?

My problem is that I have my partially transparent water, and some semi-transparent objects above it. I know the water will always be “below” the other transparent objects. Currently, it still sometimes looks like that’s not the case (the sorting goes wrong because the objects are of very different sizes).

Can I force it?

Not easily… you could use custom shaders for the transparent objects that don’t use the “transparent” render queue, but the “overlay” render queue instead (docs).
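Roughly, only the Queue tag in the transparent object’s shader needs to change - a sketch (the rest of your diffuse+alpha shader stays as it is):

	Tags { "Queue" = "Overlay" }	// drawn after the whole Transparent queue, so after the water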

Alright! Will give it a whirl, even though it doesn’t sound like the perfect solution.

Thanks for your help! :slight_smile: