Hello, I don’t know if this is a ShaderLab question, but here it goes. I’m making 3D GUI elements and I use RenderTextures to get them on screen. The problem is that blending seems to work differently compared to normal rendering. You can see it in the attached picture: the left is a normal render and the right is an RTT render. You can see the clear colour through the sphere.
Does anyone have a solution to this? Maybe a different blending mode, or a way to change the render order?
I don’t think you’re seeing the clear colour through the sphere. What that looks like is fringing produced by a lower mipmap level on the font texture. Are both your render targets using the same pixel dimensions?
Also, those are terrible colours to subject people to. Purer or less garish colours are always appreciated in examples.
As Daniel Brauer says, this was very difficult to see, but just a quick thought - assuming you draw the RenderTexture to the screen using GUI.DrawTexture, try setting alphaBlend to false when drawing. Maybe the results will be a bit more similar then.
Hehe, sorry about the colours. I tried to simulate a problem I had in our game, but I failed horribly. So in the new shot you see the real problem: there are two tiny values rendered over the globe, and one blurry value which I made bigger. Through the alpha of that value you can see right through the globe. On the left you see the object rendered in the scene, and there the value does not make the globe see-through. I hope this gives you a better idea of my problem.
To tore tank: I need the alpha, as you can see in the new picture.
To Daniel: you were right, it wasn’t the clear screen showing through; the example was just bad.
Ah, I think I know what’s happening there. The difference between RenderTexture targets and the screen buffer is that for the screen, the alpha channel is only ever used by other shaders. Your RenderTexture, however, is being displayed either using a GUITexture or the GUI class, which means that the alpha channel is being used for blending. The green bits you see are actually the terrain behind the RenderTexture showing through the parts with low alpha.
There are two ways you can deal with this. One is to make sure that everything you render has alpha == 1, which is a tedious process and prevents you from using the alpha channel for anything else (like full screen glow). The other way is to put something right in front of the camera that renders nothing to the colour channels, but 1 to the alpha channel. This shader, on a simple plane right in front of the camera, will do the trick:
Shader "Alpha One" {
SubShader {
Tags {"Queue" = "Overlay+1"}
ColorMask A
ZTest Always
ZWrite Off
Pass {
Color (0, 0, 0, 1)
}
}
}
Actually, now that I think of it: an even easier solution would be not to use a RenderTexture at all. Since all you’re doing is drawing straight to the screen, you can use a regular camera with a higher depth and its clear flags set to Depth Only. Use the Normalized View Port Rect to make it draw only to the lower left-hand corner of the screen.
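If you’d rather set it up from script than in the Inspector, something along these lines should do it (just a sketch - the class name, depth and viewport rect values are placeholders):

using UnityEngine;

// Sketch: an overlay camera that draws the 3D GUI into the lower
// left-hand corner, on top of whatever the main camera has already rendered.
public class OverlayGUICamera : MonoBehaviour {
    void Start () {
        Camera cam = GetComponent<Camera>();
        cam.depth = 1;                             // higher than the main camera, so it renders after it
        cam.clearFlags = CameraClearFlags.Depth;   // "Depth Only": keep the colours already on screen
        cam.rect = new Rect(0f, 0f, 0.25f, 0.25f); // Normalized View Port Rect: lower left-hand corner
    }
}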
Thanks for the reply, I’ll go and try your suggestions.
About rendering with another camera: we tried it, but we need our GUI.DrawTexture calls to draw behind it, and I don’t think that is possible, am I right? If that is possible, everything would be solved.
@Niels: Ah, I understand. Right, my suggestion will be of no use to you, obviously
@Daniel Brauer:
I’m not entirely sure what you mean here - do you mean filling the entire render target with a == 1 (in which case no background will be visible), or only the rendered parts (such as the planet)? Unless I’m completely off here, I think the problem is that the alpha value of the “2” (or the other numbers) is blended onto the RenderTexture just as the colour values are, so where the alpha of the “2” is low, it will actually overwrite (or blend with) the existing alpha of the underlying planet. I think your approach could solve it if the planet were drawn at a later stage, writing 1 to the alpha channel only - as your shader does. That, or render the camera twice, but have the second pass write only to the alpha channel additively - which should have the same effect as a blend mode that does regular blending but chooses max(source alpha, dest alpha). I’m not sure if that already exists as a native blend mode.
I’m sorry if this is just what you already said; I just wanted to check that I understood your idea.
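Roughly, this is what I have in mind for that extra alpha-only additive pass (an untested sketch - the shader name, queue and _MainTex property are just placeholders for whatever the numbers actually use):

Shader "AlphaOnlyAdditive" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags {"Queue" = "Transparent+1"}
        ColorMask A     // only touch the alpha channel
        ZWrite Off
        Blend One One   // add the source alpha onto whatever alpha is already there
        Pass {
            SetTexture [_MainTex] { combine texture }
        }
    }
}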
@Yorick:
I’m more inclined to think the problem is that objects drawn on top of opaque objects overwrite (or replace) the alpha values beneath them. I could be wrong though.
I think we found a solution: it takes the destination alpha and writes it back, and it seems to help. Just render a plane in front of the camera, like Daniel did, but with this shader.
Shader "RTTPost"
{
SubShader
{
Blend DstAlpha SrcAlpha
Tags {"Queue" = "Overlay+1"}
ColorMask A
ZTest Always
ZWrite Off
Pass
{
Color (0, 0, 0, 1)
}
}
}
I’m now going to test it further. Consider it “fixed” for now.
Can you explain why this works? It looks to me like it just doubles the destination alpha, which would make the image more opaque, but not necessarily fully opaque in all the places you want.
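Working it through: with Blend DstAlpha SrcAlpha and the pass outputting Color (0, 0, 0, 1), the source alpha is 1, so the new alpha is srcAlpha * dstAlpha + dstAlpha * srcAlpha = dstAlpha + dstAlpha = 2 * dstAlpha.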
What I think it does is take the destination alpha (the alpha that’s already rendered), multiply it by the source alpha, which is one (so that changes absolutely nothing), and render that back to the texture. It looks like it’s reassuring “The Alpha” that he really is “The Alpha”.
Again I’m not fully sure if it solves it but it definitely looks that way. I will report back if it’s not true.
A good test would be to have some black text in front of a black sphere, and put the RenderTexture in front of a white background. I think the background will show through a bit at the edges of the text.
Yes, you are right, I miscalculated the blending. So it’s not fixed, but it does give a better output than we had. The bugged alpha ranges between 0.9 and 1, so that artefact disappears when you multiply it by two (it clamps to fully opaque). This result is good enough for us for now, but I would really like to solve this properly.
What do you think: is it a bug, just the way the hardware handles it, or are we doing something wrong?
I think it’s a bug, but unfortunately, if it were fixed you wouldn’t be any better off. I don’t think alpha blended textures should render to the alpha channel at all, which would mean that although your text would work fine when backed by the globe, it would be entirely transparent when drawn directly over the background.
I can’t actually think of a 100% correct way to do this, but at the moment even getting your geometry to render again in a later pass with the shader I originally supplied would reset its alpha.
I don’t think it is a bug, well, unless “missing feature” is considered a bug. I’m thinking something exposing the functionality of http://www.opengl.org/sdk/docs/man/xhtml/glBlendFuncSeparate.xml might do some good. Not that I ever tried that in “raw” OpenGL myself, but from the description it sounds like something useful in this case.
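Newer versions of ShaderLab do expose separate blend factors for the alpha channel as a comma-separated pair (I’m not sure whether that was available at the time of writing); the text shader’s blend line would then become something like:

Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha

i.e. the colour blends as usual, while the alpha accumulates with the standard “over” operator (srcA + dstA * (1 - srcA)), so transparent text would no longer punch holes in the planet’s alpha.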
I just mean the part about transparent things writing to the alpha channel. Especially in the case of GUIElements and the GUI classes, I don’t see how that could be useful, and there are a few cases where it makes things difficult.
In case I wasn’t clear, I think the initial alpha channel is wrong, even for the parts that have nothing behind them. This is because the alpha is effectively applied three times: once when blending the initial colour, once to itself as it is blended into the RT alpha channel, and then again when the RT is alpha blended as a GUITexture. The result is very strange, and difficult to control.
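To make that concrete (assuming the usual Blend SrcAlpha OneMinusSrcAlpha): take a fringe pixel of the text with alpha 0.5 drawn over a part of the globe whose RT alpha is already 1. The colour is blended with a factor of 0.5, the RT alpha becomes 0.5 * 0.5 + 1 * (1 - 0.5) = 0.75, and then the GUITexture blend uses that 0.75 again on screen, so a pixel that should read as fully opaque lets a quarter of the background through.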