It’s hard to make out from this example how exactly the normals blend together, but I assume they overlay like in the descriptions from the first posts - if so, then hell yeah, that’s what I wanted. Except without the albedo/spec/smoothness contribution (if that’s where the black fill is coming from), of course. How did you achieve that?
Yeah, I added the color on purpose since the effect is kinda hard to see :p.
Not sure if this is what you’re looking for, but here’s the code.
Shader "Custom/DecalProject" {
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _BumpMap ("Normalmap", 2D) = "bump" {}
        _BumpScale ("BumpScale", Float) = 1
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
        _Cutoff ("cutout", Float) = 0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // Physically based StandardSpecular lighting model, shadows on all light types
        #pragma surface surf StandardSpecular fullforwardshadows alpha
        #include "UnityCG.cginc"
        // Use shader model 3.0 target, to get nicer looking lighting
        #pragma target 3.0

        sampler2D _MainTex;
        sampler2D _BumpMap;
        sampler2D _CameraGBufferTexture2; // world-space normals from the deferred G-buffer

        struct Input {
            float2 uv_MainTex;
            float4 screenPos;
        };

        half _Glossiness;
        half _Metallic;
        fixed4 _Color;
        half _BumpScale, _Cutoff;

        void surf (Input IN, inout SurfaceOutputStandardSpecular o) {
            // Albedo comes from a texture tinted by color
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            fixed3 n = UnpackScaleNormal(tex2D(_BumpMap, IN.uv_MainTex), _BumpScale);
            // Sample the scene's normal G-buffer at this pixel's screen position
            fixed3 GN = tex2D(_CameraGBufferTexture2, IN.screenPos.xy / IN.screenPos.w).rgb;
            //clip(c.a - _Cutoff);
            o.Albedo = _Color;
            // Metallic and smoothness come from slider variables
            o.Specular = _Metallic;
            o.Smoothness = _Glossiness;
            //o.Normal = BlendNormals(n, GN);
            o.Normal = (GN + n);
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
Umm, sorry, I think I messed something up in the latest code; I’ll try to get the correct normal buffer for the normals.
EDIT: Okay fixed
Btw, just sample the existing G-buffer if you want to use the existing albedo/spec.
Edit :
On second thought, I don’t think there’s any easy way to do this except modifying the buffer directly.
@Reanimate_L
Okay, I’m not sure I understand some points fully:
- Why is this shader not suffering from GBuffer fetching issues? Every time I did it in a similar shader, half the time the GBuffer returned black, or mangled output, or some random texture. Is it because this shader is forward, not deferred?
- Why is the output of this shader not present in the debug modes of the Scene view? Is it because those modes are rendered before forward shaders?
- What is the purpose of the alpha attribute and clipping variable if you are not using them at all?
- I suspect that the normals are transformed incorrectly and/or added to the normal GBuffer incorrectly, because lighting on them is vastly different from lighting on the underlying surface, and a flat tangent-space normal is not actually lit like a flat surface with that shader. I also suspect that any blending operation you use over the normal GBuffer has to be wrapped in a normalize. The issue is probably this: this is a forward shader, so deferred rendering is not used at all, and the output struct expects normals in tangent-space format - so you actually need to transform the content of the second GBuffer into the tangent space of our surface before you can mix it with the unpacked normal and output it to the standard struct.
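To make the normalize/tangent-space point concrete, here’s a minimal sketch of the math I mean in plain Python (the function names and TBN parameters are mine, purely illustrative - this is not the shader code):

```python
import math

def normalize(v):
    # scale a vector back to unit length
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def blend_normals(n_surface, n_decal, weight):
    # weighted sum of two unit normals, renormalized afterwards
    mixed = [a * (1 - weight) + b * weight for a, b in zip(n_surface, n_decal)]
    return normalize(mixed)

def world_to_tangent(v_world, tangent, bitangent, normal):
    # project a world-space vector onto the surface's TBN basis - the
    # transform the second GBuffer's content would need before mixing
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return [dot(v_world, tangent), dot(v_world, bitangent), dot(v_world, normal)]
```

The key property is that the blended result stays unit length, so lighting on the decal matches lighting on the surrounding surface.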
- Even if I remove both textures and normal output and only output albedo from gbuffer0, spec from gbuffer1.xyz and smoothness from gbuffer1.w, there is still a slight discrepancy in the result: see how decal areas are slightly darker despite duplicating the gbuffer contents exactly. No idea how to fix this:
@Dolkar
As the example above is a forward shader, with all the associated performance implications and limitations, my question about the generated code is still relevant - if you have the time to answer it, please do.
The RenderType should stay Opaque. What that tag controls is not the blend mode, but when the material is rendered in the pipeline and what replacement shaders are used (for shadows, for example). What you need to do instead is change the blend mode directly by adding Blend SrcAlpha OneMinusSrcAlpha inside the pass. As long as the alpha output is set correctly, that should work.
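For reference, Blend SrcAlpha OneMinusSrcAlpha makes the hardware compute, per channel, src * srcAlpha + dst * (1 - srcAlpha) - effectively a lerp from whatever is already in the buffer toward the shader’s output. A tiny sketch of that arithmetic in plain Python (illustrative only, not shader code):

```python
def blend_src_alpha(src, dst, src_alpha):
    # Blend SrcAlpha OneMinusSrcAlpha: per-channel lerp driven by source alpha.
    # src_alpha == 0 keeps the buffer untouched, src_alpha == 1 fully replaces it.
    return [s * src_alpha + d * (1 - src_alpha) for s, d in zip(src, dst)]
```

So with an alpha of 0.25 the result sits 75% of the way toward the existing buffer value, which is why the alpha output has to be set correctly for the decal to fade out properly.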
Oh damn! Looks like vert/frag conversion is not required at all, as everything seems to work with a surface shader! Thanks a lot
I applied your advice, keeping RenderType Opaque while adding the Blend, but it still did not work - all output stayed completely blank for some bizarre reason. Then, by pure accident, I decided to remove the line multiplying the output to the emissive GBuffer (#3) by 0. And boom, everything started working!
I have absolutely no idea why, but tampering with output to emissive GBuffer seems to completely kill any output to normals, albedo and other buffers.
I have only one issue, though. Some of the UV islands never receive any normal mapping at all, as you can see from the screenshot above. I’m pretty dumbfounded by this. I double-checked the UVs and everything else - it’s all grabbed correctly, otherwise those perfect islands, like the circular seam, wouldn’t work. What can actually cause the normal output to stretch or become completely flat on some faces, while at the same time an albedo output using the very same uv_MainTex for its tex2D stays perfectly mapped?
Here is the current version of the shader:
http://hastebin.com/raw/kacakiremi
Okay, it seems to be fully working! I modified it to allow separately configured albedo, spec/roughness, normal and emission contribution intensity.
Here is full source:
http://hastebin.com/raw/itopeyakuq
I have a few questions left, though, so if @Dolkar , @Reanimate_L or anyone else has info on this, please share it:
- Where is ambient occlusion in the GBuffer?
- What exactly is GBuffer3 containing and what should I do with it in the finalgbuffer? I mean, I see that the documentation lists it as an “ARGB32 (non-HDR) or ARGBHalf (HDR) format: Emission + lighting + lightmaps + reflection probes buffer” that is “logarithmically encoded to provide greater dynamic range”, but I have no idea how to use it in the context of modulating the surface shader output. Multiplying it by values ranging from 0 to 1 has some extremely weird effects, for example drastically changing the visibility of albedo contribution, or making some of the albedo texture still visible even when albedo is completely killed in the finalgbuffer method.
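If the “logarithmically encoded” wording means what I think it does (this is a guess on my part - an exp2(-x) style encoding, not verified against Unity’s source), then multiplying the encoded buffer value by a 0-1 factor doesn’t fade the emission at all: it shifts it in log space, which would explain the drastic, non-intuitive effects. A sketch of that reasoning in plain Python:

```python
import math

def encode(emission):
    # hypothetical logarithmic encoding: brighter light -> smaller stored value
    return 2.0 ** (-emission)

def decode(stored):
    # inverse of the encoding above
    return -math.log2(stored)

# Multiplying the *encoded* value by k does not scale the decoded emission;
# it adds -log2(k) to it, so a factor k < 1 actually makes things brighter:
#   decode(encode(e) * k) == e - log2(k)
```

Under this assumption, “multiplying emission by values ranging from 0 to 1” in the finalgbuffer would push the decoded emission toward infinity as the factor approaches 0, not toward black.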
Actually, I’m a bit confused about the relationship between the surface output struct components and the four components the finalgbuffer function can modify. Can someone explain how the albedo, metalness and smoothness outputs interact, how the albedo texture somehow ends up visible when nothing but the emission variable is left visible in the finalgbuffer, and so on? Here are a few screenshots with different configurations of that shader.
Only the emission output is left:
Only the normal output is left:
Only the normal and emission outputs are left:
Only the specSmoothness output is left:
Only the specSmoothness and emission outputs are left:
All outputs are active (diffuse, specSmoothness, normal, emission):
The alpha and clipping attributes in my shader are unused; I just forgot to remove them.
sampler2D _CameraGBufferTexture0; // Diffuse RGB and Occlusion A
sampler2D _CameraGBufferTexture1; // Specular RGB and Roughness/Smoothness A
sampler2D _CameraGBufferTexture2; // World Normal RGB
uniform sampler2D _CameraReflectionsTexture; // Deferred reflection buffer
And it seems you already found the correct buffer blending.
The emission value is where any self illumination or ambient lighting is stored. It’s also reused after the creation of the gbuffers as the render destination for lights. Basically the reason you don’t see it in the shaders that have _CameraGBufferTexture# listed is because what would be #3 is what they’re rendering to.
Can someone recommend a solution for blending the normals correctly? As far as I can see, Blend SrcAlpha OneMinusSrcAlpha results in some weird non-normalized vectors appearing in the normal RT at alpha values between 0 and 1. Unfortunately, it’s not possible to get rid of them by making an alpha without gradients, as those vectors will still appear on some pixels due to texture filtering. Here are a few examples:
And here is an example with a completely flat tangent space normal, illustrating how the issue happens even when there is no difference between decal normal and underlying RT normal:
Just in relation to the CameraGBuffer3: it’s borked in HDR mode at the moment. Referencing it in HDR seems to just give the diffuse G-buffer texture (or something? From the documentation it seems like that’s how it’s meant to work in HDR, but I don’t see the logic behind that - it seems completely pointless, with no use cases), and Unity will spit an error at you if you try to use it with BuiltInRenderTextureType.xxx.
Hi
For the normal blend artifact, I was using DirectX’s slerp (spherical lerp) to blend, but that doesn’t seem to work anymore.
@Undertaker-Infinity
Err, both lerp and slerp would have worked perfectly well with my flat-normal example (second image), because a lerp between two identical vectors yields a correct vector across the whole 0-1 factor range. But the whole point is that I don’t have any information about the normal previously occupying that pixel in the GBuffer, so I have no control over blending beyond setting the alpha and the Blend mode. Am I missing something - is there a way to explicitly supply a blending function (ideally per-GBuffer), which would allow me to use lerp?
Well, regular alpha blending is basically a lerp based on the alpha. So you’re using it already.
That’s not what I’m seeing in the second screenshot. I’m outputting a flat normal there - shouldn’t the result of a straight lerp between the existing GBuffer normal and the new normal be valid at any alpha value? If I output, e.g., (0,1,0) and the existing pixel contains (0,1,0), then the result of the blending should be (0,1,0) even at intermediate alpha values. That’s not what I get at all - as you can see in the GBuffer view in the first screenshot, every pixel with alpha between 0 and 1 gets a very weird, completely invalid “gray” normal.
That definitely shouldn’t be happening. It’s a weighted sum… if both values are the same, then the result should be as well, regardless of the alpha value. If they are different, though, then yes, the result won’t be normalized. (0, 1, 0) * 0.5 + (1, 0, 0) * 0.5 = (0.5, 0.5, 0). But that shouldn’t be a problem because I’m pretty sure the deferred lighting shader normalizes the normal map input anyway.
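That weighted-sum claim is easy to sanity-check in a few lines of plain Python (illustrative only, not shader code):

```python
import math

def alpha_blend(src, dst, a):
    # what the fixed-function blend does per channel
    return [s * a + d * (1 - a) for s, d in zip(src, dst)]

vec_len = lambda v: math.sqrt(sum(x * x for x in v))

same = alpha_blend([0.0, 1.0, 0.0], [0.0, 1.0, 0.0], 0.3)  # identical inputs
diff = alpha_blend([0.0, 1.0, 0.0], [1.0, 0.0, 0.0], 0.5)  # differing inputs
```

With identical inputs, `same` stays (0,1,0) at any alpha; with differing inputs, `diff` comes out as (0.5, 0.5, 0), a direction that still points the right way but is no longer unit length - exactly the (0, 1, 0) * 0.5 + (1, 0, 0) * 0.5 example above.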
Could you make a quick image effect that displays the contents of the normal buffer to see what’s actually happening to them?
@Dolkar
Not sure if I need an image effect - the Scene view already has debug modes that show the contents of all the deferred RTs save for emission.
Hmm, I’m just noticing this now in a new environment lit with GI, but I have the same issue with very weird halos at alpha values of 0.99-0.01 in the albedo, spec and all other outputs too - it’s not just normals. It seems to be linked to the emission, or at least is at its worst with emission, because emission turns completely wrong the moment you attempt to touch it in a finalgbuffer function.
Here is the finalgbuffer code for context for the sliders in the next gifs:
Here is how emission blends:
Here is how normals blend (again, as a reminder: the same issue happens with a completely flat normal map, and in the normal map used in this example the alpha gradient starts only in flat areas, so the edge artifact cannot be coming from a difference in normals between source and destination):
Same deal with specSmoothness, although it’s harder to show in a low-resolution GIF.
Another strange quirk that might point someone to an answer: when the smoothness value I try to output is lower than the smoothness of the underlying pixel in the GBuffer, I’m unable to overwrite it at all - my output just fades to the background smoothness. Pretty weird. Maybe someone will recognize that as an issue specific to an incorrectly set up blend, or something.
Have you tried other blend modes? One OneMinusSrcAlpha for example?
That’s exactly the mode I’m using, Blend SrcAlpha OneMinusSrcAlpha.
Blend SrcAlpha OneMinusSrcAlpha is common Alpha blending.
Blend One OneMinusSrcAlpha is often used for premultiplied alpha blending. I suggested that mode since you are multiplying a lot of values inside the finalgbuffer function.
edit: but then again, I don’t fully understand that finalgbuffer function. I wouldn’t multiply normals - I would add them together and renormalize afterwards. Why it seems to work eludes me.
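For what it’s worth, the difference between the two modes boils down to this (plain Python sketch of the per-channel arithmetic, illustrative only):

```python
def straight_alpha(src, dst, a):
    # Blend SrcAlpha OneMinusSrcAlpha: the GPU scales src by alpha itself
    return [s * a + d * (1 - a) for s, d in zip(src, dst)]

def premultiplied(src_premul, dst, a):
    # Blend One OneMinusSrcAlpha: src is assumed to already carry its alpha
    return [s + d * (1 - a) for s, d in zip(src_premul, dst)]
```

The two give identical results when the source is pre-scaled by alpha beforehand. That is the relevance here: if the finalgbuffer function already multiplies its outputs by a fade factor, straight blending applies alpha a second time, while premultiplied blending takes the output as-is.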
Once our project has upgraded to 5.2, I would love to have another look at this decals stuff. Until then, I can only lurk and learn from your work.