Image Effect: Edge Detect Normals Colours [rel]

That’s not a problem with the image effect, it’s a problem with the two sided shader you’re using. Or, more specifically, it’s a problem with how Unity generates the depth normals texture when using the forward rendering path, and fixing it requires manually editing a built in editor shader.

Switching to deferred should fix the issue. Alternatively you can use a depth only outline (which this image effect doesn’t let you select; it’s the Sobel option in the original Unity image effect), or add a double sided pass to Internal-DepthNormalsTexture.shader.

I have tried to adapt Internal-DepthNormalsTexture.shader by adding “Cull Off” to the opaque subshader, but it did not help. I guess that might not be exactly what you meant.

But the deferred mode and Sobel option worked.

What render type is the two sided shader using?

Some are using opaque, others alpha-test.

If you’re directly modifying the shader in the editor folder, you have to delete your project’s shader cache for the change to take effect. If you instead add the modified copy of the shader to your Assets folder and set it as the override in your Graphics Settings, it should work with that change.
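For reference, the opaque part of that built in shader boils down to roughly the following once Cull Off is added (a trimmed sketch from memory with an illustrative shader name, not the verbatim file):

    Shader "Hidden/TwoSided-DepthNormalsTexture" {
        SubShader {
            Tags { "RenderType" = "Opaque" }
            Pass {
                Cull Off // render back faces into the depth normals texture too
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos : SV_POSITION;
                    float4 nz : TEXCOORD0;
                };

                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.nz.xyz = COMPUTE_VIEW_NORMAL; // view space normal
                    o.nz.w = COMPUTE_DEPTH_01;      // linear 0..1 depth
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target {
                    return EncodeDepthNormal(i.nz.w, i.nz.xyz);
                }
                ENDCG
            }
        }
    }

The real file also contains subshaders for the other RenderTypes (transparent cutout, the tree and grass types, etc.), so for the Graphics Settings override it’s safer to copy the whole original and add Cull Off to each pass rather than use only the opaque sketch above.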

Would it be possible to assign different colours to objects based on tag and/or layer?

I would just like to modify this effect shader to outline negative normal areas, i.e. to detect where normals change from positive to negative values.
I have tried to modify the CheckSame function in the EdgeDetect shader:

        half CheckSame(half4 center, half4 sample) {
            half2 centerNormal = center.xy;
            float centerDepth = DecodeFloatRG(center.zw);
            half2 sampleNormal = sample.xy;
            float sampleDepth = DecodeFloatRG(sample.zw);
           
            // difference in normals
            // do not bother decoding normals - there's no need here
            half2 diffNormal = abs(centerNormal - sampleNormal) * _Sensitivity.x;

            half ndot = dot(centerNormal, sampleNormal); // added: dot product of the sampled normals

            int isSameNormal = (diffNormal.x + diffNormal.y) < 0.1;
            // difference in depth
            float diffDepth = abs(centerDepth - sampleDepth) * _Sensitivity.y;
            // scale the required threshold by the distance
            int isSameDepth = diffDepth < 0.1 * centerDepth;
           
            // changed return: 1 when the dot product of the normals is positive,
            // 0 (outlined) where it goes negative
            return ndot > 0 ? 1.0 : 0.0;

            // original return:
            // 1 - if normals and depth are similar enough
            // 0 - otherwise
            //return isSameNormal * isSameDepth ? 1.0 : 0.0;
        }

But after these modifications the shader doesn’t show any outlines at all. Please give me a hint if you can.

What exactly are you trying to test for that the original could not? Also, you’re testing the encoded normals, which are values in the 0…1 range, so the dot product will basically always be greater than zero. You need to decode the view normals before you use them in a dot product, and even then it’s going to be fairly infrequent for that dot product to be negative.

    half3 centerNormal = DecodeViewNormalStereo(center);
    half3 sampleNormal = DecodeViewNormalStereo(sample);

Yes, that works great.
But I am missing the viewing direction - I don’t know how to get it at the moment.
Basically I need to check dot(viewDir, normal).
I guess it could be a matter of a simple calculation based on the texture UV: the view direction would point straight along the view axis in the middle of the screen (0.5, 0.5) and diverge towards the edges.
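Something like this unprojection is what I have in mind (just a guess, not tested), using unity_CameraInvProjection to turn the screen UV into a view space direction:

    // Just a guess, not tested: unproject the screen UV to get a
    // per pixel view space direction (the camera sits at the origin in view space).
    float3 ViewDirFromUV (float2 uv)
    {
        // screen UV (0..1) to clip space (-1..1), on the far plane
        float4 clip = float4(uv * 2.0 - 1.0, 1.0, 1.0);
        // unproject with the inverse camera projection matrix
        float4 viewPos = mul(unity_CameraInvProjection, clip);
        // note: may need a y flip on some platforms, and Unity's view space
        // looks down -Z, so check the sign before relying on the dot product
        return normalize(viewPos.xyz / viewPos.w);
    }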

Can you merge the outlines of two objects together with this?

@pmurph03 thanks a lot!!!
That’s exactly what I was looking for.
Small question - is it possible for the edges to fade with distance from the camera? (And a way to tweak that distance?)
thx!

Any way to control the thickness of the detected edges?

Is there a version that doesn’t require ImageEffects?
Also - antialiasing?

Is there any way to change the color and stack colors?

Is this still usable somehow in modern versions of Unity and on Universal Render Pipeline? “Unity Image Effects” no longer seems to exist.

The same base techniques are still usable, but organized in new ways:
https://alexanderameye.github.io/outlineshader.html

Does anyone know of a way to achieve this effect with HDRP?

I have two questions about this method. I want to use it in a VR project. The first problem is anti aliasing: the lines flicker too much and I could not find any solution for this. The second problem is that it adds lines everywhere, UI included. How can we restrict it to specific objects, maybe by assigning only certain layers? Otherwise it is working great. If these problems can be solved, it could be the best solution.

For HDRP you have to make this a custom post process.
https://docs.unity3d.com/Packages/com.unity.render-pipelines.high-definition@7.1/manual/Custom-Post-Process.html
The approach used above is the legacy Built In Rendering Path image effect method … because it’s from 6 years ago and predates even the current BIRP Post Processing Stack, which is broadly similar to what the HDRP uses.

Yeah … that’s going to be a hard nope.

Some VR games do use outline methods similar to this (the PC version of Echo Arena for example), but you have to use TAA at the same time to get clean outlines. And Unity’s TAA is … a bit rough to use with VR, especially as there’s no (easy) way to disable jitter which you don’t really want to have on for VR TAA.

Yep. That’s the explicit goal of this specific implementation. Everything gets an outline. You cannot avoid that. If you have UI elements that are rendering in the opaque queue, then they’re going to get outlines too. Again, you cannot avoid this, both because of how this particular effect is written and because of how Unity handles post processing.

If you want outlines that work well with MSAA, you cannot use a post processing effect like this that relies on depth and normals, because the camera depth and normals textures do not (and cannot) use MSAA. That’s not something Unity supports at all.

If you want just an outline, you might want to look into using old school inverted hull outlines. Like this:
https://github.com/chrisnolet/QuickOutline
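The core idea of the inverted hull approach is just drawing a second, slightly inflated copy of the mesh with front face culling, something along these lines (a bare bones sketch, not the code from that repo):

    Shader "Hidden/InvertedHullOutline" {
        Properties {
            _OutlineColor ("Outline Color", Color) = (0,0,0,1)
            _OutlineWidth ("Outline Width", Float) = 0.02
        }
        SubShader {
            Tags { "RenderType" = "Opaque" "Queue" = "Geometry+10" }
            Pass {
                Cull Front // only the back faces of the inflated hull stay visible
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                fixed4 _OutlineColor;
                float _OutlineWidth;

                float4 vert (appdata_base v) : SV_POSITION {
                    // push each vertex out along its normal in object space
                    v.vertex.xyz += v.normal * _OutlineWidth;
                    return UnityObjectToClipPos(v.vertex);
                }

                fixed4 frag () : SV_Target {
                    return _OutlineColor;
                }
                ENDCG
            }
        }
    }

Hard edged meshes break apart at the corners with this unless they have smoothed normals available, which is the kind of extra per mesh setup step QuickOutline handles for you.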

For Falcon Age I used a post processing based outline that works a bit like this asset:
https://github.com/cakeslice/Outline-Effect

For the Quest version of Falcon Age any kind of post processing was way, way too slow. The QuickOutline above probably would have worked, but I instead used a technique that renders each object 4 times, slightly offset in screen space, to produce an outline. This was more expensive than using an inverted hull, but still not as expensive as I feared it could be, and it works on any mesh without the kind of additional setup step QuickOutline needs. But I don’t have an example shader / project of that to share.
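The general idea though is just nudging the whole mesh by a pixel or two in clip space and drawing it in a flat colour once per offset direction, something like this (a rough reconstruction of the idea, not the actual shader, and it skips the stencil / depth handling needed to hide the copies behind the original object):

    Shader "Hidden/ScreenOffsetOutline" {
        Properties {
            _OutlineColor ("Outline Color", Color) = (0,0,0,1)
            _OutlineThickness ("Thickness (pixels)", Float) = 2
        }
        SubShader {
            Tags { "RenderType" = "Opaque" "Queue" = "Geometry+10" }
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                fixed4 _OutlineColor;
                float _OutlineThickness;
                float2 _ScreenOffset; // set per draw to (1,0), (-1,0), (0,1), (0,-1)

                float4 vert (appdata_base v) : SV_POSITION {
                    float4 clipPos = UnityObjectToClipPos(v.vertex);
                    // one pixel expressed in clip space, scaled by w so the
                    // offset stays a constant size on screen at any distance
                    float2 pixelToClip = 2.0 / _ScreenParams.xy;
                    clipPos.xy += _ScreenOffset * _OutlineThickness * pixelToClip * clipPos.w;
                    return clipPos;
                }

                fixed4 frag () : SV_Target {
                    return _OutlineColor;
                }
                ENDCG
            }
        }
    }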

These won’t do the interior edge lines though, only the exterior outline. The only way to do that nicely for MSAA is to draw bespoke “edge” lines as geometry, or bake it into custom per mesh textures.

Thanks for the answer. After trying them, I see that it is difficult to decide which method is suitable. We are working on a Quest game too, so I need to select the cheapest one. I will look at your other advice too.