Compute Thickness pass Feedback / Support

Hi!
In 2023.1, part of the focus for HDRP was improving transparents in general.

Among other tasks, HDRP now provides a fullscreen pass to compute the accumulated thickness for objects on a given LayerMask.

HDRP computes the optical path and the overlap count (i.e. the number of triangles traversed), which can be useful, for instance, for Subsurface Scattering or Refraction.

The overlap count can be used for flat or non-closed objects like vegetation.

There are a few limitations when mixing Transparent and Opaque objects, but it should work with any type of material.

This post is to centralize feedback and support for this specific feature and for transparency questions in general.

Documentation is coming, but in the meantime, here are a few tips for anyone interested in using it:

  • Compute Thickness needs to be enabled in the Default Frame Settings, under the Rendering foldout

  • Compute Thickness needs to be enabled on the current HDRP Asset

  • One or more layers need to be checked under the Compute Thickness properties in the HDRP asset

  • One or more objects have to be in one of those selected layers for the pass to be filled with any meaningful data

  • Lastly, the result of the thickness pass is accessible via the HD Sample Buffer node in Shader Graph. The node requires the layer index as input.

Default Frame Settings

HDRP Asset

Layer index

HD Sample Buffer node

The use cases:

  • More accurate Refraction and Absorption without having to author a thickness map.

  • Subsurface Scattering / Translucency with no thickness map needed

  • For vegetation and non-closed objects, the overlap count can be multiplied with the thickness of a leaf to fake accumulation for dense alpha-clipped vegetation, for example (see the sketch after this list).

Varying the thickness of the leaves (see attached gif)

  • This can also be used in more creative ways, like sampling the thickness of objects in a layer other than the one the object itself is on, to create X-ray-like effects.

This will be demonstrated further in an upcoming sample on transparency as well.
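To make the absorption and vegetation use cases a bit more concrete, here is a rough HLSL sketch of the kind of math involved, written as a Shader Graph Custom Function body. Everything in it is illustrative: `ThicknessEffects_float`, `thickness`, `overlapCount`, `absorptionCoeff` and `leafThickness` are hypothetical names for values you would wire in from the HD Sample Buffer node and from material properties, not an official API.

```hlsl
// Illustrative Custom Function body (hypothetical names throughout).
// "thickness" and "overlapCount" are assumed to be wired in from the
// HD Sample Buffer node; "absorptionCoeff" and "leafThickness" would be
// regular material properties.
void ThicknessEffects_float(
    float  thickness,        // accumulated thickness for the selected layer
    float  overlapCount,     // number of surfaces traversed at this pixel
    float3 absorptionCoeff,  // per-channel absorption strength
    float  leafThickness,    // authored thickness of a single leaf
    out float3 absorption,
    out float  fakeVegetationThickness)
{
    // Beer-Lambert style absorption: the thicker the object,
    // the more each color channel is attenuated. No thickness map needed.
    absorption = exp(-absorptionCoeff * max(thickness, 0.0));

    // Open, alpha-clipped vegetation: multiply a per-leaf thickness by the
    // overlap count to fake accumulation through dense foliage.
    fakeVegetationThickness = leafThickness * overlapCount;
}
```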

Limitations:

  • Does not support Tessellation
  • Mixing open and closed meshes on the same layer can cause negative values. For better results, use separate layers.
  • Mixing transparent and opaque objects can create unexpected results due to when transparent objects are rendered in HDRP

Cheers!

[Attached gif: OverlapCount_Leaves.gif]


Can you talk any more about the underlying technique? What's the performance cost?

It adds a simple fullscreen pass to compute the thickness of multiple objects at the same time for each given layer. It samples the existing depth buffer so that the thickness interacts with the environment; cf. the image of the green glass dragon interacting with the plane.

You have control over the performance by choosing the resolution of this pass for each layer: it can be full screen, half resolution, or quarter resolution.
[Attached image: green glass dragon thickness example]


Do standard lit shaders support this feature, or only custom shader graphs? It doesn't seem to work with standard materials no matter what I try. Also, do materials in those layers that don't have the transparency option enabled still render to the thickness buffer? That would be excessive, because I want to use layers for other game features too.

For the buffer to be filled with the thickness of an object, the object needs to be in a layer and that layer needs to be selected in the HDRP asset list.
Then the only way to sample this buffer is via Shader Graph, using the HD Sample Buffer node.
So, you don't have to use Shader Graph for objects to write their thickness into the buffer, but you do have to use Shader Graph to read that texture and do something with it.

And yes, it's compatible with transparent and opaque objects.

The best way to set this up would be to have a specific layer for objects that need to write their thickness, and to use that layer for that and nothing else.

I mostly use layers for collision matrices, camera culling, and other things (creating multiple layers for the same purpose just to add thickness is not an option). Wouldn't it be better to have a shader pass do the rendering into the buffer, rather than a layer? It would automatically select materials that have a thickness boolean enabled, similarly to how Receive SSR and decals work. Or would that hurt performance?

That would be possible, but it would prevent us from splitting objects onto different layers. For instance, we can split transparent and opaque objects onto different thickness layers to avoid interaction issues.
We'll think about it as an improvement: having an index reserved for material-based selection, if possible.

Had a brief test run of the feature, works very well out of the box! It was easy to set up in Shader Graph, and I did not encounter any issues. It even seems to work properly in XR, unlike nearly all other features I tested which were introduced in the 2022/2023 cycle.
It allows much more dynamic thickness calculation which would not be possible with thickness maps.
Two thumbs up:)
Edit: And the performance seems really good, minimal impact on GPU time.


Can I calculate the thickness of multiple layers at the same time? Such as skin, bone and metal. This makes it possible to blend the final shape with different material weights.

Will URP have similar functionality? Or can you give some reference document or URL to help me achieve a similar function in URP? I'm guessing that the depth of the back faces is rendered first, and then the thickness is obtained by calculating the difference from the depth of the front faces?

Yes, if each material/object is on a different layer, each layer will fill a separate buffer and you will be able to get the thickness of each separately :)

Can't really comment on URP, although, AFAIK, nothing's planned. And you are right: when you boil the algorithm down to its simplest component, it's basically subtracting back-face depth from front-face depth (hence why we have issues with open meshes, etc.).
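For anyone curious about what that looks like in shader terms, here is a minimal conceptual sketch of one common way to build such a pass (signed depth accumulated with additive blending). It is an illustration of the general idea only, not HDRP's actual shader code.

```hlsl
// Minimal conceptual sketch (not HDRP's implementation).
// The layer's geometry is drawn with Cull Off and additive blending
// into a float render target.

struct Varyings
{
    float4 positionCS  : SV_POSITION;
    float  linearDepth : TEXCOORD0; // linear eye-space depth of this surface
};

float2 Frag(Varyings input, bool isFrontFace : SV_IsFrontFace) : SV_Target
{
    // Front faces subtract their depth, back faces add theirs.
    // For a closed mesh the pairs cancel into (backDepth - frontDepth),
    // i.e. the distance travelled inside the object along the view ray.
    float faceSign = isFrontFace ? -1.0 : 1.0;

    // x: contribution to the accumulated thickness (optical path)
    // y: one surface crossed, for the overlap count
    return float2(faceSign * input.linearDepth, 1.0);
}
```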

After playing with it more, I still find it very usable, and pretty fast! Two things I found:

  • Thickness seems to produce invalid values sometimes. I need to saturate the values, but I guess this should already be done in the thickness pass.
  • On a bit more complicated geometry I sometimes get bad results, e.g. on a character with the teeth in the same mesh, the teeth start glowing through the skin when the mouth is closed. This can be corrected with a transmission map or thickness remapping, but this is not always optimal.

It would be great if you could talk about your use case. IIRC, it's not saturated on our side because we thought users might want to know when the value is negative, for example; that's information that would be lost if we saturated it ourselves.

For this one, if you can provide a mesh that's problematic, I'll happily have a look :)

Thanks for the feedback!

Our use case is to calculate thickness for SSS (for transmission); what's the semantic meaning of a negative value? I thought this was some kind of “error”, because how can a mesh have negative thickness?

For the problematic mesh: it's a standard DAZ / Genesis 8 shape, which basically has everything in one mesh (eyes / teeth / head / body). I could do a bug report with a project about this, if it's helpful (and considered a bug). It could also be that the issue is the SSS, which is used on both the teeth and the skin.

Basically, to get the per-pixel thickness, Compute Thickness takes the difference between the depth of the back faces and the depth of the front faces. As long as the mesh has an even number of front and back faces for each pixel, it's fine.
When your mesh is “open”, you can have an odd number and end up with a negative thickness. That can also happen if your normals are not properly facing “outside” your mesh, for example. See here for more details.
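As a made-up worked example of how that goes wrong (the depth values are arbitrary, just for illustration):

```hlsl
// Hypothetical depths along one view ray, using the signed-accumulation idea:
// Closed sphere: front face at depth 2.0, back face at depth 5.0
//   accumulated = (-2.0) + (+5.0) = +3.0  -> valid thickness
// Open mesh (e.g. a single-sided plane): only a front face at depth 4.0
//   accumulated = (-4.0)          = -4.0  -> negative "thickness"
// Flipped normals have the same effect: a face contributing with the
// wrong sign unbalances the sum.
```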

You can send me the model directly and I'll have a look; don't bother with a report if you're not sure.

@chap-unity helped me debug the mesh, thanks for this! The cause of the issue is an open mesh for the teeth. My workaround is to a) saturate the output of the thickness pass and b) remap the thickness in the Diffusion Profile, so that the Min Thickness prevents the teeth from shining through.
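For reference, the shader-side half of that workaround is just a clamp on the sampled value. A tiny sketch, where `rawThickness` is a hypothetical stand-in for whatever the HD Sample Buffer node feeds into the graph:

```hlsl
// Clamp the sampled thickness before it drives transmission / SSS.
// saturate() clamps to [0, 1], which discards the negative values
// produced by the open teeth mesh.
float SafeThickness(float rawThickness)
{
    return saturate(rawThickness);
}
```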


On 23.2.0, the Rendering Debugger shows thickness only if the object is assigned to the default layer; other layers always show gray even if I enable Compute Thickness on them. Also, I would like to know: why are GameObject layers used instead of Rendering Layers?

I just tested this and it seems like it works.
Are you sure you actually selected the proper layer in the debug view as well?

[Attached screenshot: thickness layer selection in the Rendering Debugger]

We could have used Rendering Layers as well. It's just that they are limited to 16 and are also used for lights and decals in HDRP, so we could run out very quickly; this is mainly why it was decided to use GameObject layers for this.

Sorry to barge in here; I just want to mention that the problem with GameObject layers is that they don't behave like flags, and you can only assign one layer to a GameObject. That can make it tricky when your GameObject is already assigned to a layer for some purpose and now you need to use another layer for another purpose. I often run into situations where I'm torn between needing to assign an object to different layers.

I know this is a historic issue in Unity and it was probably a very old bad decision that can’t be corrected now. It feels like some additional 32-bit flags value might be needed for general purpose stuff in the future.


Don't be; it's actually very valuable, and it makes some sense. IIRC, there was another reason why we chose GameObject layers (hence the “mainly” in my answer), but I can't remember it at the moment. I'll raise this and see if anyone else remembers.