I was working with Shader Graph about a year ago and developed some pretty nice assets with transparent effects via shaders. One thing I noticed was that when two of these assets overlapped, they didn’t render properly. Imagine two panes of glass intersecting: at the intersection, one of the panes simply doesn’t render past that point (part of it appears missing).
So I googled the issue and did some reading, and it seemed at the time that this was considered a known issue in rendering transparencies across the industry. I did come across some solutions for Unity, but they involve changing the rendering order of the materials/shaders. While that appears to alleviate the issue from a player’s perspective, it doesn’t really solve the problem; it just shifts it from one set of transparent shaders to another. One set sticks out more, so you get the sense you see the entire glass bubble, but you really don’t.
However, just today I was playing around with Unreal Engine 5.0.3, curious to see what they had to offer in the way of dev features and performance. In the Lyra demo project, the main indoor arena has aqua-tinted glass everywhere. I took one pane, turned it, and intersected it with another. I played around with their glass assets quite a bit and couldn’t get them to fail to render properly.
I couldn’t force a bad rendering with UE5 and Lyra’s glass assets, so I wonder how they render transparencies. Is there a more effective and universal solution in Unity than reordering the shaders for the glass panels of my bubbles?
I can tell you that with Unity 2022.1.14f1 and URP, I still see the problem quite clearly: with one glass bubble intersecting another, I cannot see part of the first within the second. It simply doesn’t render as it should.
For alpha blending, polygons have to be rendered back to front. For performance reasons, most engines only sort the objects by their centers.
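As a rough sketch (all names and numbers here are made up, not any engine’s actual code), that per-object center sort amounts to something like this:

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def sort_back_to_front(objects, camera_pos):
    """objects: list of (name, center) pairs; returns the draw order, farthest first."""
    return sorted(objects, key=lambda o: distance(o[1], camera_pos), reverse=True)

camera = (0.0, 0.0, 0.0)
panes = [("near_pane", (0.0, 0.0, 2.0)), ("far_pane", (0.0, 0.0, 5.0))]
order = sort_back_to_front(panes, camera)
print([name for name, _ in order])  # ['far_pane', 'near_pane']
```

The issue in the question follows directly from this: two intersecting panes each get exactly one slot in this order, so one is drawn entirely “behind” the other even in the region where it is actually in front.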
Even if you sorted all the individual polygons, you’d end up with situations that you couldn’t resolve:
Polygons that intersect would have to be split
Cyclic overlaps, where polygon A covers part of B, B covers part of C, and C covers part of A, so no valid back-to-front order exists at all
This means the only solution for dynamic objects is to sort per pixel. This is called Order Independent Transparency (OIT), because you can render the objects/polygons in any order. However, it’s not trivial and comes with both a runtime and a memory cost: you basically need to store all the shaded pixels in a list and then resolve the list in a screen-space pass. The more layers of transparency, the more memory you need.
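A toy version of that resolve pass (made-up data, and per-pixel only; a real GPU implementation keeps these lists in video memory) looks like this:

```python
def resolve_pixel(fragments, background):
    """fragments: list of (depth, (r, g, b), alpha) in arbitrary rasterization order."""
    color = background
    # Sort far to near, then apply standard "over" blending back to front.
    for depth, rgb, alpha in sorted(fragments, key=lambda f: f[0], reverse=True):
        color = tuple(alpha * c + (1.0 - alpha) * bg for c, bg in zip(rgb, color))
    return color

# Two glass fragments at one pixel, arriving out of depth order:
frags = [(1.0, (0.0, 0.0, 1.0), 0.5),   # nearer, blue
         (3.0, (0.0, 1.0, 0.0), 0.5)]   # farther, green
print(resolve_pixel(frags, (0.0, 0.0, 0.0)))  # (0.0, 0.25, 0.5)
```

Because the sort happens per pixel in the resolve, submission order no longer matters, which is exactly what makes it order-independent.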
Newer GPUs support a feature called Rasterizer Ordered Views (ROVs), which helps solve this problem. Alternatively, you can render transparent objects at a lower resolution to reduce the cost.
Since no one else has touched on this: Epic added support for a few different levels of Order Independent Transparency in UE5, which may be why you couldn’t get bad sorting in Lyra. I haven’t looked at what they’re doing closely enough to say concretely which of these it is; these are just some possibilities.
First, Epic implemented a “cheap” solution in the form of triangle sorting. Every transparent triangle that has this option enabled is sorted independently, allowing for much more accurate transparency rendering at the cost of some performance. But I don’t think this is what’s happening, because it doesn’t handle intersections at all unless the meshes use ridiculously high tessellation.
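To see why per-triangle sorting still can’t handle intersections, here’s a made-up example: two triangles cross, so which one is nearer differs per pixel, yet a sort has to pick a single order for each whole triangle (typically by centroid depth):

```python
def centroid_depth(tri):
    # Average the z (view-space depth) of the three vertices.
    return sum(v[2] for v in tri) / 3.0

# Triangle A slopes through depths 1..5; triangle B is flat at depth 3,
# so they intersect: A is in front of B on one side, behind it on the other.
tri_a = [(0.0, 0.0, 1.0), (1.0, 0.0, 5.0), (0.0, 1.0, 3.0)]
tri_b = [(0.0, 0.0, 3.0), (1.0, 0.0, 3.0), (0.0, 1.0, 3.0)]

order = sorted([("A", tri_a), ("B", tri_b)],
               key=lambda t: centroid_depth(t[1]), reverse=True)
print([name for name, _ in order])
# Both centroids sit at depth 3.0, so the order is arbitrary; no single
# whole-triangle order can be correct on both sides of the intersection.
```

Heavy tessellation shrinks each triangle until the per-triangle error is small enough not to notice, which is why it only “works” at absurd triangle counts.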
Second, Epic implemented true OIT by having transparent objects be 100% ray traced. Ray tracing doesn’t have problems with transparency sorting, even with intersections, since the sorting happens as part of the ray traversal anyway.
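In ray-tracing terms, the hits along a ray come back in depth order, so the shader can composite front to back with a running transmittance. A toy sketch (made-up numbers, not Epic’s code):

```python
def shade_ray(hits, background):
    """hits: (t, (r, g, b), alpha) tuples along one ray, in any submission order."""
    color = (0.0, 0.0, 0.0)
    transmittance = 1.0  # fraction of light still passing through
    for t, rgb, alpha in sorted(hits, key=lambda h: h[0]):  # near to far
        color = tuple(c + transmittance * alpha * s for c, s in zip(color, rgb))
        transmittance *= (1.0 - alpha)
    return tuple(c + transmittance * b for c, b in zip(color, background))

hits = [(3.0, (0.0, 1.0, 0.0), 0.5),   # farther, green (submitted first)
        (1.0, (0.0, 0.0, 1.0), 0.5)]   # nearer, blue
print(shade_ray(hits, (0.0, 0.0, 0.0)))  # (0.0, 0.25, 0.5)
```

Front-to-back compositing with a running transmittance is mathematically equivalent to back-to-front “over” blending, so intersecting surfaces come out right at every pixel regardless of the order the geometry was submitted in.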
However, there’s a third option: ray-traced reflections can hide a lot of sorting issues, especially if you’re using two sheets of glass that are the same color. The sorting might still be wrong for the diffuse or refraction, yet correct for the reflections, and you probably wouldn’t notice. You can check more easily by intersecting two only slightly transparent surfaces with different colors. That should make it obvious whether the sorting is actually correct. If it is, it’s ray traced; if it’s not, you were being thoroughly fooled by content designed to hide it.
This was an interesting exercise. I changed the color of one panel to blue, set the opacity of each to 0.95, and lightened the colors. I could see the ordering flip as I moved closer. Thanks for the help in understanding what may be happening here; it gives me some ideas on how to make more effective glass prefabs in Unity.
I did originally show the screenshots, but because they use the Epic assets, I wasn’t sure I should post them here, so I took them down. They did clearly show one pane rendered in front of the other even though they intersected, which was not correct.
So yep, that shows the sorting is actually wrong, but the color of the glass, and maybe some of Unreal’s reflection rendering systems, were hiding it.