Some years later I’m looking at this issue again, and wondering if there’s any available solution.
The issue is that I can never get transparent objects to behave correctly with refractive objects. I seem to have three choices when it comes to Transparent objects. In the examples below, there’s a refractive red cylinder in the middle, with one green cube in front of the cylinder and another behind it; both cubes use the same material. Each setup renders either the front cube or the rear cube correctly, but never both. (A script sketch of the same three setups follows the list.)
1. Rendering Pass: Default: This correctly renders objects in front of the refractive object, but objects behind it are not refracted. This is the least “incorrect”-looking approach, but it’s effectively the same as not using any refraction at all.
2. Rendering Pass: Before Refraction; Depth Write: Off: This correctly renders transparent objects behind the refractive object, but it also refracts the transparent object in front of the cylinder.
3. Rendering Pass: Before Refraction; Depth Write: On: This almost looks correct. However, because Depth Write is on, none of the cylinder is visible through the front green cube. (That’s a bit hard to see, but it’s pretty noticeable if you compare it to the first image.)
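For reference, here is roughly how those three setups map onto the green cubes’ material from script instead of the Inspector. Treat it as a sketch: the `_TransparentZWrite` property name, the pre-refraction queue value, and the use of `HDMaterial.ValidateMaterial` are assumptions based on the HDRP Lit shader, so verify them against your HDRP version (the material’s Debug inspector shows the actual property names).

```csharp
// Hedged sketch: applying the three transparent-material setups from script.
// Assumptions: the green cubes use the HDRP/Lit shader, "_TransparentZWrite" is the
// transparent depth-write property, and queue 2750 falls in HDRP's "Before Refraction"
// range. Verify these in the material's Debug inspector for your HDRP version.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public static class RefractionTestSetups
{
    const int AssumedPreRefractionQueue = 2750; // assumption: HDRP's pre-refraction queue range

    // 1. Rendering Pass: Default, Depth Write: Off
    public static void DefaultPass(Material mat)
    {
        mat.renderQueue = (int)RenderQueue.Transparent; // 3000
        mat.SetFloat("_TransparentZWrite", 0f);
        HDMaterial.ValidateMaterial(mat); // re-sync keywords/render state after the change
    }

    // 2. Rendering Pass: Before Refraction, Depth Write: Off
    public static void BeforeRefraction(Material mat)
    {
        mat.renderQueue = AssumedPreRefractionQueue;
        mat.SetFloat("_TransparentZWrite", 0f);
        HDMaterial.ValidateMaterial(mat);
    }

    // 3. Rendering Pass: Before Refraction, Depth Write: On
    public static void BeforeRefractionDepthWrite(Material mat)
    {
        mat.renderQueue = AssumedPreRefractionQueue;
        mat.SetFloat("_TransparentZWrite", 1f);
        HDMaterial.ValidateMaterial(mat);
    }
}
```

The Rendering Pass dropdown appears to be driven by the material’s render queue, which is why the sketch changes the queue to switch between Default and Before Refraction; if that assumption doesn’t hold on your version, setting it in the Inspector is the safe route.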
Ultimately, I can’t find a single configuration that handles both the front and rear transparents correctly when using Refraction.
Is there any solution to this? Or does this remain a limitation? I just don’t really understand how people make use of Refraction in practice when it comes with such a trade-off in rendering realism. I feel like I’m probably missing something. Thanks.
The solution which virtually every video game has used since the dawn of shaders is: punt.
Know which parts of your scene can have this sort of artifact, and then decide to make the situation impossible in normal play, unlikely in normal play, or ignore the artifact.
A game engine is not a raytracer. It does not have the luxury of considering and sorting every object in every pixel to ensure all light influences are accounted for. It must overlook as much as possible, to achieve the frame rate that makes live animation possible.
It definitely remains a limitation: once a transparent object writes its depth, you can’t really see anything behind it.
However, there are two things I have in mind.
First, if you have the hardware and can afford the performance drop, using Recursive Rendering (i.e. ray tracing) will certainly help you render three layers of transparents without issues (there are limitations, but they might not be a problem for you depending on what you are doing).
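If it helps, here’s a rough idea of what turning that on from script could look like, as a global Volume override. This is only a sketch: the RecursiveRendering parameter names (enable, maxDepth, rayLength) are assumptions to check against the HDRP version you’re on, and ray tracing still has to be enabled on the HDRP asset and the camera’s frame settings.

```csharp
// Hedged sketch: enabling Recursive Rendering through a global Volume from script.
// Assumes ray tracing is already enabled in the HDRP asset and camera frame settings;
// the RecursiveRendering parameter names below are assumptions to verify in the docs.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class EnableRecursiveRendering : MonoBehaviour
{
    void Start()
    {
        // Global volume so the override applies everywhere.
        var volume = gameObject.AddComponent<Volume>();
        volume.isGlobal = true;
        volume.priority = 10f;

        var profile = ScriptableObject.CreateInstance<VolumeProfile>();
        volume.profile = profile;

        // Add the override with all parameters marked as overridden.
        var recursive = profile.Add<RecursiveRendering>(true);
        recursive.enable.value = true;
        recursive.maxDepth.value = 4;    // how many transparent layers / bounces to follow
        recursive.rayLength.value = 50f; // max ray distance in meters

        // The refractive/transparent materials also need their ray tracing
        // (Recursive Rendering) toggle enabled in the material Inspector.
    }
}
```

In practice you’d more likely set this up once in the Editor on a Volume profile asset; the script form is just to make the moving parts explicit.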
Secondly, you could use the Compute Thickness pass and make a custom transparent shader in Shader Graph directly (starting from an opaque Lit or Unlit) and see what’s possible from there using the HD Scene Color node. There might be some artifacts since all objects will be in the scene color and in the depth buffer, but it might be a good start :).
As I think about it, in the near future there might be some samples to use as a starting point for this since it’s not exactly straightforward.
In 2023.3 there’s also an option to have per-pixel sorting between transparents and refractive objects.
This can easily be tested with water (which behaves like a refractive object).