Dynamic, real-time global illumination support in Unity 6 or Unity Next Generation

Is there any chance that Unity will get any kind of dynamic, real-time global illumination support during the Unity 6.X release cycle?

By dynamic and real-time, I mean a system that doesn’t require pre-baking data in the editor but works dynamically at runtime, or at least supports on-demand baking during runtime (for example, when procedural scene generation completes, or partial rebaking when something in the scene changes).

This could either be an extension of the Adaptive Probe Volumes system or a separate feature. The main question is: Can we expect developments in this area for Unity 6.X? Or maybe any potential changes/features will skip Unity 6.X and occur in Unity Next Generation?

Thank you.


An APV solution has been on the roadmap for some time but it would be nice to know the status of it.

https://portal.productboard.com/8ufdwj59ehtmsvxenjumxo82/c/1939-precomputed-realtime-gi-dynamic-apv?&utm_medium=social&utm_source=starter_share


It’s been disappointing that we haven’t had many updates on this for years and years. I saw this thread the other day; it shows how the demand is still there, and that proposed solutions are confusing to people. LooperVFX on X: “@THE2FUN @muhamm real-time GI is a blurry spectrum between baked & real-time tradeoffs. I think you mean how Enlighten RTGI allows lights to update real-time (but environment geometry cannot move at runtime.) Unity will have this, and a full real-time solution. it appears @pigselated is on it :slight_smile:”

We’re continuing to explore our approach to dynamic GI, and Dynamic APV has been part of this exploration. At this time we haven’t yet decided whether we’ll be able to include dynamic GI developments in Unity 6.x.


During development of Dynamic APV we found several major problems with the underlying light transport algorithm. From the user’s point of view, this means that the feature may be painful to use and/or that it generates unexpected lighting results. We’ve had engineers, tech artists, and designers evaluate it, and they all found problems. Therefore, we are almost certain that releasing it in its current state would cause more frustration than value for our users and for ourselves, especially considering that we’d need to support it for years.

Therefore we are, as Steven points out above, continuing to explore our options on this matter.


When you mention ‘Dynamic APV,’ are you referring to offline-baked probes that would work with dynamic lighting (such as sky and lights) at runtime? Or are you referring to broader extensions of APV for both geometry and lighting, such as fully dynamic probes or the ability to bake/update probes in the player?

Thanks.

I am referring to the regular APV probes which are placed during a precompute phase, whose lighting contents are then updated at runtime. In other words: Static geometry, dynamic lights + sky.

(Of course, we all want fully dynamic everything. The reason we have been exploring the static geometry case is that such solutions have a much much wider device range support than any fully dynamic GI solution.)

Why not use a GI solution that doesn’t rely on hardware ray tracing, and is already proven to work across a wide range of devices? It simplifies things while still giving developers the flexibility to choose between APV and fully dynamic GI, depending on their game’s needs. I’d love to see one of these solutions integrated into Unity, and I’m sure many other developers would also welcome it. Some examples include:

  • SDFGI (Signed Distance Field Global Illumination)
  • DDGI (Dynamic Diffuse Global Illumination)
  • VXGI (Voxel Global Illumination)

Are there any examples of these working on mobile?

Updating at run time for destroyable buildings/environment is sorely needed.

Really only hardware RT does this at the moment; none of the mentioned methods do it, from what I’ve seen.

Fully dynamic GI solutions are part of our evaluation, and indeed such solutions can be achieved without hardware ray tracing. But I don’t believe it is as simple as you make it out to be. DDGI has a leaking problem that is non-trivial to solve (FWIW, we collaborated with NVIDIA on a Unity DDGI implementation some years ago). Godot recently switched away from SDFGI because it had problems (I believe you can see this in their 4.3 release). In addition, the methods you mention usually have quite low spatial resolution compared to other solutions, and that might not work well for some games. I am not saying DDGI/SDFGI aren’t great, only that IMO it is an over-simplification to say they are “proven to work”.

That said, I think we can agree that there are interesting solutions/techniques out there (EA GIBS, AMD Brixelizer GI, H-Trace, TrueTrace, Kajiya renderer, Godot Voxel Dynamic GI, Tencent Smart GI, Radiance Cascades, Lumen, etc.) and we are indeed aware of them and taking inspiration from them.

It simplifies things while still giving developers the flexibility to choose between APV and fully dynamic GI

As mentioned above, I agree that many people want fully dynamic everything. But we have many users to cater for, and while some users are OK with paying a performance penalty for fully dynamic GI, others may rather want semi-dynamic GI at a lower performance cost. Of course we could just build “both”, but naturally our resources are limited, so you may end up with two mediocre solutions instead of one great one. This is especially true for something like GI, which is usually a relatively large system that takes many man-months to produce at high quality (compared to, say, an SSAO effect). It’s tricky! :slight_smile:

While I cannot currently say much about what Unity will or will not ship, you may get a feeling for the options we are considering by taking a look at my personal Twitter profile, which shows videos of many prototypes and demos (some of them are not made with Unity, but there is usually no reason they couldn’t be).


There are several examples of raytraced realtime GI running without hardware RT. Examples include:

  1. Godot’s SDFGI.
  2. Godot’s Dynamic GI.
  3. Lumen running in “software mode” (using SDF ray tracing).
  4. Tiny Glade.
  5. Tencent SmartGI.
  6. Pretty much all of my own personal prototypes :D.

Thank you for responding and informing us about this topic. Regarding the “over-simplification”, I was just addressing your colleague’s point about the challenges of implementing Dynamic APV. I understand that such an implementation is complex, but currently, we’re missing the GI necessary to enhance our game visuals, especially in procedurally generated scenes. I’ve also encountered several issues with APV, such as light leaking and small illuminated areas that shouldn’t be lit (e.g., floors, walls…). Also, lightmapping isn’t an option for me and my game due to seam issues with modular level design, and APV is problematic for the reasons mentioned above. Additionally, SSGI has not delivered the expected results.

However, I am glad to hear that you want to provide a robust, fully dynamic GI solution, and I am eager to see it implemented soon. I have also checked out your personal prototypes, and they look impressive; I would love to see such a solution officially in Unity.


I’m aware of Tiny Glade, but the examples I’ve seen of Lumen and destructible environments showed remnants of the pre-destroyed lighting. Specifically, The Finals, but maybe that’s a bad implementation. :thinking:

I presumed it’s because it can’t rebake the SDF at runtime?

I haven’t seen much of Godot’s solutions.


In case you missed it, HTrace v2 is just waiting for Unity to approve the asset.

I think it’s worth evaluating. Only directional light and emissive are supported for now, though.


Thanks, I already know about the improvements in V2. Pavel (the author) and I have been talking a little on Twitter about it. I agree, it looks very interesting.


I know this has been solved, but I’m confused about your comment about DDGI.

In what cases did DDGI leak? I thought its whole point was to have a 16×16 octahedral-encoded depth map that would let you figure out which probes are visible from each fragment you’re trying to shade.

How is APV better at preventing light leaks than DDGI is? AFAIK APV doesn’t store any kind of depth, and instead uses manual masking to resolve leaks on really thin walls. Wouldn’t DDGI give a slightly better result?

I’m trying to learn a bit more about GI solutions, both dynamic or not.
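For reference, the octahedral parameterization mentioned above is small enough to sketch. This is a schematic Python version of how DDGI-style probes map ray directions onto a small 2D depth texture (e.g. 16×16); it is illustrative only, not Unity’s or NVIDIA’s actual code:

```python
import math

def octahedral_encode(d):
    """Map a unit direction (x, y, z) to the unit square [0, 1]^2.

    This is the octahedral mapping DDGI-style probes use to
    parameterize their small per-probe depth/visibility textures.
    """
    x, y, z = d
    s = abs(x) + abs(y) + abs(z)        # project onto the octahedron |x|+|y|+|z| = 1
    x, y, z = x / s, y / s, z / s
    if z < 0:                           # fold the lower hemisphere outward
        x, y = (1 - abs(y)) * math.copysign(1, x), (1 - abs(x)) * math.copysign(1, y)
    return (x * 0.5 + 0.5, y * 0.5 + 0.5)   # remap [-1, 1] -> [0, 1]

def octahedral_decode(uv):
    """Inverse mapping: a point in [0, 1]^2 back to a unit direction."""
    x, y = uv[0] * 2 - 1, uv[1] * 2 - 1
    z = 1 - abs(x) - abs(y)
    if z < 0:                           # undo the hemisphere fold
        x, y = (1 - abs(y)) * math.copysign(1, x), (1 - abs(x)) * math.copysign(1, y)
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

The mapping is a bijection over the sphere, so `octahedral_decode(octahedral_encode(d))` round-trips any unit direction, which is what lets a probe store one depth value per texel and look it up by direction at shade time.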

In what cases did DDGI leak? I thought its whole point was to have a 16×16 octahedral-encoded depth map that would let you figure out which probes are visible from each fragment you’re trying to shade.

You are correct, that is the point of the depth map. This method works, but only partially. There are still situations where it fails, and when it does it can be difficult to deal with. I cannot tell you exactly which cases it fails for; that is part of the problem. It can be relatively hard to predict where the method fails. The method is based on the maths used in the Variance Shadow Mapping technique, which suffers from similar problems.

To be clear, I never suggested that no-one should use DDGI or that it never works. I just pointed out that it has leaking issues (like other methods have other issues) that in some cases can be a problem.
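To make the variance-based test concrete: each probe texel stores the mean and mean-squared occluder distance, and shading applies a one-tailed Chebyshev bound, as Variance Shadow Maps do. A minimal Python sketch (schematic, not Unity’s or NVIDIA’s implementation; the numbers below are made up) also shows the failure mode described above: when occluders at very different depths land in the same texel, the variance explodes and light leaks through.

```python
def chebyshev_visibility(mean, mean_sq, dist, min_variance=1e-4):
    """One-tailed Chebyshev upper bound on P(occluder_depth >= dist),
    the visibility weight used by VSM and by DDGI probe depth maps."""
    if dist <= mean:
        return 1.0                       # closer than the mean occluder: treat as visible
    variance = max(mean_sq - mean * mean, min_variance)
    d = dist - mean
    return variance / (variance + d * d)

# A texel that only ever saw a wall at distance 2:
# a point at distance 3 (behind the wall) is correctly judged occluded.
behind_wall = chebyshev_visibility(2.0, 4.0, 3.0)    # ~0.0001

# The same texel also seeing background at distance 10 (mean 6, mean_sq 52):
# the same point at distance 3 now sits closer than the mean occluder,
# the early-out fires, and light leaks through the wall.
leaking = chebyshev_visibility(6.0, 52.0, 3.0)       # 1.0
```

The bound is only an upper bound on occlusion probability, which is why the failures are data-dependent and hard to predict, as noted above.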

How is APV better at preventing light leaks than DDGI is?

It is not. DDGI is indeed better – and it should be, because it is way more costly than the current implementation of APV, both in bandwidth usage and in memory storage (DDGI methods take more samples and store more information than APV). APV was intentionally designed to be fast first and foremost, to work well on a broad range of devices.

Coincidentally, a colleague of mine may be investigating the possibility to add DDGI-like leak mitigation to APV for an internal Unity hackweek. This should mitigate leaking but will have higher runtime costs (which should be a suitable trade-off for some games).
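For intuition about what a DDGI-like mitigation would change: plain trilinear blending of the eight surrounding probes weights them purely by position, so a lit probe on the far side of a thin wall still contributes; multiplying each weight by a per-probe visibility term and renormalizing suppresses that contribution. A schematic Python sketch under those assumptions (illustrative only, not Unity’s APV code; the probe layout and values are invented):

```python
def trilinear_weights(p):
    """Trilinear weights for a point p = (x, y, z) in [0, 1]^3 inside a
    cell of 8 probes, indexed by corner bits (x << 2 | y << 1 | z)."""
    x, y, z = p
    w = []
    for i in range(8):
        wx = x if (i >> 2) & 1 else 1 - x
        wy = y if (i >> 1) & 1 else 1 - y
        wz = z if (i >> 0) & 1 else 1 - z
        w.append(wx * wy * wz)
    return w

def blend_probes(p, irradiance, visibility):
    """Blend per-probe irradiance. With visibility == [1]*8 this is plain
    trilinear interpolation (which leaks through thin walls); with a
    per-probe visibility term, occluded probes drop out."""
    w = [tw * v for tw, v in zip(trilinear_weights(p), visibility)]
    total = sum(w) or 1e-6               # avoid division by zero if all probes are masked
    return sum(wi * e for wi, e in zip(w, irradiance)) / total

# A thin wall separates a dark room (probes with x-bit 0, irradiance 0)
# from a lit room (probes with x-bit 1, irradiance 1).
irradiance = [0.0] * 4 + [1.0] * 4
p = (0.1, 0.5, 0.5)                      # shading point deep in the dark room

plain  = blend_probes(p, irradiance, [1.0] * 8)              # 0.1 -> visible leak
masked = blend_probes(p, irradiance, [1.0] * 4 + [0.0] * 4)  # 0.0 -> no leak
```

The extra cost comes from evaluating (and storing) the visibility term per probe per shaded point, which is the runtime-cost trade-off mentioned above.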


I know your colleague!! I was literally just talking to him about it before I saw your reply today.
Indeed looking forward to seeing if there’s any cheap mitigation that can be applied on top of APV! Maybe there’s a way to automatically block off certain edges between probes. Of course, this would break the trilinear sampling that APV has going on. Maybe the cursed dual point/linear sampling technique used in Tiny Glade works in 3D and could allow for both sharp transitions between probes and linear sampling in other areas… Who knows what’s possible…

Say hi to your colleague for me hehehehehe