Feedback request: Global Illumination changes with the 2023.2 beta tech stream

We want to hear your feedback on the Global Illumination changes in the 2023.2 beta tech stream (the survey is now closed)!

Hi all, following on from our request for feedback on the previous tech release, we now want to hear about your experience using the changes to global illumination we delivered with the 2023.2 beta. We do this to make sure we are providing you with the best possible tools for your day-to-day work in Unity.

If you have experience using the global illumination changes delivered with the 2023.2 beta release, please complete the survey linked below (the survey has been extended to close October 27th, 2023):

  • This survey is now closed, thanks for your responses.

What’s changing with the 2023.2 release?

Unity’s Lightmappers

GPU Lightmapper - Out of Preview

The GPU Lightmapper allows much faster baking of lightmaps and Light Probes compared to the CPU Lightmapper. This baking backend will improve baking iteration speed in your projects, especially when larger scenes, larger numbers of probes, or higher-resolution lightmap textures are involved.

We have removed the preview label for the GPU Lightmapper in 2023.2, making this lighting baking solution an officially supported feature.

Unity now provides a “Baking Profile” setting. It can be found in the Lighting window when using the GPU backend in on-demand mode, and offers a tradeoff between baking performance and GPU memory usage when baking lightmaps.


The “Baking Profile” can be found in the Lighting window when using the GPU backend in on-demand mode
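For those who prefer scripting their project setup, the GPU backend can also be selected through the `LightingSettings` API rather than the Lighting window. Below is a minimal, illustrative Editor sketch; the API names are taken from recent Unity scripting references and should be verified against your Editor version:

```csharp
// Editor-only sketch: select the GPU Lightmapper from script instead of the
// Lighting window. Verify these names against your Editor's scripting reference.
using UnityEditor;
using UnityEngine;
using UnityEngine.SceneManagement;

public static class GpuLightmapperSetup
{
    [MenuItem("Tools/Lighting/Use GPU Lightmapper")]
    public static void UseGpuLightmapper()
    {
        // TryGetLightingSettings returns false if the active scene
        // has no LightingSettings asset assigned yet.
        if (!Lightmapping.TryGetLightingSettings(out LightingSettings settings))
        {
            settings = new LightingSettings();
            Lightmapping.SetLightingSettingsForScene(
                SceneManager.GetActiveScene(), settings);
        }

        settings.lightmapper = LightingSettings.Lightmapper.ProgressiveGPU;
        Debug.Log($"Lightmapper backend set to: {settings.lightmapper}");
    }
}
```

This is convenient when you want all scenes in a project to use the same backend without clicking through each scene's Lighting window.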

With this improvement, we have removed the fallback behavior from the GPU Lightmapper to the CPU Lightmapper. Instead of silently falling back, the bake process now stops and prints a clear Console message explaining why. With the lower memory consumption of the Balanced baking profile, however, we expect such failures to be significantly less frequent.

Note that some Scenes will simply not fit into GPU memory during light baking. This is most likely with large Scenes containing many objects, dense geometry, and/or many high-resolution textures (for instance, for transparency). In these cases the CPU Lightmapper can be used instead.

Auto Generate mode removed and replaced by an Interactive GI Debug Preview Mode

Iteratively authoring and troubleshooting baked lighting data is an important use case for creators using baked Global Illumination (GI). For this reason, we have added a new “Interactive preview” functionality to various GI-related Scene View Draw Modes.

When entering one of the relevant view modes, a contextual panel will appear in the scene view, letting the user enter interactive preview mode. This feature works similarly to the Auto Generate lighting mode (which has been removed), but is completely non-destructive, and will not affect any existing baked lighting data. This allows our users to experiment while troubleshooting baked lighting, without having to do a full On Demand rebake after each change, overwriting existing baked data in the process.


Auto Generate has been replaced with an interactive and non-destructive preview in the scene view debug modes

New default Lighting Data Asset for newly created Scenes (replacing Sky Manager)

Since the 2019 release, Unity has provided a system for automatically generating baked environment lighting in scenes that haven’t been baked explicitly. This is used in Built-In and URP, and is known as the SkyManager. We noticed that this system was causing confusion for our users, as the automatic behavior wasn’t very clear, and was only present in a few specific situations. On top of this, the system caused differences in the behavior of the Editor and built Player, sometimes leading to the environment lighting being unexpectedly missing.

We are simplifying the behavior by removing the SkyManager for Built-In and URP. To replace it, we’ve embedded a new default Lighting Data asset into the editor, which is automatically assigned to newly created scenes. The asset contains environment lighting that matches the default environment lighting settings. If you change these settings, you will have to manually rebake lighting using the “Generate Lighting” button in the Lighting Window (this command is now assigned to the hotkey Ctrl+Shift+L).


Before: no LightingData asset was assigned by default. After: a LightingData asset with correct data is now automatically assigned when creating a new scene
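The same bake that the “Generate Lighting” button (Ctrl+Shift+L) performs can also be started from an Editor script, which is handy after changing environment settings in bulk. A minimal sketch using the `UnityEditor.Lightmapping` API, with members as documented in recent scripting references (verify against your Editor version):

```csharp
// Editor-only sketch: trigger an on-demand bake, equivalent to pressing the
// "Generate Lighting" button in the Lighting window.
using UnityEditor;
using UnityEngine;

public static class RebakeLighting
{
    [MenuItem("Tools/Lighting/Generate Lighting Now")]
    public static void GenerateLighting()
    {
        // BakeAsync starts an on-demand bake without blocking the Editor;
        // it returns false if a bake could not be started.
        if (Lightmapping.BakeAsync())
            Lightmapping.bakeCompleted += () => Debug.Log("Bake finished.");
        else
            Debug.LogWarning("Could not start bake (is one already running?)");
    }
}
```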

Probe tools have been adapted to the Standard Tools API for UX Consistency

An inconsistent user experience in the editor can break our users’ flow of creation. Where possible, tools in the editor should behave consistently to reduce cognitive load.

As part of a wider effort to create a consistent user experience between various tools in the editor, this feature addresses consistency of workflows for editing Light Probe Groups. The previous inspector-based editing workflow has been replaced by an Overlay in the scene view, which can be accessed through the main Scene View Toolbar while a Light Probe Group is selected.


Before: the Light Probe visualization tools were located in the Lighting window. After: tools are now located in the Scene view menu

Movable LightProbes.positions

Creators often build modular content for their projects using multiple Scenes. These scenes are then loaded and repositioned at runtime. Previously, when building modular content including Light Probes, creators were unable to reposition these together with their Scene, because the positions of baked Light Probes were read-only.

This feature provides creators with an API that allows them to modify light probe positions for specific scenes after probes have been baked. Check the LightProbes.SetPositionsSelf documentation for a starting point on how to use the API. This API only applies to probes baked using Light Probe Groups and not Adaptive Probe Volumes.


Multiple clones of a few baked template scenes being additively loaded. Here, probes are being translated to new positions at runtime.
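As a starting point, here is a runtime sketch of the repositioning workflow described above. `SetPositionsSelf` is named in the documentation referenced earlier; the other member names (`GetInstantiatedLightProbesForScene`, `GetPositionsSelf`) and exact signatures are assumptions based on the 2023.2 scripting reference and should be verified there:

```csharp
// Runtime sketch: translate the baked Light Probe positions of an additively
// loaded scene to match that scene's new location. API signatures are
// assumptions; check the 2023.2 scripting reference before relying on them.
using UnityEngine;
using UnityEngine.SceneManagement;

public class MoveSceneProbes : MonoBehaviour
{
    public Vector3 offset; // how far this scene instance was moved at load time

    void Start()
    {
        Scene scene = gameObject.scene;

        // Get a per-scene instance so edits don't touch the shared baked data.
        LightProbes probes = LightProbes.GetInstantiatedLightProbesForScene(scene);
        if (probes == null)
            return; // scene has no baked Light Probe Groups

        Vector3[] positions = probes.GetPositionsSelf();
        for (int i = 0; i < positions.Length; i++)
            positions[i] += offset;

        probes.SetPositionsSelf(positions, checkForDuplicatePositions: false);

        // Rebuild the tetrahedral tessellation for the new probe layout.
        LightProbes.Tetrahedralize();
    }
}
```

Remember that, per the note above, this only applies to probes baked with Light Probe Groups, not Adaptive Probe Volumes.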

Adaptive Probe Volumes

HDRP Streaming Data from Disk

Light probe data doesn’t always fit in runtime memory, especially in large environments, and this limits what creators can build. Without disk streaming, the CPU memory footprint of all probe data in large scenes may simply be too big.

This feature enables creators to build more ambitious games with larger light-probe-lit environments by streaming probe data from disk at runtime. Probe data is pulled from disk just in time for rendering, and disk streaming can now be enabled per quality level as an option on the HDRP Asset.


An option ‘Enable Disk Streaming’ is now available in the HDRP asset

URP Per-Vertex Quality Setting for APV

Previously, APV supported only per-pixel quality for indirect lighting. This may be unsuitable for a range of mobile devices, where it can cause APV to run below acceptable performance levels at runtime.

With per-vertex quality settings for APV, creators can set a per-vertex quality level for indirect lighting from light probes, allowing them to run light-probe-lit environments efficiently on mobile devices. The trade-off for the higher performance of per-vertex quality APV may be lower-frequency indirect lighting compared to per-pixel quality.


Per-Vertex sampling is now available for Probe Volumes. It can be useful for trading quality for performance, depending on the geometry complexity

Note that the following limitations for APV in URP still apply:

  • Lighting Scenarios Blending is not supported
  • Lighting Normalization for Reflection probes is not supported
  • Performance on mobile may still require further optimization

Find the feedback request for the 2023.1 global illumination changes here.


Hi all, we’ve extended the close date of the survey to October 27th, 2023. Please provide us with feedback if you have experience using these Lighting-related features delivered in the 2023.2 beta release.

Is APV per vertex a per scene setting, or is it more granular like per game object / mesh / material?

Hi, I have a problem with APV. When I bake a small or larger scene with denser probe spacing (0.25, 0.4), the bake gets stuck in a loop. Why?

Second question. When I use SSGI with ray tracing and APV, the scene looks off. With ray marching it looks better, but I get more leaks in the scene. Is it a bug or by design?

The settings are renderer level right now, so not per object or material. I asked about something similar to this before. It may happen some time in the future but unsure.
That said, it looks like the per-vertex/L1/L2 stuff is all done in the shader passes, so perhaps it is possible with some custom shader.


SSGI uses the probe volume as a fallback; RTGI will overwrite probe volume data.

Is there a way to get the new interactive gi preview outside of the other debug views? It is great to see changes interactively in the baked lightmap view but it would be even better if we could view it in the default lit mode as well.


Hi there, I must confess, I don’t bake lightmaps myself, however, I tested the systems over the years and I would like to share some of my thoughts on it:

  1. Lightmapping settings. I would split the lighting panel Scene category into multiple baking categories:
  • Environment Probes Baking - bake the global ambient and light probe (I guess they are always available and used as fallback)
  • Reflection Probes Baking - rebake all reflection probes in the scene
  • Lightmap Baking - Keep these settings separate from the APV settings because most of them don’t apply to the APVs actually
  • Realtime Baking - shouldn’t this be deprecated by now?
  • APV / Legacy Probes Baking - Add new settings specific to probes (I’m not even sure what those are from the lightmapping settings)

All of the above should have buttons to bake them separately because there is no reason I cannot bake APVs separately to use them for specific objects only.

  2. Mesh Renderer settings. I would add more options here:
  • Contribute GI - allow separate control for what the objects contribute to

  • Contribute to Lightmaps GI

  • Contribute to Realtime GI

  • Contribute to Light Probes GI

  • Allow objects to receive lightmaps or probes regardless of how they contribute to GI baking or even none. This way we can bake a dummy/custom scene to APV and small objects can sample those APVs (even if not accurate)


  • Expose the actual Lightmap and coords offset. This would allow adding objects to prefabs or replacing the lightmap (not sure if possible now)
  3. Lightmap Baking Sets - Add gameobjects to a baking set and bake them to that lightmap only, automatically adjusting the scale and offset to fit them in that lightmap, or specify the number of lightmaps you want for that specific baking set and the LM resolution. Allow baking these sets separately!

As an example, I want my terrain in the Terrain Baking Set at 2k res, and I want all my buildings in 4 x 2k lightmaps in the Buildings Baking Set.

  4. Adaptive Probe Volumes - Where is the marketing team now? I saw more than 10 posts about Cross Fade being added to URP (in 2023, the year) meanwhile there are 2-3 videos on youtube regarding APVs!
    Anyway, I posted my feedback in the APV forum, so I will just copy-paste it here:

APV should be decoupled from the scene, and just be a regular component holding the light probes which are saved to disk, similar to reflection probes. They could be moved around, scaled, added to prefabs, etc. The light scenario could be on the volume itself, so you can blend between different lighting locally. So basically, make it modular and scalable.

All of the above could work as components in the scene, the global reflection and light probe are regular reflection probes and light probe/apvs in the scene, lightmap baking sets are components holding the objects needed for baking, APVs are components the objects sample from if they are in the volume…

That’s all from me, maybe the team will find something useful in this feedback :slight_smile:


This will make probe volume much more expensive.

You already can move the probe volume in runtime: Feature Request - Prefab Level Generation - Light Probe Support - Unity Forum


I’m not sure what would make it more expensive, since the runtime structure would still be scene wide, it’s just the method for storing the data on the disk that changes. As for the light probe positions api, that only affects legacy lightprobes, and has nothing to do with APV.

That’s the limitation of volume-based probes: all probes must be placed on a regular grid.

Hi all, looping back here, your feedback about the changes we delivered in this beta release is much appreciated! We’ll now close this survey.

Thanks again!

Hi, I see the Lightmap and LODs page has been removed from the manual in this version. I have a question specifically about LODs.

It’s been asked a few times ( https://discussions.unity.com/t/767992/4 , https://discussions.unity.com/t/807192 ) but having different LOD levels share the same lightmap space would be hugely beneficial to many games. We can probably hack something together for this, but an official solution would be great.

Is there documentation for APV on URP available yet? From what I saw it’s only for HDRP. I have an issue where APV is not baking, so I want to check the docs to see if I’m doing something wrong…but I can’t find it in the URP docs (be it URP 16 or 17)

My advice is to not bother with APV on URP yet. You won’t be able to bake it since there are so many points of failure. Once it supports baking on a per-cell basis, this may change.

Honestly, this is exactly why Unity needs to make a complete game. These types of issues would be very apparent in such a case, whereas their small test scenes will never catch this. But I guess they would rather release a cool feature even if it is useless and collects dust while we wait for basic functionality.

Can’t say I’m all that surprised. This is the same company that doesn’t understand why the number of textures needing to be sampled would matter on mobile devices (their “new” sample scene devs are clueless about full game mobile optimization. To the point my jaw literally dropped. smh).