UPGEN Lighting is a performance-optimized framework that aims to provide a fully dynamic approximation of Global Illumination for your games (no precomputation required). Initially designed for VR games, this solution offers high execution speed, produces great visual results, and uses minimal system resources. Indirect lighting can be emulated fully automatically, semi-automatically, or by designing lights manually to save extra performance.
Under the hood, the solution is based on an ultra-fast type of point light source (about 10 times faster than a standard one) and uses thousands of them to approximate light propagation across the scene in real time. To achieve this, we made a special screen-space effect that, each frame, gathers the few hundred most valuable lights and applies their combined lighting inside the G-Buffer for the current camera.
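To make the per-frame selection idea concrete, here is a minimal C# sketch of one way a "most valuable lights" pass could be scored. The FastLight struct and the intensity-over-distance-squared weight are illustrative assumptions only, not the asset's actual implementation:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Hypothetical container for an ultra-fast point light (not the asset's real API).
public struct FastLight
{
    public Vector3 position;
    public Color color;
    public float intensity;
    public float range;
}

public static class FastLightCulling
{
    // Picks the N lights that contribute most to the current camera view,
    // using a simple intensity / squared-distance weight as an example heuristic.
    public static List<FastLight> SelectMostValuable(
        IEnumerable<FastLight> allLights, Camera camera, int maxCount)
    {
        Vector3 camPos = camera.transform.position;
        return allLights
            .OrderByDescending(l =>
                l.intensity / Mathf.Max(0.01f, (l.position - camPos).sqrMagnitude))
            .Take(maxCount)
            .ToList();
    }
}
```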
This framework provides several workflows for building lighting in your scenes. You can use these Fast Lights to automatically transfer bounced light from standard light sources using ray tracing, or you can combine standard light sources with manually placed Fast Lights to approximate GI in complex scenes and achieve the best balance of performance and visual quality.
You can now try the Windows standalone demo (see the link in the description). The package is currently awaiting approval, so hopefully it will be available on the Asset Store soon.
All features, including GI, work fully at runtime.
All parts of the framework, including the ray-tracing cache, can be recompiled while playing in the editor, without stopping Play mode and without crashes.
Yes, it does not use any lightmapping-style approach where lighting information is stored as part of the model. Everything it does is fully dynamic, at runtime or design time, inside its own internal data structures.
The only HDRP feature it uses is deferred rendering (since the lighting is built inside the G-Buffer).
I am planning to add support for the Standard pipeline with deferred rendering soon, and probably a limited version for URP (with forward rendering).
The package is now approved and available on the Asset Store!
The version for the deferred render path of the Standard Render Pipeline is in testing.
The URP version is in development and will come a bit later.
Huh, this looks pretty cool. You wrote that for large outdoor scenes with a directional light, the semi-automatic workflow should be used. I didn't really see that described in the manual. How does that work?
Currently, ray-traced GI with a directional light only works in a small area around it, e.g. in front of a cave entrance. Once you see how the lighting looks in a particular area, you can remove the ray-traced version and place Fast Lights manually to approximate it. Alternatively, you can use a script to make the directional light follow your camera smoothly along the XZ plane. Both solutions are of course far from optimal, and hopefully a fully automatic way will be implemented in the future.
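For the second option, a minimal Unity script along these lines can do it. The smoothing value and component setup are just assumptions for illustration, not part of the asset:

```csharp
using UnityEngine;

// Moves this object's (e.g. the directional light's) position smoothly after the
// camera on the XZ plane, so the area covered by ray-traced GI stays around the player.
public class FollowCameraXZ : MonoBehaviour
{
    public Transform target;       // usually the main camera
    public float smoothing = 5f;   // higher = snappier follow

    void LateUpdate()
    {
        if (target == null) return;
        Vector3 goal = new Vector3(target.position.x, transform.position.y, target.position.z);
        transform.position = Vector3.Lerp(transform.position, goal, smoothing * Time.deltaTime);
    }
}
```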
I am a sucker for GI, and given Unity's current GI state, I'm super interested. However, it is very hard to get a good idea of what this is. It is great that you included a standalone demo, but it is nearly impossible to actually see what is going on, as the environment is very static and too simple. Good enough to prove something is working, but not enough to make me decide. The information provided is just too little and walled off.
I have a few questions:
When you say "ray-traced", does it work on pre-Pascal cards? Is it accelerated by RTX cards?
Based on the post above, I am assuming this is more or less for approximating GI for a small area, correct?
How would this work for a procedural level? I couldn’t find detailed information about it.
How would it be implemented in an open world first/third person level? Is it even feasible to begin with?
Sorry for the questions, but I am stoked… so please, help me.
No need for RTX; it can run on any card. It was tested on almost all NVIDIA and AMD cards from the last two generations, and in theory it should also work on mobile and consoles. The ray tracing is based on physics colliders, not on the actual meshes, which would be too heavy to trace in real time.
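As an illustration of what collider-based tracing means (this is not the asset's code), bounce points can be found with Unity's standard physics raycasts against colliders instead of the render meshes:

```csharp
using UnityEngine;

// Illustrative only: casts a ray from a spotlight along its forward direction and,
// where it hits a physics collider, returns a position/color where an indirect
// "bounce" light could be placed. The asset's real ray tracing is its own system.
public static class BounceProbe
{
    public static bool TryGetBounce(Light source, out Vector3 bouncePos, out Color bounceColor)
    {
        bouncePos = Vector3.zero;
        bounceColor = Color.black;

        RaycastHit hit;
        if (Physics.Raycast(source.transform.position, source.transform.forward,
                            out hit, source.range))
        {
            // Offset slightly off the surface so the bounce light is not inside geometry.
            bouncePos = hit.point + hit.normal * 0.1f;
            bounceColor = source.color * source.intensity;
            return true;
        }
        return false;
    }
}
```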
Yes, this is not real GI; it is a very rough, approximate imitation (but also very fast and fully runtime). For sunlight it works in a small area around the light; point and spot lights, by their nature, were made to cover small areas anyway. But any huge world is just a set of places where you can set up lighting as needed using these three main types of light sources (there are no limits on world scale or number of lights). The idea is that this is not just a GI library: the asset is a framework that should take care of all key aspects of lighting in your games, from Doom-like corridor levels to huge terrains with caves, or even space-scale scenes.
It should be a perfect fit for procedural levels. It just does what you expect (as long as you provide colliders for your geometry). Over the last two years we have made more than 10 VR games with this technology, including ones with procedural worlds and complex terrains.
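Since the tracing relies on colliders, procedurally generated geometry just needs a collider attached after the mesh is built. A typical Unity snippet (generic, not asset-specific) looks like this:

```csharp
using UnityEngine;

// After building a procedural mesh, attach a MeshCollider so collider-based
// ray tracing (and physics in general) can see the new geometry.
public static class ProceduralColliderSetup
{
    public static void AddCollider(GameObject go, Mesh generatedMesh)
    {
        var meshCollider = go.GetComponent<MeshCollider>();
        if (meshCollider == null)
            meshCollider = go.AddComponent<MeshCollider>();
        meshCollider.sharedMesh = generatedMesh;
    }
}
```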
Yes, there is no problem; you can do whatever you want, and it should work as expected in almost any conditions.
Sure. Thanks for these questions; I will improve the documentation based on them later. So feel free to ask as many questions as you have =)
Hi
"Only HDRP and Standard pipelines are supported (does not work with the URP pipeline)", see the first post.
I get some errors if I import it into a Standard RP project, saying that HDRP is missing.
Is only HDRP currently supported?