I have been working on a project where I would like to have as many lights as possible. Since Unity's default Point Lights seem to be really expensive, I decided to look into creating custom lights.
Fortunately I found this implementation on the Unity Blog, which seemed to fit exactly what I wanted, although it's from 7 years ago.
I tested it with a ton of lights and the performance was great, so I decided to adapt it to my project. It uses two command buffers: one to render the actual light and another for the flare.
Walls with decals in the border
The only issue I found was that it didn't affect the decals I have in my project, so I decided to change when the lighting buffer is applied to CameraEvent.AfterEverything. But then I ran into a weird issue that I can't really understand: when the lights have a high intensity, the decals somehow fade away.
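For reference, this is roughly my setup (a minimal sketch, not the blog's exact code; the mesh and materials are placeholders for my own assets):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class CustomLightBuffers : MonoBehaviour
{
    public Mesh lightMesh;          // sphere mesh drawn per light (placeholder)
    public Material lightMaterial;  // additive custom-light shader (placeholder)
    public Material flareMaterial;  // flare shader (placeholder)

    CommandBuffer lightBuffer, flareBuffer;

    void OnEnable()
    {
        var cam = GetComponent<Camera>();

        lightBuffer = new CommandBuffer { name = "Custom Lights" };
        lightBuffer.DrawMesh(lightMesh, Matrix4x4.identity, lightMaterial);
        // This is the event I changed; the blog sample hooks in earlier,
        // during the deferred lighting stage.
        cam.AddCommandBuffer(CameraEvent.AfterEverything, lightBuffer);

        flareBuffer = new CommandBuffer { name = "Light Flares" };
        flareBuffer.DrawMesh(lightMesh, Matrix4x4.identity, flareMaterial);
        cam.AddCommandBuffer(CameraEvent.AfterEverything, flareBuffer);
    }

    void OnDisable()
    {
        var cam = GetComponent<Camera>();
        cam.RemoveCommandBuffer(CameraEvent.AfterEverything, lightBuffer);
        cam.RemoveCommandBuffer(CameraEvent.AfterEverything, flareBuffer);
    }
}
```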
Wall with light on top
As you can see, the light value doesn't get added to the decal; instead it gets added to the floor and makes the decal fade away. Why does this happen? Is it because the custom light shader reads from the depth texture and is somehow unable to render the decal? I don't really get it, because if it's rendered AfterEverything, shouldn't it add the light color to the color that already has the decal added? My decal shader is a surface shader with decal:blend, so I guess that shouldn't have anything to do with it.
Frame Debugger
If any of you have any idea how to fix this I would greatly appreciate it, since I have not found much info regarding this matter.
Using a bunch of Unity point lights was expensive even when you forced the Deferred rendering path to be used?
By default the rendering paths are doubled up: opaque objects are rendered Deferred with a single screen-space lighting pass, while semitransparent objects are rendered Forward with a pass per light.
If what you were trying to render before was in the Transparent or a later RenderQueue, then it would be understandable why the performance tanked. Decals do not need to be in the Transparent queue: they are generally meant to modify opaque objects and use the GBuffer normals/depth to determine how they should project. So forcing them to run at the end of the AlphaTest queue lets you avoid the more light-expensive Forward path.
In the case of CommandBuffers, you want them to render at CameraEvent.BeforeLighting. The decal then outputs color, normals, emission, specular, etc. to the deferred GBuffers to modify what is actually there, so that when the deferred screen-space lighting pass runs afterwards, the lighting calculation sees whatever the decal wrote into the GBuffer and adjusts how the lighting/render looks.
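A minimal sketch of that setup, loosely following the Unity blog's deferred decals sample (the Decal component and cubeMesh here are placeholders, not real Unity API; a full version would also bind GBuffer1/GBuffer2 for specular and normals):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical marker component: anything carrying this gets drawn as a decal.
public class Decal : MonoBehaviour
{
    public Material material;
}

[RequireComponent(typeof(Camera))]
public class DeferredDecalRenderer : MonoBehaviour
{
    public Mesh cubeMesh;   // unit cube, projected onto whatever geometry it intersects
    CommandBuffer buffer;

    void OnEnable()
    {
        buffer = new CommandBuffer { name = "Deferred Decals" };
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeLighting, buffer);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeLighting, buffer);
    }

    void OnPreRender()
    {
        buffer.Clear();
        // Bind the GBuffer targets the decal shader should modify: albedo (GBuffer0)
        // and the emission/ambient buffer (CameraTarget), with the camera depth bound
        // so the decals are depth-tested against the scene.
        RenderTargetIdentifier[] mrt = {
            BuiltinRenderTextureType.GBuffer0,
            BuiltinRenderTextureType.CameraTarget
        };
        buffer.SetRenderTarget(mrt, BuiltinRenderTextureType.CameraTarget);
        foreach (var decal in FindObjectsOfType<Decal>())   // fine for a sketch; cache in practice
            buffer.DrawMesh(cubeMesh, decal.transform.localToWorldMatrix, decal.material);
    }
}
```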
Yeah, using Unity’s Point Lights was terrible for my performance. After reading your post I checked my decal shaders and changed all of them to work as you said, by modifying the GBuffer. And I gotta say it really worked: the performance is better and the custom lights work on the decals. I decided to stick with the custom lights because they still have much better performance than the default Point Lights. I will keep testing to see if anything is wrong, but it seems like it's working.
The main issue I had is that I thought using Unity’s decal:blend with a surface shader in the Transparent queue was better than using a shader that writes into the GBuffer in the Geometry+1 queue. Guess I was completely wrong on that. Many thanks for your help, because I was completely lost on this.
Glad ya got it working! Looks good!
You may want to shift the queue up to AlphaTest+549 (2999) so you hit the end of the AlphaTest queue. Geometry and AlphaTest are both Opaque/Deferred queues; AlphaTest is primarily meant for opaque meshes that clip()/discard away parts of the mesh (which is also how you fake transparency in Deferred: a dithered clip() driven by the alpha value, further enhanced with jittering and temporal or general screen-space anti-aliasing to smooth the result).
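From script that's something like this (a sketch; decalMaterial stands in for whatever material your decal uses, and in ShaderLab the equivalent tag is Tags { "Queue" = "AlphaTest+549" }):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class DecalQueueSetup : MonoBehaviour
{
    public Material decalMaterial;  // placeholder: the decal's material

    void Awake()
    {
        // AlphaTest starts at 2450, so +549 lands on 2999, the last slot
        // before the Transparent queue (3000).
        decalMaterial.renderQueue = (int)RenderQueue.AlphaTest + 549;
    }
}
```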