Unified Rendering

Hi,

Over the past few years, while improving each render pipeline for its specific usage, we have been unifying more and more of the pipelines’ shared systems (e.g. the volume system, render graph, rendering layers, rendering settings, …) and bringing the pipelines closer to feature parity for Shader Graph (custom renderer features, custom passes, post-processing) and VFX Graph (smoke lighting, decals, depth buffer and color sampling, …). On top of this, we worked on improving the coexistence of render pipelines, which highlighted the limitations and complexity of such workflows.

With the opportunity of building the next generation after Unity 6, it is time to simplify our rendering offering, while at the same time, building on the features and benefits that were added to the scriptable render pipelines.

First, we will remove the Built-in Render Pipeline to simplify our offering. Most productions shipped in the past year have used URP or HDRP, and many productions have already successfully upgraded from Built-in, gaining some combination of better production workflows, scalability, CPU/GPU/memory optimizations, and improved visual fidelity (to learn more, check the “Best practices for migrating from the Built-in Render Pipeline to URP” webinar). Most users and studios we interviewed said they got a lot of benefits from the upgrade and love the SRPs.

We know that upgrading a project can be time consuming or risky and should not be rushed. This is why the Built-in Render Pipeline will be maintained throughout the Unity 6 generation, giving you multiple years to plan your transition. If you are starting a new project, we highly recommend using URP or HDRP.

Second, we are unifying the scriptable render pipelines. But what does that mean?

We are combining the best of what both render pipelines have to offer.

  • Shared rendering data (lights, cameras, probes, sky, …) so you can author once and deploy anywhere. The data model will be based on industry standards and on advanced parametrization similar to what HDRP offers. This will let you author lighting using physical light units, making it easier to realistically light a mix of indoor and outdoor environments. You will still be able to author light intensities in the same way as in URP or the Built-in RP.
  • Manage rendering settings, feature configuration, and device compatibility by integrating them into Build Profiles, so you can easily switch rendering configurations between target platforms. Offer a standard Lit shader based on OpenPBR, with multiple scalability tiers to balance visual quality and performance.
  • Offer multiple pipeline configurations, all running on the Render Graph architecture. We will provide specialized renderers to get the best out of different hardware architectures (e.g. tiled mobile GPUs vs. PC/console GPUs), while offering a unified customization and extensibility API.
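
To make “physical light units” concrete, here is a small sketch of the standard photometric relationships behind units like lumen, candela, lux, EV and nits. These are textbook formulas, not Unity’s confirmed implementation, and the function names are purely illustrative:

```python
import math

def lumen_to_candela_point(flux_lm: float) -> float:
    """Point-light luminous flux (lumen) to intensity (candela),
    assuming uniform emission over the full sphere (4*pi steradians)."""
    return flux_lm / (4.0 * math.pi)

def illuminance_at(intensity_cd: float, distance_m: float) -> float:
    """Illuminance (lux) at a given distance, via the inverse-square law."""
    return intensity_cd / (distance_m ** 2)

def ev100_to_luminance(ev100: float) -> float:
    """EV100 to luminance in cd/m^2 (nits), using the standard
    calibration EV100 = log2(L * 100 / 12.5)."""
    return 0.125 * (2.0 ** ev100)

intensity = lumen_to_candela_point(1000.0)  # a 1000 lm point light, ~79.6 cd
print(illuminance_at(intensity, 2.0))       # ~19.9 lux at 2 m
print(ev100_to_luminance(15.0))             # 4096.0 nits, a sunny-sky ballpark
```

The inverse-square falloff is why a lux value only makes sense at a given distance, while lumen describes the light source itself; this is the kind of conversion that lets the same light be authored either as a plain intensity or in physical units.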

We are also planning to improve productivity and the user experience of our tech-artist tooling, with a new unified UX and a backend shared across our node-based tools (Shader Graph, VFX Graph, and the new animation tools).

We are also working on improving shader variant management and compilation, to reduce the build, loading, and iteration times that can cost you precious hours during production.

Finally, we know that a lot of users enjoy creating shaders with code, and we want to make it easier for them. In Unity 6 and earlier, Surface Shaders are not available with the Scriptable Render Pipelines. While you can code shaders in ShaderLab, often by reusing the shaders provided in the pipeline packages, this can be error-prone and complicate upgrades to later Unity versions. In the next generation, we want to let you code shader blocks and assemble these blocks either in Shader Graph, so other artists can add functionality via nodes, or in the equivalent of a standalone Surface Shader.
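
Nothing has been announced about what shader-block syntax will look like. Purely to illustrate the idea in the paragraph above (a reusable block that either Shader Graph or a standalone surface-shader-style wrapper could consume), a hypothetical sketch:

```hlsl
// HYPOTHETICAL syntax -- Unity has not announced what shader blocks
// will look like; this only sketches the idea, every name is invented.
Block "MySurface"
{
    // Inputs could be wired to Shader Graph nodes by an artist,
    // or filled in by a standalone surface-shader-style wrapper.
    Inputs  { float2 uv; float3 normalWS; }
    Outputs { float3 albedo; float smoothness; }

    void Evaluate()
    {
        albedo = SampleBaseMap(uv).rgb;  // hypothetical helper
        smoothness = 0.5;
    }
}
```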

To sum up, we are:

  • leveraging and reusing the systems developed in the past years
  • unifying rendering data around more modern and powerful industry standards
  • taking the best of URP and HDRP inside a single rendering and customization framework
  • unifying and improving the user experience for tech artists, coders and non-coders alike
  • improving compilation and iteration times

With that, we want to offer a simpler, more productive, powerful, and scalable graphics engine that pushes performance and visual fidelity on all platforms, while making it easier to reach more platforms and to reuse or share assets, tools, and plugins.

You can find our Unite 2024 roadmap presentation on the subject here:

We hope you like the plan, and we will share more as we move along the development process.

As always, please share your feedback. There are still multiple unknowns and we might not be able to answer all questions at this stage of development.

37 Likes

Sounds good and much needed! My biggest questions for now: Will ALL current features of HDRP continue to exist in the new unified pipeline? And how will Shadows + Light calculation be done, like in URP or like in HDRP or selectable or in a completely new way?

4 Likes

Hi
Thanks for your effort, it is really cool :slight_smile:

Do you plan to have one Unified way to write shaders?
I understand what block shaders will provide :slight_smile:
I want to know more about internals like:

  • depth buffer format => one Unified reversed-Z buffer for all
  • camera-relative rendering => for all
  • get rid of converting Y to -Y on different platforms => leave a single Unified option for all platforms
  • maybe make compute shaders a hard requirement, so every supported platform will have truly Unified shader logic
  • maybe get rid of OpenGL and OpenGL ES, so the minimal support in 2027 will be Metal, DX12, Vulkan and WebGPU
  • and other stuff in shaders that needs to be authored multiple times because of platform/render pipeline differences and/or the inability to set up the graphics device in a single Unified way on all platforms
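
(To make the camera-relative bullet concrete: world-space positions far from the origin exhaust float32 precision on the GPU, which camera-relative rendering sidesteps by keeping coordinates small. A quick illustrative Python sketch, where the helper just emulates 32-bit floats:)

```python
import struct

def f32(x: float) -> float:
    """Round-trip a Python float through 32-bit storage to emulate GPU floats."""
    return struct.unpack('f', struct.pack('f', x))[0]

camera_pos = 100_000.0            # camera 100 km from the world origin
v0, v1 = 100_000.0, 100_000.01    # two vertices 1 cm apart

# World-space in float32: at 1e5 the spacing between representable
# floats is already ~0.0078, so the 1 cm offset gets quantized away.
world_delta = f32(v1) - f32(v0)   # 0.0078125, not 0.01 -> vertex jitter

# Camera-relative: subtract the camera position in double precision
# first, so the GPU only ever sees small float32 offsets.
rel_delta = f32(v1 - camera_pos) - f32(v0 - camera_pos)  # ~0.01, correct
```
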
5 Likes

Hi!

While I understand the whole reasoning for Unified, it all revolves around the same kind of lighting models.

But is custom lighting being considered at all in the tooling? Could we eventually override the Fragment node to provide a custom lighting model?

I love the power of Shader Graph, and the Block Shaders approach sounds wonderful to me. But currently, if we want to use Shader Graph AND custom lighting, it constantly feels like hacking the tool just to inject our lighting model in some way.

It seems that on Unified, I would have to fight the default renderer even more if I want to implement custom lighting.

There is always a way, and some Asset Store tools like Amplify would probably provide ways to support this. But I’d like to know if custom lighting is on the official team’s mind (I’d love to be able to use the Block Shaders), or if we should keep relying on non-official tools.

Ty!

4 Likes

Will ALL current features of HDRP continue to exist in the new unified pipeline?

It is quite an effort to merge, so we might remove some features and reintegrate some of them later in a similar or more scalable form.
We are still evaluating the exact list, so this is a great discussion to have.
The rationale we currently have is:

  • Remove HDRP features with a low usage-to-maintenance-cost ratio (e.g. the LookDev window, Graphics Compositor, AxF shader, …)
  • Start with an MVP of HDRP features (volumetrics, advanced lights and shadows, …) and add back more usage-specific features later (e.g. water, volumetric clouds). Some we might not touch, and they will keep some minimum specs; some we might make more scalable.

And how will Shadows + Light calculation be done, like in URP or like in HDRP or selectable or in a completely new way?

We are planning to share a single shadow, camera, and lighting backend and authoring model, based mostly on the more modern and powerful HDRP parametrization (volume-based sky and environment setup, cascade shadow settings, more advanced parameters for lights/shadows/reflections/…). However, users should have the choice of authoring lights with plain values as in URP/Built-in RP (fixed exposure, luminance-based lighting) or with physical light units as in HDRP (manual or automatic exposure + Kelvin/Lumen/Lux/EV/Nits). Some users prefer one way, some the other, and they should be able to use either on a wide range of platforms.
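
As a point of reference for the exposure side of this (these are the standard photometric formulas popularized by the Frostbite PBR course notes, not necessarily what the unified pipeline will ship), both manual and automatic exposure reduce to computing an EV100 and turning it into a scale factor:

```python
import math

def ev100_from_camera(aperture_f: float, shutter_s: float, iso: float) -> float:
    """Manual exposure: EV100 from aperture (f-number), shutter time (s), ISO."""
    return math.log2((aperture_f ** 2) / shutter_s * 100.0 / iso)

def ev100_from_avg_luminance(l_avg_nits: float) -> float:
    """Automatic exposure: meter the average scene luminance instead."""
    return math.log2(l_avg_nits * 100.0 / 12.5)

def exposure_multiplier(ev100: float) -> float:
    """Scale applied to scene luminance before tonemapping; 1.2 * 2^EV100
    is a commonly used maximum-luminance calibration."""
    return 1.0 / (1.2 * (2.0 ** ev100))

ev = ev100_from_camera(16.0, 1.0 / 100.0, 100.0)  # "sunny 16" settings, ~EV 14.6
scale = exposure_multiplier(ev)                   # tiny factor for a bright scene
```

A fixed-exposure URP-style setup is just the special case where `scale` is pinned to a constant, which is why both authoring styles can share one backend.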

8 Likes

Yes, that is the general idea!

Re. platform min specs: discussing with lots of mobile studios, it might be too soon, even for this next generation of Unity, to completely remove GLES support. These phones still represent too large a share of mobile gamers. So we currently plan to keep support for at least GLES 3, but with a clearer contract and better graphics settings that automatically disable incompatible features, so that it is easier to scale down to these devices.

We would like to help the community get to a point where most people only have to care about compute-capable devices and modern graphics APIs. This is why we have been investing even more strongly in Metal, Vulkan, and DX12 over the past few releases (lots of progress in U6 already!) to drive adoption, together with our platform partners.

8 Likes

Custom lighting is at the top of our minds!
It is a core value of Unity that we eroded with the SRPs, and one we would like not just to get back, but ideally to bring to an even better place than before.
The idea of Shader Blocks is to allow custom lighting inside the master node while still using Shader Graph on top, or to use the blocks standalone, similarly to Surface Shaders.

Again, we still need to figure out many details along the way, but that is our ambition.

18 Likes

I could kiss you right now. Please let us know the moment feedback is required on this!

Also, are there any plans to output HLSL files instead of only Shaders / SubGraphs? This would mean being able to mix and match graph-based tooling with custom shader files.

(I see that my questions are more about tooling than Unified per se, so feel free to just say that this is not the proper thread.)

8 Likes

I like everything I’m hearing so far. It might be a smaller feature, but I especially appreciate the flexibility with the lighting units. Being able to choose whichever we want is great.

3 Likes

I have been thinking about unified rendering a lot lately. I have a custom fullscreen shader I use in my URP projects that I wanted to try in one of my HDRP projects recently, and I have not been able to get it working yet. The workflows are so different! I can see how, for things like lighting values, you could offer users both the URP and HDRP intensity values, as they’re more or less convertible between one another. But I have to wonder how you could preserve the simple workflow of URP’s fullscreen blit passes while also keeping the extremely robust and complex system that HDRP uses for fullscreen passes :thinking: I guess they could just end up being two separate systems.

Will camera stacking be a thing in whatever HDRP becomes in this new system?

(I realize HDRP won’t really exist anymore (right?) in the Unified Pipeline… but I hope you know what I mean)

1 Like

I assume I will be able to crank the settings up (volumetric fog with ray tracing set to insane, make-my-PC-bleed levels) as well as turn everything down to look as basic as possible (like downgrading to vertex lighting), all at runtime in an actual built player and not just in the editor, right?

4 Likes

Now wasn’t that a massive waste of time for us all. Not to mention resources.
Many of us were telling you from the very beginning.

The whole idea that you present a “more scalable programmable system” that you can’t actually make scalable, and have to split into two separate pipelines to make work, was dubious at best. Laughing in the face of logic.

(Not to mention that defending it, was the worst organized gaslighting against an entire community that loved and fully supported you more than any other organization at the time!)

I hope you base this on HDRP and scale it down through the SRP features.

A single powerful truly scalable system.

And I also hope that we won’t need to spend another SEVEN years for it to mature. That alone pushed so many people to Unreal.

Having said that, I would like you to study this hard, play with it a lot, and make sure everything fully integrates this time around. See that water works with sky and wind and clouds, that the day and night cycle works, that there is a moon, and that we can all start using it from day one.

4 Likes

I think it’s obvious that having 3 graphics pipelines was poorly thought out. 2 types of UI, 2 types of Input, 3 types of shaders, etc. It is an unpleasant experience for new users.

Admittedly, I am sad to see BiRP go, as we can do so much more with it in the VR space, and this means the tens of thousands of existing custom shaders will no longer work. There is also much less documentation on custom URP shaders, plus many specific URP issues that are still outstanding or only intermittently resolved. We have also found BiRP to outperform URP on various platforms across many Unity versions, so there is that concern. That, mixed with the entire URP team having quit not too long ago…

That said we’ve mostly left Unity because of the mandatory Industry Pricing, so I’m not sure we will be using it long enough for this change to matter.

But it will be interesting seeing all the projects it kills with custom shaders.

7 Likes

So a few years back I tried to recreate the base HDRP material, the one provided by Unity, inside Shader Graph. I got it quite close, but there were some details that I could not implement or get right, because Unity has denied access to the source of those shaders/materials.

At the moment, if you have nothing better to do with your life, you can download the entire Unreal Engine source and even compile the whole thing locally. It’s not open source, of course, but you can look at the code, and for a random task like the one I gave as an example, you have a better chance of completing it, instead of losing two weeks like I did and then hitting a big wall at the end because you lacked the information that what you were doing was impossible.

So as long as you keep parts of the engine hidden and access to the source code restricted to corporate clients, I’m afraid this whole thing you are proposing is pointless for me.

1 Like

That’s why I always suggest that Unity share more, so we can help with suggestions, evaluate decisions, and express our needs before it’s too late.

1 Like

In principle this sounds good, but I’m not sure how HDRP users would adopt this if features are missing. I guess the short list you gave would probably be okay for most users, but it would get complicated to compensate for missing features like:

  • Raytracing
  • SSS
  • Water
  • Volumetric Clouds
  • HQ Line Renderer
  • Shader like StackLit, Eye, Hair, Fabric,…
  • DLSS
  • SSGI
  • SSR

Many of them have already been converted to URP by the community, but features like ray tracing, SSS, water, and clouds are more deeply integrated, and would probably make an upgrade to the new unified pipeline impossible for most if missing. TBH, this sounds more like you are discontinuing HDRP and beefing up URP, which is fine as a strategy for many use cases, but not for those aiming at AAA rendering quality.

6 Likes

Hi. Will this unified rendering also include merging Entities Graphics, as part of the ECS-for-all effort, in the future?

Already implemented by the community too: Corgi Raytracing - Built-in RP (Forward + Deferred!) + URP (VR + SPI supported!) | VFX Shaders | Unity Asset Store :wink: