Why is URP still so slow?

From my testing, URP is still really slow, even in some basic scenes. I was excited to see the work done on the Batch Render Group renderer, and performance is looking a lot better there (sincerely, good job to the team working on that), but even with that, there seem to be some really obvious places for improvement.

So that’s all well and good, but why has it taken so long? Even assuming those changes are enough for most projects, that still puts it at 2022 LTS at the earliest for anything production-ready.

What happened to “performance by default”?

It’s concerning that it has taken so long, because I think it shows a lack of strong management. Every gamedev/gamer knows that performance is even more important than features. No one wants to play a game, no matter how good it is, if it runs poorly.

So I am just asking: why has URP taken so long to become performant (and it still needs work on top of that), and how do you show the community that this was just negligence rather than a lack of concern for performance, given that’s how it looks?

There are gamedevs like me on the forums who are seriously weighing the option of moving away from Unity to some other solution for their next project because of feature or performance concerns.

How is Unity going to improve in this area, specifically for URP performance?

To recap, my questions are:

Why has it taken so long for performance to improve for URP?
How is Unity going to show that they are serious about performance for URP now and in the future?

Hi Joshua,
this postmortem covers some of the history behind our choices.

This is still very much a priority. Rewriting the whole render pipeline from scratch has been a huge multi-year project. The work has been prioritized based on feedback from users, where some needed a feature more than an optimization of another feature. The goal for the SRPs is to be the most scalable render pipeline and to offer the best performance on all platforms. A lot of foundational work has happened, like the URP RenderGraph adoption, which sets us up for a much higher velocity in adding features and performance improvements in the near future.

Can you share a little bit more detail about this? What’s your device/platform, scene characteristics and runtime data?
If this is also a case where you experience URP performing worse than the built-in RP in a particular scene or on a particular platform, it would also be helpful to have a bug report. We consider any case where URP is worse than the built-in RP to be a regression.

A little bit more on the plans. We are focusing our efforts on delivering URP as the default and overtaking built-in in every way, including performance. We have internal performance tests that show URP running faster than built-in on the platforms and scenes/demos we tested. We will share more about this, but first we want to test on a wider range of projects and platforms so we can confidently give you the full picture of how URP compares to the Built-in RP. This takes time because the testing surface is huge (25 platforms, multiple projects); even the Android platform alone is intensely fragmented across different types of hardware and processing power. That fragmentation of content and platforms was one of the very reasons we decided to move from the rigid, black-box Built-in RP renderer to an open-source SRP that allows for great customization and flexibility for advanced users.

I also want to acknowledge that we are aware of some gaps in URP’s performance compared to built-in. For example, the Lit shader scalability settings are missing a low-quality tier in URP compared to the built-in RP. This is work in our backlog that we aim to finish before URP is enabled as the default rendering solution.

Has Unity officially tested with a real production game project on multiple platforms, especially low-end Android mobile?

It depends. When it comes to performance testing, we have two goals.
Low-end Android mobile is currently a gap in our test automation that we are working to fix.

1) Validate Built-in vs URP performance
In this case we don’t have a real representation of a game, mostly because it’s costly to produce a vertical slice of a game that runs on both URP and the Built-in RP.

We have two demo projects so far running on both the Built-in RP and URP; with these projects we test real-time lighting, shadows, and post-processing on both pipelines.

One of these projects was made in the built-in RP, and its URP version is a straight conversion from Built-in RP to URP using our Render Pipeline Converter. The goal is to test how performance compares if someone just updates the pipeline with our tooling.

The second one is Viking Village, where the settings have been configured for each pipeline. The idea is that URP exposes new optimisation settings and features not available in built-in, so we tweak the project to take advantage of those. The goal is to test how much faster URP is than the Built-in RP when the user knows how to configure URP optimally for each platform.

2) Validate URP regressions compared to an older version of URP.

In this situation we have a mix of small demos and closer representations of a real game with Gigaya and Boat Attack. There is automation for some of the small demos/scenes, but the automation is not running on newer versions of Unity for Gigaya and Boat Attack, so we still rely on manual testing and lack wider platform coverage.

We are putting a lot of effort into increasing the test coverage and covering the gaps here.

Why is this the case? Built-in has basically been abandoned for years, so the only changes should be keeping it up to date with URP.

Or are you saying that keeping a real game on the latest URP currently takes too much time to be feasible?

It’s not about the cost of maintenance; it’s about the cost of producing the vertical slice for both pipelines in the first place. We have Gigaya and Boat Attack and we maintain those demos; however, it’s not trivial to convert them to the Built-in RP.

This is why we have the smaller graphics demos to test URP vs Built-in, but, to the user’s question, they are not representative of real games.

So would you agree that projects like Gigaya have been valuable in helping validate performance?

By extension, would you agree that Unity is missing a representation of real games to validate performance against?

I want to stress that this next question is directed towards upper management.

If both your users and developers within Unity agree (not saying Phil does agree yet, I am just assuming/inferring this) that a larger game project is something Unity is missing to validate performance, why was Gigaya dropped and why wasn’t a games company bought to plug this gap?

To give context to my frustration here, Unity management felt a $1.65 billion investment in Weta was justified as a way to break into films.

However, Unity is supposed to be a games engine first, yet I don’t see even a $1 million investment into, say, a small indie games company.

If, instead of buying Weta, Unity had bought a company similar to, say, CD Projekt Red, then I would have no doubt that Unity was serious about games and was putting its money where its mouth was.

Yes, certainly. Our entire group was also disappointed about what happened to Gigaya. The reasoning behind that is out of scope for the SRP blitz day. Upper management has to make tough decisions that we don’t have full insight into.
