I made a test scene with 1 character and 1 directional light. Quality is medium and I disabled the Script Debugger in the editor. I used Unity 2023.2.18. I played maximized in the editor and ran the build as full screen. I have a 2080 Ti, an i9 9900K, 64 GB RAM, and 2 x 144Hz 2560x1440 monitors. After this test, URP looks the best to me. Does this test result look normal? I think HDRP is for really high-end devices.
For starters, ‘high-end device’ is a really broad concept these days. Some laptops you can buy for under a thousand US dollars would probably fall into that category. What are you targeting as your lowest-end machine? If you are going by your own specs then you are absolutely in the high-end spectrum. On the other hand, if you are targeting, say, ten-year-old desktops, mobile devices, VR headsets, or I dunno, embedded software in a vending machine, then that’s a totally different story.
But in either case your metric here is completely off. A single model in an empty scene with a single light is not going to tax any hardware in any meaningful way. Further, you absolutely should never use editor performance as anything other than a very rough relative metric (i.e. to see whether large changes are making improvements or not, but not for actual frametimes). Always make a build and run it there to see how it will behave in a real scenario. The difference is absolutely night and day… once you start to push limits, though. Like I said before, your current scene is not going to make even an old machine break a sweat.
As for which pipeline to use: I suggest listing out what your requirements and desires are for your project - both in terms of what you want in the final game AND what you want as part of your own production pipeline. For example, my current project absolutely requires many, many unique materials, so I cannot use the Built-in pipeline. If you want solid features that are well documented and battle-tested, and you want a lot of older tutorials and assets to be compatible, then Built-in is the way to go. URP is a great choice if you are aware of the benefits and costs it has associated with it. It can drastically improve performance in some specific situations, but you’ll also find that something like 90% of the shaders and materials on the internet will not work with it, so you’d better be comfortable with writing your own or using Shader Graph. HDRP has some slick features that aren’t found in the other two, but it also seems to be the most unstable, so unless you are very comfortable with this stuff I’d steer clear for now.
This is kinda like testing a car’s performance by putting it in neutral and pushing it down the driveway. Give the engine some actual work to do!
Not to mention that using FPS to compare relative performance is utterly useless: the extra workload needed to drop from 1000 fps to 500 fps is extremely small (1 millisecond/frame), while the extra workload needed to go from 50 to 30 fps is a lot higher (over 13 milliseconds/frame). FPS doesn’t scale linearly with workload, so you really aren’t measuring anything meaningful with your test.
When comparing performance, always use ms/frame as your metric. And use a workload closer to a real use case; a single character with a single light will barely make the gauge move.
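To see why fps is a misleading metric, it helps to convert to frametime. A minimal sketch (plain arithmetic, nothing Unity-specific) showing that equal-looking fps drops hide very different amounts of per-frame work:

```python
# fps is the inverse of frame time, so equal fps gaps hide very
# different amounts of work. Converting to ms/frame makes the
# real cost visible.

def frametime_ms(fps: float) -> float:
    """Milliseconds spent on one frame at the given fps."""
    return 1000.0 / fps

# Dropping from 1000 fps to 500 fps adds only 1 ms of work per frame:
high_end_drop = frametime_ms(500) - frametime_ms(1000)

# Dropping from 50 fps to 30 fps adds over 13 ms of work per frame:
low_end_drop = frametime_ms(30) - frametime_ms(50)

print(f"1000 -> 500 fps: +{high_end_drop:.1f} ms/frame")  # +1.0 ms/frame
print(f"  50 ->  30 fps: +{low_end_drop:.1f} ms/frame")   # +13.3 ms/frame
```

So a "500 fps difference" at the top end represents less added work than a "20 fps difference" at the bottom end - which is exactly why ms/frame is the metric to compare.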
It is a simple test for a simple case. That’s what I need. I don’t need to test every detail with a complex test. It is useful for me to see that even simple projects cost too much when you use HDRP. I just need help with the results, in case I did something wrong with HDRP. As I see it, for simple cases URP is 4 times better than HDRP.
If your game consists of one character in an empty scene, then it’s fine. But in that case, why would you want to do any tests to determine which pipeline is fastest? You can use any pipeline!
Once you add some more stuff (not a lot more, just a few more things) and you get to a typical range of 60-120 fps, the differences between pipelines will be a lot subtler.
The flaw here is that you aren’t actually testing anything performance-wise. Things generally don’t scale linearly in terms of cost, so you can’t make any kind of prediction by extrapolating from those results. There can be a lot of overhead with some systems (for example, deferred rendering paths or the entire HDRP pipeline) that have an initial cost up front but provide other features almost completely for free. Sometimes you also find that something cheap in simple cases hits a huge problem when it scales exponentially. You need to start hitting limits before you know what they even are.
You should try to determine what kinds of limits you expect in terms of characters, scenery, lighting, etc and then try some simple tests to see how those limits react when reached. For example, if you want terrain that can be seen from two kilometers away and has a few thousand trees, buildings, and rocks, and has some animated NPCs all with real time shadows and lights then put together a quick scene and see how it ends up. More than likely you’ll need to optimize some stuff to get decent results but at this point you’d have an example to show and ask what steps should be taken to improve performance.
Why are you discussing whether my test can be used to compare rendering pipelines for all games/platforms, or whether fps is the single criterion? Please just read my first post. I just did a simple test and got some results. I found HDRP fps is too low. I just asked if this is normal. It is a simple question for a simple case. So far no one has said whether this is normal or not. Again: I am not discussing or asking about perfect performance testing for rendering pipelines.
It’s normal. HDRP is way more complex than either BiRP or URP so its base cost is slightly higher.
However, we’re talking about a difference of 1.5 ms/frame between URP and HDRP according to your test (2000 fps = 0.5 ms/frame, 500 fps = 2 ms/frame), which is nothing to base a decision on imho. To put things into perspective, 1.5 ms is the difference between 69 fps (14.5 ms/frame) and 62.5 fps (16 ms/frame).
So basically you’re deciding on URP because, assuming your results can be extrapolated to a typical scenario, it may be about 6 fps faster than HDRP.
HDRP has a high base cost indeed, and it’s also easy to tank the framerate with excess quality settings, effects, ray tracing, etc., so only use it when you want maximum realism.
URP has 3 rendering modes - Forward, Forward+, and Deferred - so choose the fastest one for your lighting and effects needs.
Built-in may be phased out eventually…
I would choose URP in all cases unless you are desperate for maximum real-life quality, but even with URP you have 3 renderers to choose from, plus LDR vs HDR, so it isn’t like Unity makes it easy!
Yes, in the same sense that profiling an empty method can show you the overhead of the call itself. HDRP is a very frontloaded render pipeline compared to BiRP and URP but that comes with the advantage that it scales better with scene complexity. It’s easy to miss if you don’t have a complex test project.
HDRP’s default settings (high, medium, and low) aren’t optimal. I remember the environment artist on one of my work projects went into the settings and tweaked them, recovering a fair bit of performance.
Kinda. It’s more accurate to say that it’s for targeting high-end graphics running at low frame rates. For the above project we targeted 4K@60Hz because we were shipping on consoles, but we ended up dropping down to 1600p to maintain the 60Hz because it was critical to the gameplay.