There is a problem I have encountered in every version of Unity 6, including the most recent one: with RTX enabled there is excessive, unnecessary CPU usage. It is visible in play mode, and even with play mode off, the Editor's Game view shows render thread utilization far beyond what is normally expected, bottlenecking the game. This was not the case in previous versions.
The biggest problem this causes is that, because it bottlenecks the entire CPU, none of the resolution-scaling systems such as DLSS provide any performance gain, and RTX becomes completely garbage in terms of performance.
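To make the DLSS point concrete: upscalers only cut GPU-side cost, so a CPU-side bottleneck caps the frame time no matter how far the render resolution drops. A minimal back-of-the-envelope sketch (illustrative numbers, not measurements):

```python
def frame_time_ms(cpu_ms, gpu_ms):
    # With the CPU and GPU pipelined, the frame is paced by the slower of the two.
    return max(cpu_ms, gpu_ms)

# GPU-bound frame: upscaling (roughly halving GPU cost) shortens the frame.
assert frame_time_ms(8.0, 16.0) == 16.0
assert frame_time_ms(8.0, 8.0) == 8.0

# CPU-bound frame, as described above: the same upscaling changes nothing.
assert frame_time_ms(20.0, 16.0) == 20.0
assert frame_time_ms(20.0, 8.0) == 20.0
```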
This performance degradation is present both in Editor play mode and in the games we build, so we have tested outside the Editor as well.
You can also observe this problem in a simple sample scene or in any of Unity’s demo scenes.
Unity 2022 LTS does not have this problem. In Unity 6 b16 the problem persists. I don't think this is by design; if it is, we will have to remove RTX completely. It's a complete fiasco.
I’ll send the bug report, but it would be nice to see an explanation here.
RTX in general seems very undercooked and undertested.
The RT GI, like the SSGI, is very weak: in most cases it is not noticeable and cannot be tweaked properly.
The AO is not a huge upgrade over SSAO at high quality settings. You really don't get the macro AO effect that an offline ray tracer, or even Lumen, can produce.
The reflections are unnoticeable if you have a solid cubemap setup, but I suppose that makes sense.
Overall, with RT you need to build your scene around it to actually see some gains.
The same issue applies to Unity's SSGI: its contribution to the lighting pass is somehow not great and is barely noticeable in most scenes. It's not the RT itself but the general math.
The third-party SSRT implementation is flawed but gives much more noticeable AO and GI results in any scene.
Also, when we checked a year ago there were insane blunders in the code, such as not using the saved list of meshes and instead iterating over all tens of thousands of objects every frame (despite the list already being maintained), completely tanking performance even with all features disabled. I think that has since been fixed, but it just shows how undertested all of this was for a long time.
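For anyone unfamiliar with the pattern being described, the cost difference between rescanning the whole scene every frame and reading an already-maintained list is easy to illustrate. This is a toy Python sketch of the reported mistake, not Unity's actual code; all names here are made up:

```python
class Scene:
    def __init__(self, n_objects, n_rt_meshes):
        # Tens of thousands of scene objects, few of which are ray-traced meshes.
        self.objects = [{"is_rt_mesh": i < n_rt_meshes} for i in range(n_objects)]
        # The list that, per the report, was already being kept up to date.
        self.rt_meshes = [o for o in self.objects if o["is_rt_mesh"]]

def gather_naive(scene):
    # The reported blunder: scan every object in the scene, every frame,
    # even when all ray tracing features are disabled.
    return [o for o in scene.objects if o["is_rt_mesh"]]

def gather_cached(scene):
    # The fix: read the list that is already maintained incrementally.
    return scene.rt_meshes

scene = Scene(50_000, 200)
assert gather_naive(scene) == gather_cached(scene)  # same result, O(n) vs O(1) per frame
```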
Nobody seems to care about ray tracing or to want to improve it.
It exists on paper, but the key parts, AO and GI, are very underwhelming.
Maybe this could just be fixed by exposing more settings.
Yes, unfortunately, it’s exactly as we predicted.
Once again we are left with an unintegrated feature that seems to have been created by newly graduated students. We managed to get good visual results, albeit with great difficulty, but given the performance and other issues, it is completely unusable for any game.
In real-time ray tracing (not path tracing) in HDRP, the direct lighting (at every bounce) comes from the rasterized shading pipeline. The indirect lighting comes from the ray tracing effects (RTGI, RTR). This is why it's called "indirect" diffuse in the volume override, and it is also the reason why RTGI and RTR use only BRDF importance sampling, not NEE (Next Event Estimation) against the light list.
It is actually a really nice and efficient "hybrid" real-time ray tracing architecture, once you know how it works and how to set it up correctly. If the result is underwhelming, then either you have a lot of direct lighting, or the denoising on the indirect lighting is too aggressive.
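To illustrate what "BRDF importance sampling without NEE" means in practice, here is a toy Python estimator for the indirect diffuse term of a Lambertian surface. This is my own sketch, not HDRP code; all names are invented:

```python
import math
import random

def cosine_weighted_sample(u1, u2):
    # BRDF importance sampling for a Lambertian surface: pdf = cos(theta) / pi,
    # so rays bunch up around the surface normal (+z in local space).
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u1)))

def indirect_diffuse(incoming_radiance, n_samples=4096, seed=1):
    # With pdf = cos/pi, the cosine term and the pdf cancel, and the estimator
    # reduces to averaging the radiance along the sampled rays. No light list
    # is consulted (no NEE): a ray either hits something bright or it doesn't,
    # which is why small bright emitters converge slowly under this scheme.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        d = cosine_weighted_sample(rng.random(), rng.random())
        total += incoming_radiance(d)
    return total / n_samples

# Uniform sky of radiance 1 over the hemisphere: the estimate is exactly 1.
est = indirect_diffuse(lambda d: 1.0)
```

Because rays are distributed by the BRDF rather than aimed at lights, small bright emitters are hit rarely, which is why this architecture takes direct lighting from the raster pass and leans so heavily on denoising for the rest.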
The RTAO is as good as it gets; it cannot get any better. I have written multiple AO shaders in the past, and HDRP's RTAO is simply "how RTAO is supposed to work". However, I agree that the denoisers are at least outdated by now, and this is also why there are no "macro effects", as you call them: the denoiser blurs them away.
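For reference, the core of an RTAO pass really is this small. A toy Python version (my sketch, not HDRP code; `occluded` stands in for a BVH ray cast limited to the AO radius):

```python
import math
import random

def cosine_sample(u1, u2):
    # Cosine-weighted hemisphere direction about the surface normal (+z).
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u1)))

def rtao(occluded, n_rays=20_000, seed=7):
    # Ray-traced AO: shoot cosine-weighted rays from the shading point and
    # average visibility; 1.0 means fully open, 0.0 means fully occluded.
    rng = random.Random(seed)
    visible = sum(0.0 if occluded(cosine_sample(rng.random(), rng.random())) else 1.0
                  for _ in range(n_rays))
    return visible / n_rays

# Shading point at the base of an infinite wall filling the half-space x > 0:
# half the hemisphere is blocked, so visibility converges to about 0.5.
ao = rtao(lambda d: d[0] > 0.0)
```

Everything beyond this, the ray length, the sample count, and how much of the noisy estimate survives, is controlled by the settings and the denoiser, which is where the "macro" look is won or lost.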
As for the "blunders" you mentioned, these were fixed in September 2021 with the introduction of the CullInstances API.
If you think HDRP’s ray tracing features were “created by some newly graduated students” then you are very, very wrong.
All right then, back to the topic at hand. As for the people who integrated the RTX module: looking at the last five years of engine history, they have never gotten beyond scenes with cubes and 50-100 prefabs.
Judging by this critique, I think my accusation has a grain of truth to it after all.
Let’s be more specific:
A game engine is the tool used to develop a game. If you make the effort to integrate software developed by another company (I won't get into funny situations like the crashes the new FSR2 causes in the engine and in games), you shouldn't leave it unfinished. RTX is a system that is not ready to be used in any game; it has been left alone with its problems.
In that case, if some say we should develop it ourselves (which we already do), then we're back to the beginning of the topic. We chose Unity because it is a game engine and we found some of its approach logical, but we see that time and time again we are left stranded at every point!
As someone who has worked full time in AA and AAA games for 17 years (10 years in Unity, 4 years in UE), what I have to say on this subject could be very annoying, but I am sharing only a small part, maybe 1 in 100, of my criticisms.
If there is a problem, let’s stop covering it up. Let’s focus on the solutions.
That's not what I said. Personally, the HDRP team seems to be the most competent team at Unity; however, they lack real-use-case testing.
I am probably wrong about the AO, but the GI is very underwhelming across the board and doesn't come close to even low-res software voxel Lumen, any ancient offline renderer, or even third-party screen-space SSRT solutions.
We have many different environments with very heavy color usage, all across the spectrum.
The non-ray-traced SSGI produces no noticeable global illumination except in edge cases, although for some reason it does produce fairly solid AO in varied areas. In the end, SSGI costs a massive amount of resources while being barely noticeable on 90% of surface areas, and the same holds true for the RT version. I am sure it is coded correctly, but there is no exposure parameter or similar exposed to give some artistic control over the GI, or else it assumes some very specific combination of lighting and exposure.
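The kind of artistic control being asked for could be as simple as a scalar applied to the indirect term at composite time. A hypothetical sketch of that idea (`indirect_intensity` is an assumed parameter for illustration, not an existing HDRP setting):

```python
def composite_lighting(direct, indirect, indirect_intensity=1.0):
    # Hypothetical artist control: scale only the indirect (GI) contribution
    # before adding it to the direct lighting, per RGB channel.
    return tuple(d + indirect_intensity * i for d, i in zip(direct, indirect))

# Doubling the GI contribution without touching the direct light.
boosted = composite_lighting((0.8, 0.7, 0.6), (0.1, 0.1, 0.2), indirect_intensity=2.0)
```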
It doesn't work well at all in any environment I've tried. Some are bright purple, red, or blue industrial scenes that make heavy use of color across the board, and SSRT gives instantly noticeable results in all of them.
If DX12 is finally usable, I'll do some more tests on the RT version and post some comparisons.
In Unity 6 b15, DX12 now delivers almost the same performance as DX11. We didn't have much trouble with this; we can see that good work is being done in this regard.
Everything gets messed up when RTX is on.
No custom solutions or shaders were used in the project; this is the case with the default engine content.