MSAA is currently not supported in Raytracing.
Are there plans to make it supported or any plans to add any other AA methods?
@SebLagarde
@chap-unity
@auzaiffe
@pierred_unity
@markvi
@unitybru
Apologies for tagging so many of you guys, but after trying to solve this issue for almost 2 months and looking at various 3rd party assets in the hopes of a fix, I’m close to throwing in the towel…
This is the early stage of a much larger cinematic film (not a game and not a realtime app) which will use HDRP and raytracing (no pathtracing), however it's impossible to clean up the AA on the GEOMETRY. The models we typically work with have a LOT of geometric detail and are not meant for realtime apps, but mainly for film work. I've tried various combinations of TAA, but it just doesn't cut it. Can we maybe get some kind of spatial super sampling like that "other engine" uses during offline output?
On the other hand, as a test, if I switch off raytracing, switch to forward rendering and enable MSAA x8, everything looks crisp and lovely, however then I lose all the raytracing benefits such as raytraced GI, raytraced reflections, raytraced shadows, etc.
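For reference, this is roughly how I'm switching the per-camera AA mode from script while testing (a minimal sketch against HDRP 10.x-ish; the exact class and field names may differ between versions):

```csharp
// Sketch only: toggling the per-camera anti-aliasing mode in HDRP.
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

public class AAModeSwitcher : MonoBehaviour
{
    void Start()
    {
        var hdData = GetComponent<HDAdditionalCameraData>();

        // TAA is the only post-AA that still applies once ray tracing is enabled.
        hdData.antialiasing = HDAdditionalCameraData.AntialiasingMode.TemporalAntialiasing;
        hdData.taaSharpenStrength = 0.5f;

        // MSAA, by contrast, is not a camera property: it needs Forward rendering
        // plus an MSAA sample count on the HDRP Asset / Frame Settings, and it is
        // ignored as soon as ray tracing is active.
    }
}
```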
Also, I'm outputting this with Unity Recorder, so I'm posting this in both threads as the info may be relevant to various teams at Unity.
I seriously doubt they will support this, considering they haven’t supported screen-space effects with MSAA either in HDRP (meaning no SSR, SSAO, contact shadows).
I'd love to have this as well (and have asked about support for it in the past for both SSR and RT reflections), but it also makes even less sense in raytracing, since the raytraced result is quite noisy (so you'd need TAA to smooth it out), and people don't usually use multiple AA solutions at once by naively stacking them.
Since you want this for offline purposes, I would just render at a higher resolution so that scaling it back down removes the jaggies nicely.
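Something like this rough sketch should do it; the 2x factor and PNG output are just placeholders, and I haven't checked how it interacts with Recorder or the TAA history:

```csharp
// Sketch only: render the camera into an oversized RenderTexture and let a
// bilinear Blit downsample it before writing the frame to disk.
using System.IO;
using UnityEngine;

public class SuperSampledCapture : MonoBehaviour
{
    public Camera renderCamera;
    public int outputWidth = 3840;
    public int outputHeight = 2160;
    public int superSampleFactor = 2;   // render 2x per axis, then downscale

    RenderTexture bigRT;

    void OnEnable()
    {
        // HDRP renders any camera that has a target texture assigned, so the
        // oversized buffer gets filled as part of the normal frame.
        bigRT = new RenderTexture(outputWidth * superSampleFactor,
                                  outputHeight * superSampleFactor, 24);
        renderCamera.targetTexture = bigRT;
    }

    void OnDisable()
    {
        renderCamera.targetTexture = null;
        bigRT.Release();
    }

    // Call this after rendering (e.g. from a WaitForEndOfFrame coroutine).
    public void WriteDownscaledFrame(string path)
    {
        var smallRT = RenderTexture.GetTemporary(outputWidth, outputHeight, 0);
        Graphics.Blit(bigRT, smallRT);             // bilinear downsample

        var prev = RenderTexture.active;
        RenderTexture.active = smallRT;
        var tex = new Texture2D(outputWidth, outputHeight, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, outputWidth, outputHeight), 0, 0);
        tex.Apply();
        RenderTexture.active = prev;

        File.WriteAllBytes(path, tex.EncodeToPNG());

        RenderTexture.ReleaseTemporary(smallRT);
        Destroy(tex);
    }
}
```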
That’s really not good news!
Even at 4K, there are still jaggies. At 6K there are still minor jaggies, and this simple test starts to take so long to output that we might as well stick to traditional rendering or just enable Unity's pathtracing. Even ChaosGroup's Vantage renders it faster in realtime (and that is WITH pathtracing). I'll give NVIDIA's Omniverse a go and see how that does.
I can't understand why Unity can't just implement some kind of super sampling for Recorder's offline output. As far as I understand, Unreal's raytracing also uses TAA by default, and for offline rendering they have 2x super sampling implemented.
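If we end up bypassing Recorder entirely, I imagine a fixed-step capture loop around the sketch above would look something like this (purely hypothetical names and paths):

```csharp
// Sketch only: lock the engine to a fixed frame step and dump a numbered
// image sequence using the SuperSampledCapture component from the previous post.
using System.Collections;
using System.IO;
using UnityEngine;

public class OfflineSequence : MonoBehaviour
{
    public SuperSampledCapture capture;   // component from the sketch above
    public int frameRate = 24;
    public string outputFolder = "Frames";

    IEnumerator Start()
    {
        Directory.CreateDirectory(outputFolder);
        Time.captureFramerate = frameRate; // advance game time in fixed steps regardless of render cost

        for (int frame = 0; ; frame++)
        {
            yield return new WaitForEndOfFrame();  // ensure the camera has rendered this frame
            capture.WriteDownscaledFrame($"{outputFolder}/frame_{frame:D5}.png");
        }
    }
}
```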
I see in Unity’s roadmap they’re considering NVidia’s DLSS, but who knows how long that will take to implement and get working nicely.
I looked at the Sherman short film example; they used forward rendering and baked indirect lighting, and the results are pretty good. Our type of films will have mainly long camera paths and be mainly architecture based, potentially airport size. Camera paths could be far from buildings where detail is half a pixel wide, or we could come so close that you can see the rivets on hangar cladding. But the whole point of our efforts with Unity is to make the workflow easier and faster, i.e. no baking, and raytracing without pathtracing. We may have to ditch our efforts and look elsewhere. We can't all be like MPC and implement our own custom code to make things work as needed. We need very fast turnaround in rendering, even if we have to slightly sacrifice some quality compared to traditional rendering. However, this AA issue is a bit of a showstopper for us currently.
It may very well also be that we’ve just been approaching things wrong and may need to rethink things. Don’t get me wrong, we love Unity, it’s just this seemingly little thing is turning into a pretty big deal for us.
Maybe we’re just expecting too much from vanilla raytracing without pathtracing…
Open to suggestions and comments!
They are working on DLSS: Commits · Unity-Technologies/Graphics · GitHub but it's hard to tell when we can use it… That branch requires an additional NVIDIA module which we obviously don't have access to yet.
Yes I saw that yesterday, here’s hoping that works great, soon.
It probably won't make it into the 2021 tech stream, which seems to be coming out next week on the 23rd…
It definitely will not be in the initial 2021.1 release, and Unity doesn't usually backport features like this, but 2021.2+ could still happen if everything goes well.
Not sure we can sit around for another 6 months to wait and see.
A reply from one of the tagged Unity guys, or any other official Unity person, would sure be greatly appreciated, especially now that the 2021 tech stream is out…