(IN-59257) Fully immersive apps render with aliasing, even with multi-sampling enabled

Hi, we have a fully immersive app (using the legacy Metal rendering backend), and we see aliasing on geometry when running on device. We have 4x multi-sampling enabled in Quality settings, but it does not appear to render with multi-sampling on device.
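For reference, a minimal diagnostic sketch (names are hypothetical) that logs the anti-aliasing state at startup. Note that if a render pipeline asset is assigned, MSAA is read from that asset rather than from Quality settings, which can make the Quality-settings value misleading:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical diagnostic component: log where MSAA is coming from.
public class MsaaDiagnostic : MonoBehaviour
{
    void Start()
    {
        // The built-in pipeline reads MSAA from Quality settings.
        Debug.Log($"QualitySettings.antiAliasing = {QualitySettings.antiAliasing}x");

        // If a pipeline asset is assigned (e.g. URP), it overrides Quality settings.
        var pipeline = GraphicsSettings.currentRenderPipeline;
        Debug.Log($"Active pipeline asset: {(pipeline != null ? pipeline.name : "built-in")}");
    }
}
```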

Here’s a screenshot from the editor:

Is there a way to get anti-aliasing when running on device? Thanks!

I ran into this as well today. Here’s what I found from another thread:

Thanks for some insight on what’s going on here. To make sure this is addressed for folks like us who are fully immersive (i.e., using the Metal rendering backend) AND using the built-in render pipeline (i.e., NOT URP), I created a simple repro project in the following Unity bug:

Hello, are there any updates on when this will be fixed in the PolySpatial plugin?

I’ll second that, as the current state of fully immersive rendering using URP and Metal is quite terrible, to be frank. :face_with_peeking_eye: :joy:

I guess implementing dynamic foveated rendering in Unity is not an option, since even Unity (like us) doesn’t have access to eye tracking?

Any updates on this? We’d like to analyze performance at our target visual fidelity (i.e., at full display resolution and anti-aliasing settings), and this holds back that process. There is a repro project available associated with bug IN-59257…


I can confirm I had the same experience. I also tried adjusting the render scale in the URP settings, and changing it from 0.1 to 2 made no difference in how the scene looked.
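For anyone who wants to reproduce this test, here is a minimal sketch (class and field names are hypothetical) that drives the URP render scale at runtime; on a platform where render scaling works, changing the slider should visibly change the image sharpness:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical test component: sweep the URP render scale at runtime.
public class RenderScaleTest : MonoBehaviour
{
    [Range(0.1f, 2f)] public float scale = 1f;

    void Update()
    {
        // Only applies when a URP asset is the active pipeline.
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
            urp.renderScale = scale;
    }
}
```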

Hi all! As @ryan_bednar noted from another thread I replied to last month, the low resolution and jaggy edges will be greatly reduced in future updates when we are able to enable foveated rendering. At the moment, we are limited to a small, low-res frame buffer which is scaled up to the native screen resolution. We’re wrapping up the engine-side changes to enable foveated rendering, which will let Unity take advantage of the full native screen resolution. It makes a huge difference.

Foveated rendering on visionOS will only be available for projects that use the Universal Render Pipeline. Projects that use the built-in pipeline will be limited to the low-resolution frame buffer, just as it is today.

Separately, we are working on an update to enable MSAA on visionOS. I’ll update this thread when I have more information on MSAA. Render scaling is also not currently supported, but we will investigate supporting it.


@mtschoen Thanks for giving us these details on the plan, super helpful. For those of us stuck with the built-in pipeline, is there a chance you can expose that buffer resolution, so we can experiment with what resolution (without foveated rendering) still lets us hit 90 Hz?

Unfortunately the buffer size is not under our control. The CompositorServices API just gives us smaller framebuffer textures if foveation is not enabled.

Thank you for the update @mtschoen. Is there any timeline for foveated rendering and MSAA in URP to be implemented for visionOS support?

Yes, thanks for the update @mtschoen!

Similar to what @funvr was asking, for our internal planning purposes it would be great to know if any of these features have a chance of making it in before the end of 2023, or if they’re more likely to arrive in January or later.

Obviously we will still have some work on our end to ensure we’re hitting target framerates at the final resolution, so any more information here will give us a much better understanding of the earliest we could possibly be launch ready for the platform.

Howdy! I’m just checking in on this thread with a quick update. We released the 0.7.1 versions of our packages last month, which, along with Unity 2022.3.16f1, enable fixed foveated rendering for visionOS VR apps. Have you had a chance to try this?

MSAA is still not working, but the higher resolution makes a huge difference, in my opinion. We’re actively working on MSAA but I don’t have a timeline yet for when that will land.

Thanks for checking in. We got our project upgraded to URP and are seeing the experience in high-res thanks to foveated rendering - makes a BIG difference! Still a few jaggies here and there, so looking forward to MSAA, but things have never looked this good, so thanks for the hard work :slight_smile:


Same here. Looks great. :+1:

We had a problem with a third-party shader that samples from the opaque buffer, but we were able to remove that part with only minor visual impact. We didn’t solve the underlying problem, though; this is just a hint that there might be an issue there.


Just got foveated rendering working, and it’s a game-changer for our app. Really great work. Thank you!

Hey, just started with fully immersive development, but I still find content to be quite aliased compared to content rendered by RealityKit. I’m on 1.1.4.
Are there plans for AA support down the line?

Will we be able to enable/disable foveated rendering by changing the render pipeline at runtime? We have scenes/bundles that are URP and others that are built-in, and we switch between them at runtime. Would we be able to take advantage of foveated rendering while in URP?

Hey there! Are you using the Universal Render Pipeline with foveated rendering enabled? MSAA should also be working for both URP and the built-in pipeline on 2022.3.20f1 (and, I think, a version or two earlier as well). If you don’t see a difference when you enable MSAA, that might be a bug.
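Since MSAA is configured in a different place depending on the active pipeline, here is a small sketch (helper name is hypothetical) of setting 4x MSAA for whichever pipeline is running:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical helper: request 4x MSAA on the active pipeline.
public static class MsaaHelper
{
    public static void EnableMsaa4x()
    {
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
            urp.msaaSampleCount = 4;          // URP reads MSAA from its pipeline asset
        else
            QualitySettings.antiAliasing = 4; // built-in reads it from Quality settings
    }
}
```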

Good question! I haven’t actually tested what will happen when you switch render pipelines at runtime. I wouldn’t expect this to work, though. At the very least, you would need to close and re-open the ImmersiveSpace along with this transition, because enableFoveation is a setting passed to the CompositorLayer via CompositorLayerConfiguration. I suppose this would be possible, but it would require some modifications to the Swift app code and probably some careful coordination with the XR plugin setup, etc.

If you build with foveation disabled (and you’re OK with not using it), you might be able to switch between render pipelines, but that’s still iffy. I’d be curious to know what happens when you try, or if you are able to get something like this working. Unfortunately, at the moment, this capability is not officially supported. Please feel free to submit it as an idea on our roadmap so that others can vote on it and we can consider it along with other requested features.
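For anyone experimenting with this anyway (again, not officially supported on visionOS), runtime pipeline switching in Unity generally comes down to reassigning the active pipeline asset. A minimal sketch, where `urpAsset` is a hypothetical reference you would assign in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Unsupported experiment: switch render pipelines at runtime.
public class PipelineSwitcher : MonoBehaviour
{
    // Hypothetical serialized reference to your project's URP asset.
    public RenderPipelineAsset urpAsset;

    public void UseUrp()     => GraphicsSettings.renderPipelineAsset = urpAsset;
    public void UseBuiltIn() => GraphicsSettings.renderPipelineAsset = null; // null = built-in pipeline
}
```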

Thanks for reaching out!