Composition Layers Experimental Package is now available

We’re thrilled to announce the experimental launch of the Composition Layers Package! Composition Layers are designed to significantly improve how graphics are rendered in XR applications: they sharpen the visual presentation of text, images, and UI elements such as menus, and noticeably improve video quality. This greater visual fidelity comes without any increase in element size or overhead.

Key Features:

  • Cross-Platform Solution: The Composition Layers package works with all OpenXR runtimes and compatible hardware.

  • High-Quality Rendering: By bypassing one of the two sampling steps in the render pipeline, Composition Layers reduce visual artifacts. The result is enhanced image quality, with clearer and more engaging visuals.

  • Improved Video Quality: Video rendered via composition layers appears larger and clearer, making composition layers well suited to displaying video content in XR environments.

Getting Started Tips:

  • Compatibility Check: Make sure you’re using Unity 2022.3.7 or later. Both the Built-in Render Pipeline and URP are supported.

  • Prerequisites: Before working with the Composition Layers Package, make sure you’ve imported the OpenXR 1.11.0-exp.1 package into your project. You must also adjust some graphics settings to ensure that Composition Layers display correctly in the Editor and at runtime.

  • Current Limitations:

      • Composition Layer Interactive UI components currently only support Quad and Cylinder layer types, and a single canvas per Composition Layer UI.

      • Single Pass Instanced rendering is not currently supported. We are planning to add support in the future.

Additional Notes:

  • You can install the Composition Layers Experimental Package via the ‘Add package by name’ dialog box in the Package Manager, or from a script (see the sketch after this list). See Installation Instructions for a step-by-step guide on how to add the package.

  • Composition Layers are emulated by default in the Scene View, and in the Game View while in Play Mode. When not in Play Mode, Composition Layers are not emulated in the Game View. Emulation can be disabled in Unity Preferences.

  • Emulation for Build Mirror View Window (PCVR build target) is disabled by default. You can enable emulation in the CompositionLayersRuntimeSettings asset.

  • It is recommended to use fewer than 15 Composition Layers per scene.
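
If you prefer to install from a script instead of the dialog box, here is a minimal editor-only sketch using the Package Manager API. It assumes the bare package name resolves from the Unity registry; experimental releases may need an explicit version appended (the “@0.5.0” in the comment below is purely illustrative).

```csharp
// Editor-only helper: adds the Composition Layers package by name.
// NOTE: appending a version (e.g. "com.unity.xr.compositionlayers@0.5.0")
// may be required for experimental releases; the version shown is illustrative.
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class AddCompositionLayersPackage
{
    static AddRequest s_Request;

    [MenuItem("Tools/Install Composition Layers (Experimental)")]
    static void Install()
    {
        s_Request = Client.Add("com.unity.xr.compositionlayers");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (s_Request == null || !s_Request.IsCompleted)
            return;

        if (s_Request.Status == StatusCode.Success)
            Debug.Log($"Installed {s_Request.Result.packageId}");
        else
            Debug.LogError(s_Request.Error.message);

        EditorApplication.update -= Progress;
    }
}
```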

For detailed instructions on working with Composition Layers, refer to our Composition Layers documentation.

As a reminder, Composition Layers is currently an experimental package and should be used with caution in production applications. You can find more information on using experimental packages in Unity here.

We’re excited to see how you integrate Composition Layers into your projects and look forward to hearing your feedback. Feedback can be shared on this thread, or through our XR roadmap Compositor Layers card.

10 Likes

How about HDRP support?

Could this be part of the OpenXR package instead of having yet another package to deal with?

Putting everything in the same package is not ideal for iteration velocity, and it makes it more complex for QA to review changes before things can get out the door. This package has a lot in it, and it would weigh down the OpenXR package a fair bit.

3 Likes

This is so very cool! Kudos to the XR team, who have been making a lot of cool releases recently.

Saw “Behind the scene there is a hidden camera that captures the texture of the Canvas” in the Composition Layer UI section. Is it possible to use it as a standalone feature to get the texture of UI canvas?

Can’t even count how many times I’ve had to set up a custom layer and camera to capture a canvas texture in Unity. I’m sure a lot of Unity devs working with UI could use this streamlined process.
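
For reference, this is the kind of one-off rig I keep rebuilding by hand; just a rough sketch, assuming the canvas sits on the built-in “UI” layer and the camera and RenderTexture already exist in the scene:

```csharp
// Manual canvas-capture setup I keep recreating: a dedicated camera that only
// renders the UI layer into a RenderTexture instead of the screen.
// (Assumes the canvas lives on the built-in "UI" layer.)
using UnityEngine;

public class CanvasToTexture : MonoBehaviour
{
    public Camera captureCamera;        // extra camera pointed at the canvas
    public RenderTexture canvasTexture; // texture handed off to whatever needs it

    void Awake()
    {
        captureCamera.cullingMask = LayerMask.GetMask("UI");   // render only the canvas layer
        captureCamera.targetTexture = canvasTexture;           // draw into the texture, not the screen
        captureCamera.clearFlags = CameraClearFlags.SolidColor;
        captureCamera.backgroundColor = Color.clear;           // keep the background transparent
    }
}
```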

Awesome!

I was looking at the Provider implementation guide:
https://docs.unity3d.com/Packages/com.unity.xr.compositionlayers@0.5/manual/provider-guide.html

I want to implement WebXR Layers. For that I need a reference to the Texture or RenderTexture of the layer, to then use in WebGL calls when the WebXR API allows that.

Just to make sure that I got it right:
When a developer sets up a CompositionLayer component with a TexturesExtension component, I understand that I can get the texture reference from the TexturesExtension data.

And it’s the same when a developer sets up a Composition Layer for Interactive UI: the Projection Eye Rig will have a CompositionLayer component with a TexturesExtension component that renders the XRI Rig.
Then I can get the RenderTexture reference from its TexturesExtension data?
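
In code, this is roughly what I’m picturing. The namespaces and the LeftTexture property are my guesses from reading the docs, so please correct me if the actual TexturesExtension API exposes things differently:

```csharp
// Rough sketch only -- the namespaces and property names below are assumptions
// based on the docs, not verified against the package.
using UnityEngine;
using Unity.XR.CompositionLayers;             // assumed namespace for CompositionLayer
using Unity.XR.CompositionLayers.Extensions;  // assumed namespace for TexturesExtension

public class WebXRLayerTextureBridge : MonoBehaviour
{
    void Start()
    {
        var layer = GetComponent<CompositionLayer>();
        var textures = GetComponent<TexturesExtension>();
        if (layer == null || textures == null)
            return;

        // Grab whatever texture the developer assigned on the layer and pass its
        // native pointer to the WebGL side once the WebXR Layers API allows it.
        Texture source = textures.LeftTexture;               // assumed property name
        System.IntPtr nativePtr = source.GetNativeTexturePtr();
        Debug.Log($"Layer texture for WebXR submission: {nativePtr}");
    }
}
```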

There are no native graphics calls that Unity does for this? It’s all on the provider to implement?
If so, that’s great :slight_smile:

2 Likes

Installing another package is not the end of the world, but there definitely seems to be a trend within Unity in recent years, “let’s just do whatever is easier for us”. User ease of use doesn’t seem to be a priority anymore.

And while I’m complaining, it would be good if the documentation showed an example of the difference that using Composition Layers would make for a UI, to give devs an idea of whether it would be worth implementing.

1 Like

Thank you for providing early access to the feature for feedback! So far it looks really good. Are there plans to also expose the “Default Scene Layer” through this package? At the moment there doesn’t seem to be a way to customize it and add extensions to it, such as the Color Scale and Bias extension.

And the passthrough layer also doesn’t seem to be exposed. Is that planned?

Thanks for releasing this package! It’s been a long time coming. I would also like to author a standalone implementation of XR_FB_passthrough; I’ve had a pure C# version stubbed out for a long while that only required the layer submission to finish it off. Is this now possible with this package?

Good idea; showing goes a long way here, and it’s something we definitely want to do.

The difference is super evident when you look at it side by side on device, but video capture quality makes showing the actual contrast a little more challenging. That said, we’re actively working on getting captures and hoping to have more visuals to share soon.

1 Like

Yes, there are no native graphics calls; it’s all on the provider.

Let us know if you run into any issues trying to implement your WebXR provider, and we will try to help unblock you.

1 Like

This is a great feature request! We will investigate this and see if we can solve it in future releases. Thanks!

1 Like

Yes, it’s possible to implement a passthrough composition layer with this package. And FYI, we do have plans to give users access to a passthrough layer.

2 Likes

Only BiRP and URP for now since those are the recommended pipelines for XR.

We made the Composition Layers package a separate package by design, as this allows other plugins (besides OpenXR) to implement their own layer support. See the layer provider section in the docs for reference: https://docs.unity3d.com/Packages/com.unity.xr.compositionlayers@0.5/manual/provider-guide.html#layer-provider
Thanks.

3 Likes

Hey Mike,

Could you provide some more details on why you’re interested in authoring your own implementation of XR_FB_passthrough? Understanding your perspective would really help us prioritize this implementation effectively. Thanks!

Hi Sam, sure thing.

It’s less of an issue with layers and more OpenXR in general. At the moment I feel that the point of OpenXR is failing, especially where Android is concerned; we have the standard, but each device vendor is shipping their own SDK and implementation of the same loaders/extensions that are incompatible with each other. For example, Pico’s OpenXR implementation does support XR_FB_passthrough, but I can’t use Meta’s SDK to address that, because it’s wrapped up in the OVRManager shared binaries that are specific to Quest, and vice-versa. Each vendor’s SDK complains when the opposing vendor’s device input profiles are present, even though they’d do no harm as they wouldn’t be loaded. Ultimately, if this was a pure OpenXR system akin to something like StereoKit, then if a feature wasn’t supported it would just be ignored. There’s no reason we can’t have a single build across OpenXR devices. This is before we get into the politics of the various Platform APIs!

For context, I’m one of the developers of Open Brush, and we currently have 5 separate Android builds for what should ideally be 1 (or 2 if you count China-specific builds): Correctly set the androidVersionCode on a build rerun (#645) · icosa-foundation/open-brush@2cdf78e · GitHub. This is all down to various artificial limitations of the vendors, even though the underlying spec should enable them to be a single build.

Therefore, I’d really like to build our own suite of extension implementations that live outside the vendor SDKs, where we can use the cross-platform loader and guarantee portability. I’ve already authored a working version of XR_FB_display_refresh_rate (FBDisplayRefreshRateFeature.cs · GitHub), and I want to see how far we can take it. This was my WIP of XR_FB_passthrough: FBPassthroughFeature.cs · GitHub
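
For the curious, the general shape I’ve been following is a plain OpenXRFeature that simply no-ops when the runtime doesn’t report the extension. A trimmed-down sketch (the class name, UI strings, and feature id are placeholders; only the extension string is real):

```csharp
// Vendor-agnostic OpenXR feature skeleton: if the runtime (Quest, Pico, ...)
// doesn't enable XR_FB_passthrough, the feature just backs out and the rest
// of the build is unaffected. Names below are placeholders.
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features;
#if UNITY_EDITOR
using UnityEditor.XR.OpenXR.Features;
#endif

#if UNITY_EDITOR
[OpenXRFeature(
    UiName = "FB Passthrough (standalone)",
    Desc = "Pure C# XR_FB_passthrough, no vendor SDK required.",
    OpenxrExtensionStrings = "XR_FB_passthrough",
    FeatureId = "com.example.openxr.feature.fbpassthrough")]
#endif
public class StandaloneFBPassthroughFeature : OpenXRFeature
{
    protected override bool OnInstanceCreate(ulong xrInstance)
    {
        // Bail out cleanly if the runtime doesn't expose the extension.
        if (!OpenXRRuntime.IsExtensionEnabled("XR_FB_passthrough"))
            return false;

        // The real implementation would hook xrGetInstanceProcAddr and create
        // the passthrough/layer handles here.
        return base.OnInstanceCreate(xrInstance);
    }
}
```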

/rant! Thanks for listening :smile:

3 Likes

Hey Mike,

Thank you for providing detailed context and sharing the work you’ve done to get around the limitations and inconsistencies between the different vendors’ OpenXR implementations.

Have you tried Unity’s OpenXR (Meta) packages instead of Meta’s SDK? Our OpenXR (Meta) packages are supposedly compatible with all other OpenXR-compliant runtimes. But I totally understand your frustration, as this goes against the purpose of having a standard like OpenXR.

Thank you once again for sharing your thoughts and experiences. We will seriously consider your request and the issues you have raised. If there’s anything we can do to support your efforts, please let us know.

3 Likes

How can we capture a screenshot when using OpenXR layers? Unity’s screenshot capture function only captures the eye buffer, not the other layers.

You can enable emulation of the layers in standalone Windows, macOS, and Linux builds and then take screenshots with the system’s screenshot tools: https://docs.unity3d.com/Packages/com.unity.xr.compositionlayers@0.5/manual/project-settings.html#runtime-settings

Or, if that doesn’t work for you, you can also take screenshots on the device runtime itself, such as the Quest, or via SideQuest for higher resolutions.