Anyone able to do high quality captures of fully immersive apps with Reality Composer Pro?

We’ve been trying to take high quality screenshots and video of our fully immersive URP-based app, following these instructions:

Whenever we go into capture mode with the Reality Composer Pro tool, things only render out of one eye on the device and look all warped. The resulting capture only shows the passthrough hands and a single horizontal line of visuals.

Has anyone managed to get good results with this tool? We are on the 1.0 OS release and using Xcode 15.2…

Hi there! Unfortunately this is not something you can work around on your end. We will need to add explicit support for this feature. In the meantime, have you tried making recordings directly on the device? You can open Control Center from the indicator menu (look up and pinch) and start a recording by long-pinching on the record button. This will save a video recording (or a screenshot, if you do a regular pinch) to your device, which you can share to your laptop with AirDrop. At the moment, this is the only way I’m aware of to capture video from the device in a fully immersive (VR) Unity app.

Right. I should also mention that this might bump up against another issue where a VR app will crash if it is not connected to the debugger and you open Control Center. You don’t need a debug or development build, you just need the debugger attached. You can either run your app from Xcode in Release mode, or attach the debugger to a running instance of your app by going to Debug > Attach with Xcode connected to the device.

Thanks for the response! I have used the on-device recorder functionality, but it does leave something to be desired for marketing materials. Would be great if this could be supported in the future - I just submitted it as an idea for the roadmap.

Is there any update on support for capturing marketing material with Reality Composer Pro?

Nothing yet. We’re starting work on this soon but it’s not a trivial task. This mode requires the app to render different resolutions to each eye, which isn’t something we have had to implement for other platforms. As such, there are a number of places that need to be updated where we’ve made the (perfectly reasonable, IMO :slight_smile:) assumption that both eyes render at the same resolution.

I’ll circle back on this thread when we have an ETA for supporting this feature.

If you disable Foveated Rendering under Player Settings → XR Plug-in Management → Apple visionOS and then rebuild, you can at least capture proper visuals from the left eye (the right eye is still warped while recording).
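For builds where you only want foveation off while capturing, here’s a rough runtime sketch of the same idea. This is an assumption-laden example, not an official workflow: it relies on `XRDisplaySubsystem.foveatedRenderingLevel`/`foveatedRenderingFlags` being available in your Unity version (2022.2+) and respected by the visionOS plugin — the Player Settings toggle above is the path that’s actually been verified to work.

```csharp
// Hypothetical helper: turn off foveated rendering at runtime before a capture.
// Assumes Unity 2022.2+ and that the active XR display subsystem honors these
// properties on visionOS; if not, use the Player Settings toggle and rebuild.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class DisableFoveationForCapture : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);
        foreach (var display in displays)
        {
            // 0 = no foveation, 1 = maximum foveation.
            display.foveatedRenderingLevel = 0f;
            display.foveatedRenderingFlags =
                XRDisplaySubsystem.FoveatedRenderingFlags.None;
        }
    }
}
```

Attach it to any scene object in the build you use for recording, and leave it out of your shipping build so normal foveated rendering (and its performance benefit) stays on.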

Hi @mtschoen,

Are there any roadmap updates for this feature?

We’ve had Apple pass on us twice for marketing pushes without the ability to provide non-foveated videos.

We’d like to put our app and PolySpatial in a good light, but we continue to be hard-blocked by this issue.


Hey there!

We’re still working on a proper fix, but with com.unity.xr.visionos@1.1.4 we provided the ability to disable foveated rendering. This allows the workaround that @jdiehl_unity3d described above:

You should be able to make 4k recordings this way. I just recommend closing your right eye so you don’t get a headache! :sweat_smile:

Just to be clear, this thread/forum is about fully immersive VR apps built with Unity. This app mode does not use PolySpatial, which is our technology for Unity to render via RealityKit in mixed reality. Are you trying to record a mixed reality app with Reality Composer Pro? As far as I know, that should work just fine.

Ah, thank you for the verification mtschoen! The footage from one eye does fit the bill.

We’re actually working on both fully immersive and mixed reality fronts, so sorry for the PolySpatial confusion.


Disabling foveated rendering reduces the overall app resolution considerably. I’ve also noticed a significant drop in performance when recording in this mode. Any way to get around this? Apple is asking me for promotional materials for my app using Reality Composer Pro and I’m at a loss as to what to do. I really don’t want to lose this opportunity with them.

Is this true in the final recording? Doesn’t the Reality Composer Pro capture process bump the resolution up to 4K? I’m pretty sure you have to disable foveation for this to work, since they don’t want foveation artifacts to be visible in the final capture. Unfortunately there’s not much else we can do about this until we fix it properly within Unity. We’re limited by how the platform exposes these rendering features, and it’s proving difficult to get Unity to work in this situation, where we have to render stereo but with a different resolution in each eye. At this point, there is no better way to capture footage from a VR app on visionOS built with Unity.