Multiple Displays using UWP and Windows Mixed Reality

I’m trying to build a UWP app for Windows Mixed Reality hardware that uses two displays: one is the VR headset itself, and the other is a desktop monitor where a user sees a separate view from the VR one. For this, I have one camera for the VR headset targeting the main display and another camera set to target Display 2. Both displays look correct in the editor during play mode, but not in a build.
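For reference, this is roughly the standard Unity multi-display setup I'm using (a minimal sketch; the script and field names are mine, but `Display.displays`, `Activate()`, and `Camera.targetDisplay` are the usual UnityEngine APIs for this):

```csharp
using UnityEngine;

public class SecondDisplayActivator : MonoBehaviour
{
    // Assumed scene setup: this is the desktop "spectator" camera,
    // with Target Display set to Display 2 in the Inspector.
    [SerializeField] private Camera desktopCamera;

    void Start()
    {
        // Display.displays[0] is the primary display and is always active.
        // Additional displays must be activated explicitly in a build;
        // in the editor, extra displays just show up in Game view tabs.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
        }

        // targetDisplay is zero-based, so 1 corresponds to "Display 2".
        desktopCamera.targetDisplay = 1;
    }
}
```

This works as expected in a PC Standalone build, but as described below, it doesn't behave the same under UWP.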

When I build and run the app, all I can see is the VR view from the headset, which is also mirrored in the Windows Mixed Reality Portal window, instead of the second view I set up.

UWP doesn’t seem to support rendering to multiple displays the way a PC Standalone build does. According to this thread from 2016, it isn’t actually possible to drive multiple displays from a UWP app without a hacky workaround, such as creating a separate external window and streaming the game’s output to it, which seems like a pretty horrible solution.

I suspect the fallback will be to switch back to SteamVR, which would mean I can’t render the Windows MR controllers (probably fine, as I have hands in-game anyway) but would at least let me build for PC Standalone instead of UWP. Still, if there’s an existing solution to this, I’d rather keep what I currently have.
