Hi. I’m trying to build an XR immersive experience on Vision Pro where a player decorates a garden with furniture and miscellaneous items using the XR Interaction Toolkit.
The second dimension to this experience is the onlookers: they should be able to watch the experience from outside the Vision Pro on a projected screen. Instead of showing the experience through the player’s eyes via AirPlay streaming, we’re trying to show it from a different perspective within the scene.
To accomplish this, I’m trying to use a secondary camera in the scene, render its view to a Render Texture, encode the texture images, and send them to a Mac over UDP via Wi-Fi. The Mac will be connected to a projector, which will project the Mac’s screen onto a mesh screen for the onlookers to watch.
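For context, the capture-and-send side I have in mind looks roughly like this (a simplified sketch, not my actual implementation; class names, the IP address, and the port are placeholders):

```csharp
using System.Net.Sockets;
using UnityEngine;
using UnityEngine.Rendering;

public class RenderTextureSender : MonoBehaviour
{
    public RenderTexture source;               // the secondary camera's target texture
    public string macHost = "192.168.1.50";    // placeholder IP of the Mac
    public int port = 9000;

    UdpClient _udp;
    Texture2D _readback;

    void Start()
    {
        _udp = new UdpClient();
        _readback = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
    }

    void LateUpdate()
    {
        // Asynchronously copy the GPU texture back to CPU memory to avoid a pipeline stall.
        AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32, OnReadback);
    }

    void OnReadback(AsyncGPUReadbackRequest request)
    {
        if (request.hasError) return;

        _readback.LoadRawTextureData(request.GetData<byte>());
        _readback.Apply();

        // JPG compression keeps a low-resolution frame small enough for a single
        // UDP datagram; larger frames would need to be chunked.
        byte[] jpg = _readback.EncodeToJPG(50);
        if (jpg.Length < 65000)
            _udp.Send(jpg, jpg.Length, macHost, port);
    }

    void OnDestroy() => _udp?.Dispose();
}
```

The networking part isn’t the issue, though; I’m only stuck on the rendering step before it.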
Here’s my problem: I can’t get a secondary camera to render to a Render Texture. I set up a test in the scene: I assigned the Render Texture to a Raw Image on a canvas to verify that rendering works without the networking part. The Render Texture shows up just fine in the editor, both in and out of Play mode. However, as soon as I enter the simulator or run a build on the Vision Pro device, nothing shows up in the Raw Image that’s supposed to display the Render Texture.
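The test setup is essentially the following (simplified; the component and field names are just illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class SecondaryCameraTest : MonoBehaviour
{
    public Camera secondaryCamera;   // NOT the XR Origin's main camera
    public RawImage debugImage;      // Raw Image on the test canvas

    RenderTexture _rt;

    void Start()
    {
        _rt = new RenderTexture(1280, 720, 24);
        secondaryCamera.targetTexture = _rt;  // secondary camera renders into the texture
        debugImage.texture = _rt;             // canvas displays the same texture
    }
}
```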
Here’s my question: In an Unbounded Volume scene using PolySpatial, is it possible to render a camera other than the XR Origin’s? If so, how would I go about achieving this?
PolySpatial version: 2.0.4
Unity version: 6000.0.26f1
Secondary camera setup:
I can’t attach other editor screenshots since I’m a new user, which is a bummer.