I have a Render Texture that takes the output of a second Camera and applies it to a UI element, creating a ‘Cam’ effect showing a place somewhere else in the level. This works fine in the Editor and in Play Mode, but when I run a Build of the game, the Camera isn’t outputting to the Render Texture.
The Render Texture isn’t completely blank, though: I can see the Skybox. So it seems the Camera is, for some reason, only rendering its background, and only in the Build.
Can anyone offer any insight into what might be happening?
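For reference, the setup is roughly equivalent to this sketch (the names here are illustrative, not my exact project; I’m assuming a RawImage as the UI element):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative setup: wires a second camera's output into a UI RawImage
// via a RenderTexture created at runtime.
public class SecurityCamFeed : MonoBehaviour
{
    [SerializeField] private Camera securityCamera; // the second camera
    [SerializeField] private RawImage screen;       // the UI element showing the feed

    private RenderTexture feed;

    void Awake()
    {
        // Creating the texture at runtime means the hookup can't be lost
        // between the Editor and a Build.
        feed = new RenderTexture(512, 512, 16);
        securityCamera.targetTexture = feed;
        screen.texture = feed;
    }

    void OnDestroy()
    {
        feed.Release();
    }
}
```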
If the skybox is there but not the object you’re expecting, then the camera is rendering but the object is missing, so it could be a few things:
Make sure the object is in front of the camera. (A quick test: drag the object in as a child of the camera, positioned in front of it, then run and build that to confirm the setup works; a simple cube will do.)
Make sure the object’s layer is included in the camera’s culling mask. Both checks can be done at runtime with the sketch below.
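Since the problem only shows up in a Build, something like this would let you check both in the running player (a rough sketch; the field and class names are mine). Attach it to the object you expect to see and assign the render-texture camera, then read the warnings from the player log:

```csharp
using UnityEngine;

// Diagnostic sketch: logs the common causes of "skybox only" output
// from a render-texture camera. Works in a Build as well as the Editor.
public class RenderCamDebug : MonoBehaviour
{
    [SerializeField] private Camera renderCam;

    void Start()
    {
        // Layer check: is this object's layer in the camera's culling mask?
        if ((renderCam.cullingMask & (1 << gameObject.layer)) == 0)
            Debug.LogWarning($"{name} is on a layer the camera does not render.");

        // Visibility check: is the object inside the camera's view frustum?
        var objectRenderer = GetComponent<Renderer>();
        var planes = GeometryUtility.CalculateFrustumPlanes(renderCam);
        if (objectRenderer != null && !GeometryUtility.TestPlanesAABB(planes, objectRenderer.bounds))
            Debug.LogWarning($"{name} is outside the camera's view frustum.");

        // Target check: did the camera keep its RenderTexture in the Build?
        if (renderCam.targetTexture == null)
            Debug.LogWarning("Camera has no target RenderTexture assigned.");
    }
}
```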
I’ve figured it out: my Camera was parented to the UI Canvas for the camera view, and my Build ran at my monitor’s ultra-wide resolution, so the Canvas layout pulled the Camera away from where it should be at that resolution. I just had to move it out of the Canvas, and no issues came from that, so all good now.
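For anyone who hits the same thing but wants to keep the camera under the Canvas for organisational reasons, a rough workaround sketch (untested against my original setup) would be to unparent it at runtime so canvas scaling can’t move it at other resolutions:

```csharp
using UnityEngine;

// Illustrative workaround: detach this object from its Canvas parent at
// startup so UI layout/scaling no longer affects its world position.
public class DetachFromCanvas : MonoBehaviour
{
    void Awake()
    {
        // worldPositionStays = true keeps the current world position/rotation.
        transform.SetParent(null, true);
    }
}
```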