So I decided to take a simple concept and work with it to make a game. The main component of the game is that the FPS character has two slightly offset cameras operating at the same time, on the same viewport, at the same depth, each rendering a different object layer. Ideally I want them to behave like a single camera would: objects farther away on camera A rendering behind closer objects on camera B, and objects farther away on camera B rendering behind closer objects on camera A. I set them both to the same depth, but they render as if whichever camera was most recently set to that depth has a higher depth. Is there any way to truly set them to an equal depth?
You have two points of view, and you already have two cameras in your scene's Hierarchy that capture them. You have set each camera's viewport to cover one half of the application screen: the left half for camera 1 and the right half for camera 2.
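The split-viewport setup described above can be sketched in a small script (a hypothetical MonoBehaviour; the camera fields and the exact split are assumptions, and the same values can be set in the Inspector instead):

```csharp
using UnityEngine;

// Hypothetical setup script: splits the screen between two cameras
// by assigning each a normalized viewport rect (Camera.rect).
public class SplitScreenSetup : MonoBehaviour
{
    public Camera leftCamera;   // assumed assigned in the Inspector
    public Camera rightCamera;

    void Start()
    {
        // Rect(x, y, width, height) in normalized [0..1] screen coordinates.
        leftCamera.rect  = new Rect(0f,   0f, 0.5f, 1f); // left half
        rightCamera.rect = new Rect(0.5f, 0f, 0.5f, 1f); // right half
    }
}
```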
Your problem is that when you play the scene, only one camera's view is displayed. The fix is to give one of the cameras a higher depth (for example Camera.depth = 2) and set its clear flags to Depth Only (Camera.clearFlags = CameraClearFlags.Depth). If each camera is currently drawing to the whole screen, you will also need to adjust each camera's viewport rect (Camera.rect).
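A minimal sketch of that base/overlay arrangement, assuming the built-in render pipeline (the camera fields are hypothetical and would be assigned in the Inspector):

```csharp
using UnityEngine;

// Hypothetical sketch: the overlay camera renders after the base camera
// (higher depth) and clears only the depth buffer, so the base camera's
// color image shows through wherever the overlay draws nothing.
public class OverlayCameraSetup : MonoBehaviour
{
    public Camera baseCamera;
    public Camera overlayCamera;

    void Start()
    {
        baseCamera.depth = 1f;
        baseCamera.clearFlags = CameraClearFlags.Skybox;

        overlayCamera.depth = 2f;                           // renders second
        overlayCamera.clearFlags = CameraClearFlags.Depth;  // "Depth only"
    }
}
```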
To clarify what's happening: each camera overlays the application window with its view, and the camera with the lower Camera.depth renders first. The clear-flags setting determines whether a camera first covers the parts of the screen it doesn't render to with the skybox or with a background color. Don't Clear (CameraClearFlags.Nothing) just draws one pixel color on top of another, but you want each view to act as a stencil, since you are not creating semi-transparent layers.
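The clear-flag behaviors described above can be summarized in one illustrative sketch (assuming the built-in render pipeline; the script and field names are placeholders):

```csharp
using UnityEngine;

// Illustrative sketch: what each CameraClearFlags value leaves on screen
// before the camera draws its own objects.
public class ClearFlagExamples : MonoBehaviour
{
    public Camera cam;

    void Start()
    {
        // Covers the camera's viewport with the skybox first.
        cam.clearFlags = CameraClearFlags.Skybox;

        // Covers the camera's viewport with a flat color first.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;

        // Clears only the depth buffer: earlier cameras' colors remain,
        // so they act as the backdrop (the "stencil" behavior above).
        cam.clearFlags = CameraClearFlags.Depth;

        // Clears nothing: draws directly over the previous camera's
        // color and depth buffers.
        cam.clearFlags = CameraClearFlags.Nothing;
    }
}
```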