My problem is that Camera.RenderToCubemap in a VR project set up with SteamVR does not do the same thing as in a non-VR project. In the Scene view, the RenderTexture written to by an [ExecuteInEditMode] script behaves as it should, but as soon as I hit Play, ugly black lines appear in the texture and the camera only seems to render in circles rather than in rectangles as usual.
Please help me! I really want to use this function in my Unity project!
In the picture I display the RenderTexture that is rendered to.
Ok, it seems like I figured it out on my own. As soon as a camera's target eye is set to “Left”, “Right” or “Both”, both Render() and RenderToCubemap() will generate these black lines at runtime. The lines also appear if you call these functions from OnRenderImage(src, dst) on a camera that is not set to one of those modes, which is what was confusing me. So you have to call Render() from Update() to render an image into a RenderTexture, and then use that RenderTexture in OnRenderImage(src, dst) to apply your post-processing effect… I hope this helps anyone!
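For anyone who wants to try this, here is a minimal sketch of the workaround: a disabled helper camera is rendered manually from Update() into a RenderTexture, which is then sampled in OnRenderImage() on the main (VR) camera. The field names, the `_HelperTex` shader property, and the post-effect material are my own assumptions, not anything official:

```csharp
using UnityEngine;

// Sketch of the workaround described above (names are assumptions).
// The helper camera renders into a RenderTexture from Update(), so
// Render() is never called from inside OnRenderImage(), which is
// where the black-line artifacts appeared for me.
[RequireComponent(typeof(Camera))]
public class RenderWorkaround : MonoBehaviour
{
    public Camera helperCam;      // a second camera, not rendering to the HMD
    public Material postEffect;   // hypothetical post-processing material
    private RenderTexture rt;

    void Start()
    {
        rt = new RenderTexture(Screen.width, Screen.height, 24);
        helperCam.enabled = false;                            // rendered manually
        helperCam.stereoTargetEye = StereoTargetEyeMask.None; // not "Left"/"Right"/"Both"
        helperCam.targetTexture = rt;
    }

    void Update()
    {
        // Render here in Update(), NOT in OnRenderImage().
        helperCam.Render();
    }

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        // Use the already-rendered texture in the post effect.
        postEffect.SetTexture("_HelperTex", rt);
        Graphics.Blit(src, dst, postEffect);
    }
}
```

The key point is only the split: Render() in Update(), texture usage in OnRenderImage(); the rest is scaffolding you would adapt to your own setup.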