Render to Texture Lagging by One Frame on iOS

In my game, I have some reflections. These are generated by rendering selected parts of the scene to a render texture, which is then blended in the floor shader as a second input texture. This works fine on PC, but on iOS I’m seeing a one-frame lag on the reflections.
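For context, the floor material just samples the reflection texture through a named texture property; hooking the render texture up to it looks roughly like this (floorRenderer and the _ReflectionTex property name are illustrative, not my actual names):

// Bind the reflection render texture to the floor material at startup.
// "_ReflectionTex" is an assumed shader property name, for illustration only.
floorRenderer.material.SetTexture( "_ReflectionTex", mReflectionTexture );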

This implies to me that the reflection camera is being rendered after the main camera. The reflection shader is set to use the ‘Background’ queue, while the floor uses the ‘Geometry’ queue. The reflection camera is using a depth of 0, while the main camera has a depth of 10. (These are basically the things I’ve been trying in order to solve this.)
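For reference, the ordering rules I’m relying on, as far as I understand them (the camera names here are just illustrative):

// Cameras are rendered in order of increasing depth, so the lower-depth
// reflection camera should be rendered before the main camera...
reflectionCam.depth = 0.0f;
mainCam.depth = 10.0f;
// ...whereas the 'Background' / 'Geometry' queues only order draws
// within a single camera's render.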

My code to generate the reflections is roughly along the lines of the MirrorReflection2 example script.

In Start(), I have the following to initialise the reflection system:

// Create the render texture the reflection camera will draw into.
mReflectionTexture = new RenderTexture( Screen.width, Screen.height, 16, RenderTextureFormat.ARGB32 );
mReflectionTexture.Create();
mReflectionTexture.filterMode = FilterMode.Trilinear;
mReflectionTexture.anisoLevel = 9;

// Set up the reflection camera to match the main camera and render only layer 8.
mReflectionCamera = new GameObject( "ReflectionCamera", typeof( Camera )).camera;
mReflectionCamera.targetTexture = mReflectionTexture;
mReflectionCamera.fieldOfView = Camera.main.fieldOfView;
mReflectionCamera.aspect = Camera.main.aspect;
mReflectionCamera.cullingMask = 1 << 8;
mReflectionCamera.backgroundColor = Color.black;
mReflectionCamera.clearFlags = CameraClearFlags.SolidColor;
mReflectionCamera.depth = 100.0f;
mReflectionCamera.nearClipPlane = 1.0f;
mReflectionCamera.farClipPlane = 50.0f;

// Scale/offset (bias) matrix used to map clip-space coordinates into reflection texture UVs.
mReflectionMatrix = Matrix4x4.zero;
mReflectionMatrix.m00 = 0.5f;
mReflectionMatrix.m11 = -0.5f;
mReflectionMatrix.m22 = 0.5f;
mReflectionMatrix.m33 = 1.0f;
mReflectionMatrix.m03 = 0.5f;
mReflectionMatrix.m13 = 0.5f;
mReflectionMatrix.m23 = 0.5f;

mReflectionMaterial = ( Material )Resources.Load( "Materials/Reflection" );

While in Update(), I simply do the following:

// Mirror the main camera's transform about the floor plane (y = 0)
// and submit the reflection mesh to the reflection camera on layer 8.
Vector3 position = unitycam.transform.position;
Vector3 euler = unitycam.transform.eulerAngles;

mReflectionCamera.transform.position = new Vector3( position.x, -position.y, position.z );
mReflectionCamera.transform.eulerAngles = new Vector3( -euler.x, euler.y, euler.z );

Graphics.DrawMesh( mFaces, Matrix4x4.identity, mReflectionMaterial, 8, mReflectionCamera );

Here, mFaces is a mesh I update dynamically and submit for rendering twice each frame, with a different material for each draw call: one for the reflection and one for the main scene.
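To make that concrete, the double submission looks something like this each frame (mSceneMaterial and layer 0 are just placeholders for how my scene pass happens to be set up):

// Draw the mesh once for the reflection camera only (layer 8)...
Graphics.DrawMesh( mFaces, Matrix4x4.identity, mReflectionMaterial, 8, mReflectionCamera );
// ...and once more with the scene material for the main camera to pick up.
Graphics.DrawMesh( mFaces, Matrix4x4.identity, mSceneMaterial, 0 );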

Basically, I’ve tried everything I can think of to get the reflections rendered before the main scene, and I’m stuck. Any suggestions or ideas would be greatly appreciated. As I mentioned above, this system works fine on Windows and macOS; it’s only on iOS that I’ve noticed this issue.

Is this a known bug/limitation in Unity?

UPDATE: I should point out that before I tried the above, I was originally rendering the reflections manually with the main scene camera. The above was an attempt to get Unity to behave properly on iOS.
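Roughly, that earlier approach amounted to something like the following (a simplified sketch of the idea rather than my exact code):

// Point the main camera at the render texture, render the reflection pass
// explicitly, then restore the camera for the normal on-screen pass.
Camera cam = Camera.main;
RenderTexture previous = cam.targetTexture;
cam.targetTexture = mReflectionTexture;
cam.Render();                    // explicit reflection pass into the texture
cam.targetTexture = previous;    // back to rendering on screen afterwards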

OK, I figured it out. Instead of deleting this question, I’ll leave the answer here in case anyone else runs into a similar problem.

The issue was that my reflection camera was being positioned before the main camera had been updated for the current frame, so the reflection was rendered from last frame’s camera transform. This was mostly down to how I implemented touch and mouse input processing: touches are processed during an Update call to my game manager, while mouse events are handled in OnGUI, so the camera ended up being moved at a different point in the frame on iOS than on desktop.
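For anyone hitting the same thing: one way to avoid the ordering problem is to sync the reflection camera in LateUpdate, after every Update (including the one that moves the main camera from input) has run for the frame. A minimal sketch of that idea, not necessarily my exact fix:

void LateUpdate()
{
    // All Update calls, including the input handling that moves the main
    // camera, have finished by now, so the mirrored transform matches the
    // current frame rather than the previous one.
    Vector3 position = Camera.main.transform.position;
    Vector3 euler = Camera.main.transform.eulerAngles;

    mReflectionCamera.transform.position = new Vector3( position.x, -position.y, position.z );
    mReflectionCamera.transform.eulerAngles = new Vector3( -euler.x, euler.y, euler.z );

    // DrawMesh only submits the mesh for rendering later this frame,
    // so calling it from LateUpdate is fine too.
    Graphics.DrawMesh( mFaces, Matrix4x4.identity, mReflectionMaterial, 8, mReflectionCamera );
}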