Unity RenderTexture with RawImage is black (iOS)

I set up a camera with a RenderTexture and a Canvas with a RawImage object. The RawImage draws the RenderTexture without issue while testing in the editor. However, when deployed to our iPad mini (running iOS 7.1), it just comes out as black. Does anyone have an idea of what the issue here is?

  • The RawImage is full screen and anchored as such
  • The Canvas is overlay
  • The camera creates its RenderTexture at runtime
  • The camera resizes itself to match the canvas on Awake

The reason I am doing this is to record particles for use in the canvas. I have attached a sample project in hopes that it will help. [36665-iosrendertexturetest.zip|36665]

I finally got back to this issue, investigated it further, and solved it.

After running a bunch of test builds I identified that the issue was with the Camera and not the actual use of RawImage. Though I do not fully understand why, it seems that the particleCamera/renderTexture was preventing anything from being rendered to the screen. My solution was to update the ParticleCamera class so that it disables the camera altogether (so it no longer renders automatically) and instead calls Render() on it once per frame from a coroutine.
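The fix could be sketched like this (a minimal sketch, not the exact attached class; the `WaitForEndOfFrame` timing is my assumption about when the manual render should happen):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of the fix: keep the camera disabled so Unity never
// renders it automatically, and drive it manually once per frame instead.
public class ParticleCamera : MonoBehaviour
{
    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
        cam.enabled = false;                // stop automatic rendering
        StartCoroutine(RenderEachFrame());
    }

    IEnumerator RenderEachFrame()
    {
        while (true)
        {
            // Wait until the normal frame has finished, then render into
            // cam.targetTexture manually.
            yield return new WaitForEndOfFrame();
            cam.Render();
        }
    }
}
```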

I have attached my updated classes for anyone that may want to take a look. [38386-particlelayer.zip|38386]