Background of Snapshot using RenderTexture

Hi, guys. I'm trying to take a snapshot of some game objects in a Unity scene. I use a camera and override the OnRenderImage(RenderTexture src, RenderTexture dst) method. I read the src into a Texture2D object (TextureFormat.ARGB32) and use the EncodeToJPG method to get a byte array. Lastly I save this array as a file.
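A minimal sketch of the capture described above, assuming the script is attached to the camera (the class name and output path are placeholders, not from the original post):

```csharp
using System.IO;
using UnityEngine;

public class Snapshot : MonoBehaviour
{
    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        // ReadPixels copies from the active render texture, so make src active.
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = src;

        var tex = new Texture2D(src.width, src.height, TextureFormat.ARGB32, false);
        tex.ReadPixels(new Rect(0, 0, src.width, src.height), 0, 0);
        tex.Apply();

        RenderTexture.active = previous;

        // Encode to JPG and write the bytes to a file.
        byte[] bytes = tex.EncodeToJPG();
        File.WriteAllBytes(
            Path.Combine(Application.persistentDataPath, "snapshot.jpg"), bytes);

        // Pass the image through unchanged so the screen still renders normally.
        Graphics.Blit(src, dst);
    }
}
```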

It works fine in the Editor, but on a real iOS device the captured picture has a black background. My camera's background color is Color.clear and my Texture2D uses TextureFormat.ARGB32, so the picture is supposed to have a transparent background.

Could anyone give me a hint about what could be wrong here? Thanks.

When setting up the Camera, a RenderTexture is needed as its targetTexture for rendering. If you create one with the default settings, its depth buffer is 24 bits, but it seems iOS devices only support depth = 0 in this situation, which causes the problem.

To work around this, create the target texture with an explicit depth of 0:

myCamera.targetTexture = RenderTexture.GetTemporary(myWidth, myHeight, 0, myFormat);

where the third parameter is the depth buffer size of the RenderTexture.
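Putting the workaround together, here is a sketch of a full off-screen capture, assuming a dedicated snapshot camera (myCamera, myWidth, myHeight, and the output path are placeholder names):

```csharp
using System.IO;
using UnityEngine;

public static class SnapshotHelper
{
    public static void Capture(Camera myCamera, int myWidth, int myHeight, string path)
    {
        // Depth = 0 avoids the 24-bit depth buffer that misbehaves on iOS here.
        RenderTexture rt = RenderTexture.GetTemporary(
            myWidth, myHeight, 0, RenderTextureFormat.ARGB32);

        myCamera.targetTexture = rt;
        myCamera.Render();  // render one frame into the temporary texture

        // ReadPixels copies from the active render texture.
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = rt;

        var tex = new Texture2D(myWidth, myHeight, TextureFormat.ARGB32, false);
        tex.ReadPixels(new Rect(0, 0, myWidth, myHeight), 0, 0);
        tex.Apply();

        // Restore state and release the temporary texture back to the pool.
        RenderTexture.active = previous;
        myCamera.targetTexture = null;
        RenderTexture.ReleaseTemporary(rt);

        File.WriteAllBytes(path, tex.EncodeToJPG());
    }
}
```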