Graphics.Blit performance on iOS

So I’m basically just taking one 1280x800 RenderTexture and blitting it onto another of the same size, every frame, but I’m getting very low FPS when built to device (1-3 fps on an iPad 3). Is that just a limitation of the iPad, or could I be doing something wrong?


Performance can be impacted by many things.
What shader does your material use?
How many lights are in the scene?
What quality setting are you using?

You should be able to get 60 fps blitting a single texture, no problem.
Remember to set the application frame rate to 60.
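
For reference, a minimal sketch of how to do that (the class name is just for illustration; Unity on iOS defaults to 30 fps):

    using UnityEngine;

    // Hypothetical helper: requests 60 fps instead of the iOS default of 30.
    public class FrameRateSetter : MonoBehaviour
    {
        void Awake()
        {
            Application.targetFrameRate = 60;
        }
    }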

In what function do you do that?

@dreamora OnRenderImage

@pcg It is a 2D game that uses unlit transparent colored shaders for the sprites, on the Good quality setting.

And where do you use the image?
On which occasions, and how many of them?

To make the intention of the question clear: I have the feeling that you are redrawing it in OnGUI or using it on many objects, in which case the blit would happen a plethora of times per frame, not just once.

I’m using NGUI which batches draw calls and textures onto atlases. My main camera renders one sprite. I have some other cameras that RenderWithShader once at the start and save the result in a RenderTexture which I composite with the result of my main camera every frame.
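
For context, a rough sketch of what that one-shot capture might look like, assuming a disabled helper camera (all names here are mine, not an actual API):

    using UnityEngine;

    public class OneShotCapture : MonoBehaviour
    {
        public Camera captureCamera;      // disabled helper camera, used only once
        public Shader replacementShader;  // shader for the one-off render
        public RenderTexture result;      // composited with the main camera each frame

        void Start()
        {
            result = new RenderTexture(1280, 800, 16);
            captureCamera.targetTexture = result;
            // Render once with the replacement shader; the RT keeps the result.
            captureCamera.RenderWithShader(replacementShader, "");
            captureCamera.targetTexture = null;
        }
    }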

I’ve seen something similar.
Same camera, same scene:

  • without any script attached to the camera: 2.8 ms
  • with blit source to destination in OnRenderImage in a script attached to the camera: 5.6 ms

The numbers come from the internal profiler, on iPhone.
There is another ortho camera in the scene, but it doesn’t have the script attached.
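
For clarity, the script in the second measurement was nothing more than a pass-through blit, roughly:

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class PassThroughBlit : MonoBehaviour
    {
        // Copies the rendered frame straight back out; attaching just this
        // roughly doubled the frame time in the measurement above.
        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            Graphics.Blit(source, destination);
        }
    }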

There are lots of issues in there.
First of all, even a simple copy (blit) of a screen’s worth of pixels is a LOT of work.
Second, about OnRenderImage: if you didn’t set Camera.targetTexture to some RT, the camera will draw to the screen, and then, when you have something in OnRenderImage, Unity will read back from the screen into the source texture for you to use. Sure, this can’t be fast :wink:

Alexey,
thanks for the reply! I’m not sure I understand completely what’s going on.
What I meant to do with this test was to “override” the camera render with an equivalent, without any additional processing. I thought that writing source into destination would be the only way to do that, and Blit was the only means I found to do it.
I understand Blit takes time, and with my code I’m basically copying the frame buffer into itself, with the additional cost of the copy. I guess writing a shader would avoid that.

This was just a test for the thing I really want, which is to save the framebuffer into a circular buffer every frame, to record a “replay”. Since saving into a real texture in memory takes forever, I thought I would just keep an array of RTs, and later on, when I want to extract the images, do a ReadPixels with each RT set as active. Maybe it’s not even possible, but that was my wild guess, until I learn how to write a shader.
The guess, translated into code, would have been an OnRenderImage that blits source into an RT to save, and then again into destination.
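
Translated into a sketch (all names hypothetical, and it still pays the OnRenderImage read-back cost described above):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class ReplayRingBuffer : MonoBehaviour
    {
        const int FrameCount = 60;   // length of the replay window, in frames
        RenderTexture[] _frames = new RenderTexture[FrameCount];
        int _next;

        void Start()
        {
            for (int i = 0; i < FrameCount; i++)
                _frames[i] = new RenderTexture(1280, 800, 0);
        }

        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Save this frame into the ring, then pass the image through.
            Graphics.Blit(source, _frames[_next]);
            _next = (_next + 1) % FrameCount;
            Graphics.Blit(source, destination);
        }

        // Later, extract a saved frame on the CPU via ReadPixels.
        Texture2D Extract(int index)
        {
            RenderTexture rt = _frames[index];
            RenderTexture.active = rt;
            Texture2D tex = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();
            RenderTexture.active = null;
            return tex;
        }
    }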

I also tried to do the same thing with 2 cameras, one rendering to an RT and the other to the screen, and it was actually faster than the method above. Then I thought, “I’m rendering twice here, let’s try to avoid that.” Hence the test with Blits in OnRenderImage, which turned out to be slower than rendering twice :smile:

Again, this is just because I don’t know any better. If you have suggestions on how to do it, you are super welcome :slight_smile:

What kind of framerate were you getting with rendering twice? I would expect Blit to give you around 1-2 fps, because it’s expensive…

Rendering twice was going from 2.8ms to 3.5ms.
With the Blit it goes up to 7ms.

Regarding the second point: I use RTs for post-processing. For the final pass, I copy the final image to the destination in OnRenderImage(). However, once the render texture ‘source’ or ‘destination’ is involved in Graphics.Blit(), the FPS drops to 1 frame per second.
I checked the debugger in Xcode and found a process named ‘AASolve’, labeled by Unity, that uses glCopyTexSubImage2D(), an anti-performance API! And my tests show that this API is indeed what kills the performance.
The weird thing is that even with anti-aliasing turned off in QualitySettings, the ‘AASolve’ step still exists in the rendering pipeline.
My device is the new iPad, iOS 5.1.

Is there any explanation?

Well, the problem here is that we must provide an API for poor souls that don’t have any idea about rendering.
If OnRenderImage() is present on a Camera that renders to the screen (not to a texture), then what would the source texture be? To fill it, we do a read-pixels from the screen. To avoid that, please blit to the screen in a different place :wink: The easiest way (well, just for you to test) would be something like OnPostRender, where you would explicitly set RenderTexture.active = null and then call Graphics.Blit with source = the RT that you just rendered into with your camera.
I hope this makes sense :wink:
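
A minimal sketch of that arrangement (class and field names are mine):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class BlitInPostRender : MonoBehaviour
    {
        RenderTexture _rt;

        void Start()
        {
            _rt = new RenderTexture(Screen.width, Screen.height, 16);
            GetComponent<Camera>().targetTexture = _rt;  // camera renders into the RT
        }

        void OnPostRender()
        {
            // Blit the freshly rendered RT to the screen ourselves;
            // with no OnRenderImage there is no read-back from the framebuffer.
            RenderTexture.active = null;
            Graphics.Blit(_rt, (RenderTexture)null);
        }
    }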


Thanks, I’ve worked it out that way, and it has no performance issue. However, OnRenderImage() could be a good way to organize a post-process chain; putting post-processing into OnPostRender() is OK, but it’s hard to present an independent pipeline that way.
While the camera is rendering to the screen, I have a personal suggestion: Unity could prepare two swapping buffers as a fake screen; they could serve as the ‘source’ and ‘destination’ in turn in OnRenderImage(), and at the very end of the pipeline, the last buffer would be rendered to the screen. In my opinion the cost is acceptable, and besides, HDR and gamma correction could be implemented in this step. What do you think?
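
That two-buffer idea can be approximated today by ping-ponging manually; a rough sketch (no HDR or gamma handling, names are mine, and it assumes at least one pass):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class ManualPostChain : MonoBehaviour
    {
        public Material[] passes;   // one material per post-process step
        RenderTexture _rtA, _rtB;

        void Start()
        {
            _rtA = new RenderTexture(Screen.width, Screen.height, 16);
            _rtB = new RenderTexture(Screen.width, Screen.height, 0);
            GetComponent<Camera>().targetTexture = _rtA;
        }

        void OnPostRender()
        {
            RenderTexture src = _rtA, dst = _rtB;
            // Ping-pong between the two buffers, one blit per pass.
            for (int i = 0; i < passes.Length - 1; i++)
            {
                Graphics.Blit(src, dst, passes[i]);
                RenderTexture tmp = src; src = dst; dst = tmp;
            }
            // The last pass goes straight to the screen.
            RenderTexture.active = null;
            Graphics.Blit(src, (RenderTexture)null, passes[passes.Length - 1]);
        }
    }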

Cannaan, can you do me a favour and help me with this problem? I’ve run into the same issue, but I don’t clearly understand what you said. Thanks.

Hi Alexey, I would love some help with how to do this with the image effects. I have created a post here: http://forum.unity3d.com/threads/graphics-blit-performance-on-ios.153778/

To give a clear example, here’s the code we’re using in an upcoming version of FxPro 3.0:

    public class EffectBasePrePost : EffectBase
    {
        private RenderTexture _destinationRenderTexture;

        public void OnPreRender()
        {
            // Grab a temporary RT matching the camera and render into it
            // instead of the screen, so OnRenderImage never triggers a read-back.
            // (GetTemporary expects a depth buffer of 0, 16 or 24 bits.)
            _destinationRenderTexture =
                RenderTexture.GetTemporary(EffectCamera.pixelWidth, EffectCamera.pixelHeight, 16);

            EffectCamera.targetTexture = _destinationRenderTexture;
        }

        public void OnPostRender()
        {
            // Blit the result to the screen manually, then release the RT.
            // EffectCamera and Mat are members of the EffectBase base class.
            EffectCamera.targetTexture = null;
            Graphics.Blit(_destinationRenderTexture, null, Mat, 0);
            RenderTexture.ReleaseTemporary(_destinationRenderTexture);
        }
    }

I hope this will be useful!
