OnRenderImage Not Working on iOS Device

So I am blitting the source texture to a destination texture with a material to do a post-processing effect. My test effect simply colors every pixel red, so I should see a plain red screen, and that is exactly what happens in the Editor with the Pro version of Unity. However, when running on an iOS device with a Unity iOS Pro build, the effect doesn't work at all; it looks more like the camera isn't clearing its back buffer. What I'm doing is very straightforward, so I was curious whether there are any known issues with using OnRenderImage and Graphics.Blit(source, dest, material) on an iOS device with the Pro version of Unity. Thanks!

Can you post the shader you’re using? I realize it’s a simple test shader, but it’s possible you’re missing something that is causing undefined output.

using UnityEngine;
using System.Collections;

public class CameraFxs : MonoBehaviour
{
    public float Range = 0.10f;

    Material blurMaterial = null;
    RenderTexture texIm = null;

    // Use this for initialization
    void Start()
    {
        blurMaterial = new Material(Shader.Find("Custom/BlurShader"));
        texIm = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.Default);
    }

    // Update is called once per frame
    void Update()
    {

    }

    void OnRenderImage( RenderTexture source, RenderTexture dest )
    {
        blurMaterial.SetFloat("_Range", Range);
        Graphics.Blit(source, texIm, blurMaterial, 0);
        Graphics.Blit(texIm, dest);
    }
}
Shader "Custom/BlurShader"
{
    Properties
    {
        _MainTex ("Base (RGB)", 2D) = "white" {}
		_Range ( "Range", Range(0, 1.0)) = 0.0
    }

    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200
         
        Pass
        {

            CGPROGRAM
            // Upgrade NOTE: excluded shader from OpenGL ES 2.0 because it does not contain a surface program or both vertex and fragment programs.
            #pragma exclude_renderers gles
            #pragma vertex v
            #pragma fragment p
             
			uniform sampler2D _MainTex;
			uniform float _Range;
			  
            struct VertOut
            {
                float4 pos : POSITION;
                float4 col : COLOR;
				float2 uv : TEXCOORD0;
            };
              
            VertOut v( float4 position : POSITION, float3 norm : NORMAL, float2 uv : TEXCOORD0 )
            {
                VertOut OUT;
                   
                float4 pos = mul( UNITY_MATRIX_MVP, position );
                OUT.pos = pos;
				OUT.uv = uv;
                   
                return OUT;
            }
              
            struct PixelOut
            {
                float4 color : COLOR;
            };
              
            PixelOut p( VertOut input )
            {
                PixelOut OUT;
                
				float4 c0 = tex2D( _MainTex, input.uv);
				float4 c1 = tex2D( _MainTex, input.uv + float2(_Range, 0.0) );
				float4 c2 = tex2D( _MainTex, input.uv + float2(-_Range, 0.0) );
				float4 c3 = tex2D( _MainTex, input.uv + float2(0.0, _Range) );
				float4 c4 = tex2D( _MainTex, input.uv + float2(0.0, -_Range) );
				
				float4 fc = (c0 + c1 + c2 + c3 + c4) / 5;

                OUT.color = fc;
                  
                return OUT;
            }
            ENDCG
        }
    }
    FallBack "Diffuse"
}

This line prevents the shader from compiling for OpenGL ES, which is what iOS devices use exclusively:

#pragma exclude_renderers gles

The Upgrade NOTE comment above it no longer applies, since your shader does contain both a vertex and a fragment program, so you can delete that pragma (and the note) and the shader should then compile for the device.
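For reference, this is roughly what the top of that Pass would look like with just the exclusion removed; everything below it stays exactly the same:

            CGPROGRAM
            // exclusion removed so the shader is also compiled for OpenGL ES
            #pragma vertex v
            #pragma fragment p

            uniform sampler2D _MainTex;
            uniform float _Range;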

I recommend rewriting your shader, starting from a standard image effect shader template (along the lines of the sketch below). It should be easy to fold your blur sampling into it, and the result will hopefully be a lot less confused and messy than what you have now.
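Treat the following as a minimal sketch rather than a drop-in replacement: it assumes the standard appdata_img/v2f_img/vert_img helpers from UnityCG.cginc, the shader name is just a placeholder, and the 5-tap sampling is simply carried over from your original fragment program.

Shader "Custom/BlurShaderSketch"
{
    Properties
    {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Range ("Range", Range(0, 1.0)) = 0.0
    }

    SubShader
    {
        Pass
        {
            // Typical image-effect render state: always pass the depth test, no culling, no depth writes
            ZTest Always Cull Off ZWrite Off

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _Range;

            // v2f_img and vert_img are Unity's built-in full-screen blit helpers
            float4 frag( v2f_img i ) : COLOR
            {
                // Same 5-tap box blur as the original shader
                float4 c = tex2D( _MainTex, i.uv );
                c += tex2D( _MainTex, i.uv + float2(  _Range, 0.0 ) );
                c += tex2D( _MainTex, i.uv + float2( -_Range, 0.0 ) );
                c += tex2D( _MainTex, i.uv + float2( 0.0,  _Range ) );
                c += tex2D( _MainTex, i.uv + float2( 0.0, -_Range ) );
                return c / 5.0;
            }
            ENDCG
        }
    }
    FallBack Off
}

Note that your script looks the shader up with Shader.Find("Custom/BlurShader"), so either keep your original shader name or update the string in the script to match.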