Texture.ReadPixels and iOS Problems

I seem to be having an issue reading the display buffer on iOS 6 devices. After a call to ReadPixels, the texture is just black. It works fine in the editor and on my Android devices, but not on iOS 6. I found some documentation saying that Apple changed the display buffer behaviour in iOS 6, so glReadPixels now functions differently: the frame buffer contents are no longer guaranteed to be readable after presentation unless retained backing is enabled. I tried a workaround that involves changing a boolean in the Xcode project, as follows…

extern "C" void InitEAGLLayer(void* eaglLayer, bool use32bitColor)
{
    CAEAGLLayer* layer = (CAEAGLLayer*)eaglLayer;

    NSString* colorFormat = use32bitColor ? kEAGLColorFormatRGBA8 : kEAGLColorFormatRGB565;

    layer.opaque = YES;
    // Retained backing = YES keeps the back buffer contents around after
    // presentation, so glReadPixels can still read them on iOS 6.
    layer.drawableProperties =  [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking,
                                    colorFormat, kEAGLDrawablePropertyColorFormat,
                                    nil
                                ];
}

That has not solved the problem for me. It's worth noting that all of my calls on the texture happen in OnPostRender: my camera calls a function on another script to perform the texture manipulation.

EDIT: For clarification, the function below is on the main camera, while the function it calls is in the script used to create the GUI.

function OnPostRender () {
    if (guiObj.GetComponent(TestGui).grab)
        guiObj.GetComponent(TestGui).OnPostRender();
}
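For context, the grab routine in TestGui follows the usual ReadPixels pattern. A minimal sketch in UnityScript (only TestGui, grab, and OnPostRender are the real names from my project; the texture field and formats are assumed):

// TestGui.js - sketch of the grab routine
var grab : boolean = false;
private var tex : Texture2D;

function OnPostRender () {
    // Allocate a texture the size of the screen and copy the frame buffer into it.
    tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
    tex.ReadPixels(Rect(0, 0, Screen.width, Screen.height), 0, 0);
    tex.Apply();
    grab = false;
}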

If anyone has any experience with an issue like this, some help would be greatly appreciated. It's quite frustrating that the code works on every device except my iOS ones -_-

I am using Unity 3.5.7f6.

I have the same issue: Texture.ReadPixels is returning a black texture on my iOS device. I spent a day looking for a workaround, but with no success :frowning:

Try turning anti-aliasing off.
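You can do that per quality level under Edit > Project Settings > Quality, or from script before the grab. A sketch (my guess is that with MSAA on, iOS renders into a multisampled buffer that ReadPixels can't read back directly):

// Disable MSAA before the frame you want to capture.
QualitySettings.antiAliasing = 0;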