OpenGLES error 0x0502

I have a project that uses two cameras: one for the game scene, one for the UI. They have mutually-exclusive culling masks (the UI camera only renders the UI layer, while the game camera renders everything else), and the UI camera clears depth only and is set to render on top of the game camera. This setup works fine on all platforms.
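For reference, the setup described above looks roughly like this in code (the layer name "UI", object names, and depth values are my placeholders, not anything from the actual project):

```csharp
using UnityEngine;

// Sketch of the two-camera setup: game camera renders everything except UI,
// UI camera renders only UI, clears depth only, and draws on top.
public class CameraSetupSketch : MonoBehaviour
{
    public Camera gameCam;  // assign in inspector
    public Camera uiCam;    // assign in inspector

    void Awake()
    {
        int uiLayer = LayerMask.NameToLayer("UI");

        gameCam.cullingMask = ~(1 << uiLayer);      // everything except the UI layer
        gameCam.clearFlags  = CameraClearFlags.Skybox;
        gameCam.depth       = 0;

        uiCam.cullingMask = 1 << uiLayer;           // UI layer only
        uiCam.clearFlags  = CameraClearFlags.Depth; // clear depth only
        uiCam.depth       = 1;                      // renders after (on top of) the game camera
    }
}
```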

I added a custom image effect to the UI camera that does a screen fade (it literally just blends between the source texture and a solid color). This setup fails on iOS: I get a black screen and Xcode spams “OpenGLES error 0x0502” (GL_INVALID_OPERATION), with no additional information.
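The image effect is nothing exotic; a minimal sketch of it would be something like the below (the material, and the `_Fade`/`_FadeColor` shader properties it would use, are assumptions for illustration, not my actual shader):

```csharp
using UnityEngine;

// Minimal screen-fade image effect of the kind described above:
// lerps the rendered frame toward a solid color by a fade amount.
[RequireComponent(typeof(Camera))]
public class ScreenFadeSketch : MonoBehaviour
{
    public Material fadeMaterial;            // hypothetical material whose shader blends toward _FadeColor
    [Range(0f, 1f)] public float fade = 0f;  // 0 = no fade, 1 = fully faded
    public Color fadeColor = Color.black;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        if (fadeMaterial == null || fade <= 0f)
        {
            Graphics.Blit(src, dst);         // pass-through when not fading
            return;
        }
        fadeMaterial.SetColor("_FadeColor", fadeColor);
        fadeMaterial.SetFloat("_Fade", fade);
        Graphics.Blit(src, dst, fadeMaterial);
    }
}
```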

I Googled the error and found plenty of hits, but they’re all old threads (circa 2011–2012) and there’s no common thread among them. Most involved very unusual rendering setups, usually with multiple custom render textures, and were “solved” by odd things like restarting the iOS device or reinstalling Unity.

None of those solutions work for me. The only things that do work are:

  • Disabling/removing the image effect (works, but is undesirable because, hey, I added it for a reason!)
  • Using only one camera (I can put the image effect on a single camera and it works, but then I don’t have the clean game/UI separation)
  • Checking “Use 32-bit display buffer” in player settings (works with the full setup, but absolutely wrecks the frame rate)

I also can’t find any general information on the error outside of Unity, as the opengl.org docs seem to be unreachable. :frowning:

Help?

Did you find a solution to this?

Sadly, no. I ended up working around it instead. Because my use case was a simple screen fade, I just nuked the offending image effect entirely and implemented the fade in OnGUI on the UI camera.
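In case it helps anyone else, the workaround amounts to drawing a tinted full-screen quad from OnGUI instead of touching the render pipeline at all; a rough sketch (field names are mine):

```csharp
using UnityEngine;

// OnGUI-based fade workaround: no image effect, no extra render textures.
// Draws a solid-color quad over the whole screen with alpha = fade amount.
public class GUIFadeSketch : MonoBehaviour
{
    [Range(0f, 1f)] public float fade = 0f;
    public Color fadeColor = Color.black;

    void OnGUI()
    {
        if (fade <= 0f) return;

        // Tint the built-in white texture with the fade color and alpha.
        GUI.color = new Color(fadeColor.r, fadeColor.g, fadeColor.b, fade);
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height),
                        Texture2D.whiteTexture);
        GUI.color = Color.white;  // restore so later GUI calls aren't tinted
    }
}
```

Since this runs on the UI camera’s object, the fade still draws over both the game scene and the UI, which was all I needed.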

I’d really love to know wtf the actual issue is/was though. :frowning: