I’m writing an augmented reality app. So far I’ve got the camera data and am rendering it to a GUITexture in a plugin using glTexSubImage2D. But it comes out flipped and rotated because of the different coordinate systems used by the iPhone and OpenGL.
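For context, the upload side of it is just the standard per-frame call into an already-allocated texture. This is illustrative rather than my exact plugin code, and the BGRA format is an assumption:

```c
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

/* Illustrative sketch: assumes a texture already allocated with glTexImage2D
   at the full camera frame size, and 32-bit BGRA pixels from the capture
   callback. GL_BGRA_EXT comes from the BGRA8888 texture format extension. */
static void UploadCameraFrame(GLuint texture, GLsizei width, GLsizei height,
                              const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
}
```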
So I’ve looked at:
1. Rotating the view and flipping/rotating the image using CGContext.
2. Drawing a textured quad and transforming that.
Number 1 didn’t behave itself: the CGContext transforms had no visible effect at all, so I’m clearly doing something wrong there. Roughly the kind of thing I was attempting is sketched below.
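This is only a minimal sketch of the CGContext flip/rotate I mean, not my actual code; the 90° angle and BGRA layout are assumptions:

```c
#include <CoreGraphics/CoreGraphics.h>
#include <math.h>

/* Draw `image` into a new bitmap, rotated 90 degrees and flipped vertically.
   Sketch only: assumes BGRA pixels and skips error handling. */
CGImageRef CreateRotatedFlippedImage(CGImageRef image)
{
    size_t w = CGImageGetWidth(image);
    size_t h = CGImageGetHeight(image);

    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    /* Width and height are swapped because of the 90-degree rotation. */
    CGContextRef ctx = CGBitmapContextCreate(NULL, h, w, 8, h * 4, rgb,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(rgb);

    /* Transforms are applied to the CTM before the draw; order matters. */
    CGContextTranslateCTM(ctx, h / 2.0, w / 2.0);   /* origin to the centre */
    CGContextRotateCTM(ctx, M_PI_2);                /* rotate 90 degrees    */
    CGContextScaleCTM(ctx, 1.0, -1.0);              /* flip vertically      */
    CGContextDrawImage(ctx,
        CGRectMake(-(CGFloat)w / 2.0, -(CGFloat)h / 2.0, (CGFloat)w, (CGFloat)h),
        image);

    CGImageRef result = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    return result;
}
```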
I’m stumped on number 2 because of my lack of GLES 2.0 knowledge: with the programmable pipeline I can’t just set up verts and texcoords and render any more. All I really want is a simple glDrawArrays() of a textured quad, something like the sketch below.
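This is the sort of thing I’m trying to end up with, written as a sketch under assumptions: the program is already compiled and linked, the camera texture is already bound, and the attribute names `a_position` / `a_texCoord` are made up. The exact corner-to-texcoord mapping would depend on which way the capture actually comes out, so treat those values as placeholders:

```c
#include <OpenGLES/ES2/gl.h>

/* Full-screen textured quad whose texture coordinates are rotated 90 degrees
   and flipped, so the orientation fix happens on the quad rather than on the
   pixel data. Assumes `program` has attributes a_position and a_texCoord. */
void DrawCameraQuad(GLuint program)
{
    /* x, y, u, v in triangle-strip order. */
    static const GLfloat quad[] = {
        -1.0f, -1.0f,   1.0f, 1.0f,
         1.0f, -1.0f,   1.0f, 0.0f,
        -1.0f,  1.0f,   0.0f, 1.0f,
         1.0f,  1.0f,   0.0f, 0.0f,
    };

    glUseProgram(program);

    GLint posLoc = glGetAttribLocation(program, "a_position");
    GLint texLoc = glGetAttribLocation(program, "a_texCoord");

    glVertexAttribPointer((GLuint)posLoc, 2, GL_FLOAT, GL_FALSE,
                          4 * sizeof(GLfloat), quad);
    glVertexAttribPointer((GLuint)texLoc, 2, GL_FLOAT, GL_FALSE,
                          4 * sizeof(GLfloat), quad + 2);
    glEnableVertexAttribArray((GLuint)posLoc);
    glEnableVertexAttribArray((GLuint)texLoc);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
```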
This is compounded by:
- I’ll need to get a better quality image and only render a portion of the texture when it is rotated (a rough texture-coordinate sketch of what I mean is after this list).
- The camera capture using AVFoundation does funny rotation things: if I turn the phone through 180° the image rotates through 360°!
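On the first point, my current thinking is that the cropping could also be done with texture coordinates rather than by copying pixels; something like the following, where the crop fractions are made-up examples and not values from my project:

```c
#include <OpenGLES/ES2/gl.h>

/* Sample only a centred sub-rectangle of the camera texture by shrinking the
   texture coordinates instead of copying pixel data. Sketch only. */
typedef struct { GLfloat u0, v0, u1, v1; } TexRect;

TexRect CenteredCrop(GLfloat cropX, GLfloat cropY)
{
    TexRect r;
    r.u0 = 0.5f - cropX * 0.5f;   /* left edge of the sampled region  */
    r.v0 = 0.5f - cropY * 0.5f;   /* top edge                         */
    r.u1 = 0.5f + cropX * 0.5f;   /* right edge                       */
    r.v1 = 0.5f + cropY * 0.5f;   /* bottom edge                      */
    return r;
}

/* Usage: TexRect r = CenteredCrop(0.75f, 1.0f); then use r.u0..r.v1 in place
   of the 0/1 texture coordinates in the quad sketch above. */
```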
Any hints or ideas?
John