I am trying to write a plugin for Unity that does the following on iOS using OpenGL ES 2.0:
- Read camera image
- Draw the camera image into an FBO
- Render the FBO contents into a Unity texture.
- Do stuff with the texture in Unity.

The plugin is called from the Update function of a script attached to the main camera.
The sticking point is drawing the textured quad into the offscreen FBO with glDrawArrays(GL_TRIANGLE_STRIP, 0, 4). If I comment out that line, the app runs, my 3D scene draws correctly, and everything runs at full framerate; all that is missing is the texture. If I enable the line, nothing draws and everything runs at about 3 fps. No errors are reported.
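(By "no errors" I mean that glGetError() returns GL_NO_ERROR after each call. The checks go through a small helper along these lines; this is a sketch, with the error values redeclared from the ES 2.0 spec so it stands alone rather than pulling in <OpenGLES/ES2/gl.h>.)

```c
/* Error codes as defined by the OpenGL ES 2.0 specification; on device they
 * come from <OpenGLES/ES2/gl.h>, redeclared here so this sketch is
 * self-contained. */
#define GL_NO_ERROR                      0
#define GL_INVALID_ENUM                  0x0500
#define GL_INVALID_VALUE                 0x0501
#define GL_INVALID_OPERATION             0x0502
#define GL_OUT_OF_MEMORY                 0x0505
#define GL_INVALID_FRAMEBUFFER_OPERATION 0x0506

/* Map a glGetError() result to a readable name for logging after each call. */
static const char *glErrorString(unsigned int err)
{
    switch (err) {
        case GL_NO_ERROR:                      return "GL_NO_ERROR";
        case GL_INVALID_ENUM:                  return "GL_INVALID_ENUM";
        case GL_INVALID_VALUE:                 return "GL_INVALID_VALUE";
        case GL_INVALID_OPERATION:             return "GL_INVALID_OPERATION";
        case GL_OUT_OF_MEMORY:                 return "GL_OUT_OF_MEMORY";
        case GL_INVALID_FRAMEBUFFER_OPERATION: return "GL_INVALID_FRAMEBUFFER_OPERATION";
        default:                               return "unknown GL error";
    }
}
```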
This is the complete draw function:
- (void)drawVideoFrame
{
    // Save the state Unity had bound so it can be restored afterwards.
    GLint nCurrentProg;
    GLint nCurrentFrameBuffer;
    GLint nCurrentRenderBuffer;
    glGetIntegerv(GL_CURRENT_PROGRAM, &nCurrentProg);
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &nCurrentFrameBuffer);
    glGetIntegerv(GL_RENDERBUFFER_BINDING, &nCurrentRenderBuffer);

    // Bind the offscreen target and match the viewport to the video frame.
    glBindFramebuffer(GL_FRAMEBUFFER, nFBOBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, nFBORender);
    glViewport(0, 0, videoWidth, videoHeight);

    // Full-screen quad in clip space; texture coordinates are flipped
    // vertically so the video appears upright.
    static const GLfloat squareVertices[] = {
        -1.0f, -1.0f,
         1.0f, -1.0f,
        -1.0f,  1.0f,
         1.0f,  1.0f,
    };
    static const GLfloat textureVertices[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
    };

    // Use the pass-through shader program and the video texture.
    glUseProgram(passThroughProgram);
    glBindTexture(GL_TEXTURE_2D, nTexID);

    // Update attribute values with client-side vertex arrays.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, GL_FALSE, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, GL_FALSE, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Restore Unity's state.
    glUseProgram(nCurrentProg);
    glBindTexture(GL_TEXTURE_2D, 0);
    glBindFramebuffer(GL_FRAMEBUFFER, nCurrentFrameBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, nCurrentRenderBuffer);
}
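One detail worth flagging about the function above: it saves and restores the program, framebuffer, and renderbuffer bindings, but not every piece of state the host engine may have set. In particular, if Unity leaves a vertex buffer object bound, the client-side pointers passed to glVertexAttribPointer are interpreted as byte offsets into that buffer rather than as CPU addresses. The following fragment is a hedged sketch (not tested against Unity's actual state, variable names illustrative) of extending the same save/restore pattern to the buffer and texture bindings:

```c
// Sketch: save additional state the host may have bound, clear it so
// client-side vertex arrays work, and restore it after the draw.
GLint prevArrayBuffer, prevElementBuffer, prevTexture;
glGetIntegerv(GL_ARRAY_BUFFER_BINDING, &prevArrayBuffer);
glGetIntegerv(GL_ELEMENT_ARRAY_BUFFER_BINDING, &prevElementBuffer);
glGetIntegerv(GL_TEXTURE_BINDING_2D, &prevTexture);

// Client-side attribute pointers require no VBO to be bound.
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

// ... bind the FBO, set attributes, and glDrawArrays as above ...

// Restore what the host had bound.
glBindBuffer(GL_ARRAY_BUFFER, prevArrayBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, prevElementBuffer);
glBindTexture(GL_TEXTURE_2D, prevTexture);
```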
The fragment and vertex shaders are simple pass-throughs:
Fragment:

varying highp vec2 coordinate;
uniform sampler2D videoframe;

void main()
{
    gl_FragColor = texture2D(videoframe, coordinate);
}
Vertex:

attribute vec4 position;
attribute mediump vec4 textureCoordinate;

varying mediump vec2 coordinate;

void main()
{
    gl_Position = position;
    coordinate = textureCoordinate.xy;
}
I took my plugin code and built it as a standalone app doing the same thing, and everything worked.
Note:
- I am aware that this example is slightly pointless, but it is a minimal reproduction of the problem.
- I can read the camera image and upload it directly to a Unity-owned texture, but that is not what I want to do here.
Does anyone have any idea what is wrong?