Using OpenGL ES in an AIR native extension on iPad

Question · August 1, 2013 · 1495 views

Hi,

I am trying to find out whether I can render images using the iPad's GPU from a native extension loaded by an AIR application. The main reason is that I need every bit of GPU power in my AIR application and I cannot use Stage3D (see http://forums.adobe.com/thread/1267084 for details).

The idea is to pass a bitmap from the ActionScript code to the native extension, render into it with Objective-C code and raw OpenGL ES, and send it back to the AIR side.
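For reference, here is a minimal sketch of the native side of that round trip using the standard FREBitmapData API (renderBitmap is just a placeholder name for the extension function):

#include "FlashRuntimeExtensions.h"

// Placeholder extension function: acquire the BitmapData passed from
// ActionScript, render into its pixel buffer, then tell AIR it changed.
FREObject renderBitmap(FREContext ctx, void* funcData,
                       uint32_t argc, FREObject argv[])
{
    FREBitmapData bmp;
    if (FREAcquireBitmapData(argv[0], &bmp) != FRE_OK)
        return NULL;

    // bmp.bits32 points at the pixels; bmp.lineStride32 is the row
    // stride in 32-bit units, and bmp.isPremultiplied tells us which
    // alpha convention to match when writing the rendered result back.
    // ... OpenGL ES rendering + copy into bmp.bits32 goes here ...

    FREInvalidateBitmapDataRect(argv[0], 0, 0, bmp.width, bmp.height);
    FREReleaseBitmapData(argv[0]);
    return NULL;
}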

Is this technically possible? I am afraid that the AIR runtime uses OpenGL ES for its own needs (at least for Stage3D support), so a native extension may not be able to share OpenGL with it.

Nevertheless, I gave it a try.

The first strange thing is that this initialization code:

myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
[EAGLContext setCurrentContext:myContext];

does not behave as I would expect. EAGLContext already holds a current context set up by someone else (presumably the AIR runtime), and I was only able to get an image when I did not create my own context at all, so these two lines are actually commented out in my test app.
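If the runtime has indeed installed its own context, a more conservative approach might be to reuse it, or to create a context in the same sharegroup, and to restore whatever was current afterwards. A sketch of what I mean:

EAGLContext *airContext = [EAGLContext currentContext];

// Either render directly on AIR's context (what effectively happens with
// the two lines above commented out), or create a private context that
// shares resources with it:
EAGLContext *myContext =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1
                          sharegroup:airContext.sharegroup];
[EAGLContext setCurrentContext:myContext];

// ... create the FBO, render, glReadPixels ...

// Hand the original context back so AIR's own rendering is untouched.
[EAGLContext setCurrentContext:airContext];

Note that kEAGLRenderingAPIOpenGLES1 is deliberate here: the drawing code below uses the fixed-function ES 1.1 API (glMatrixMode, client-side vertex arrays), which does not exist on an ES 2.0 context, so the kEAGLRenderingAPIOpenGLES2 in my original two lines would not have matched it anyway.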

Here is how I initialize the framebuffer:

glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);

glGenRenderbuffersOES(1, &colorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, width, height);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);

I do not need 3D, so I am not creating a depth buffer. Instead I need to render a lot of 2D polygons, and the drawing order is fine for me.
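A completeness check after the attachment call at least confirms the FBO itself is usable, and the viewport has to be pointed at the new buffer explicitly; a minimal sketch using the standard GL_OES_framebuffer_object calls:

GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
    NSLog(@"Framebuffer incomplete: 0x%x", status);
}
glViewport(0, 0, width, height); // the viewport does not track the FBO size automatically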

Then I tried the following code to render a single triangle specified in the vertexData array:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(-1.0f, 1.0f, -1.0f, 1.0f, -1.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);

glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

// vertexData contains 2D float coordinates followed by 4 bytes of color for each vertex
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 12, vertexData);

// The following two lines cause the whole screen to be filled with a random
// semi-transparent gradient. If I comment them out, it renders the black
// triangle that I actually expect to get.
glEnableClientState(GL_COLOR_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 12, (vertexData + 8));

// Draw the triangle:
glDrawArrays(GL_TRIANGLES, 0, 3);

// Read back the final image:
glPixelStorei(GL_PACK_ALIGNMENT, 4);
NSInteger datalength = width * height * 4;
GLubyte* rawdata = (GLubyte*)malloc(datalength * sizeof(GLubyte));
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, rawdata);
// At this point the rawdata array contains an image that I can convert
// and send back to my AS code.
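Two details matter when copying rawdata back into a BitmapData (a sketch based on general GL and BitmapData conventions; outdata is a hypothetical destination buffer): glReadPixels returns rows bottom-up, and GL_RGBA byte order typically differs from the BGRA layout that 32-bit BitmapData pixels use on this platform.

// Flip rows (GL reads bottom-up) and swap R/B while copying into the
// destination buffer (assumed here to be BGRA, e.g. FREBitmapData's bits32).
for (int row = 0; row < height; row++) {
    GLubyte* src = rawdata + row * width * 4;
    GLubyte* dst = outdata + (height - 1 - row) * width * 4;
    for (int col = 0; col < width; col++) {
        dst[col * 4 + 0] = src[col * 4 + 2]; // B
        dst[col * 4 + 1] = src[col * 4 + 1]; // G
        dst[col * 4 + 2] = src[col * 4 + 0]; // R
        dst[col * 4 + 3] = src[col * 4 + 3]; // A
    }
}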

So each time I try to specify per-vertex colors, the whole iPad screen gets filled with a random gradient. I have also tried the glColor function, and it causes the same effect. Disabling lighting and fog did not help.
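For completeness, this is the memory layout I intend, sketched as a struct. One caveat about the pointer arithmetic above: if vertexData is declared as a GLfloat*, the expression vertexData + 8 advances by 8 floats (32 bytes) rather than the 8 bytes the layout needs, so the struct form below is the safer way to express the offsets:

// 12-byte interleaved vertex: two position floats, then four color bytes.
typedef struct {
    GLfloat x, y;        // bytes 0..7
    GLubyte r, g, b, a;  // bytes 8..11
} Vertex;

Vertex vertexData[3];
// ... fill in positions and colors ...

glVertexPointer(2, GL_FLOAT, sizeof(Vertex), &vertexData[0].x);
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), &vertexData[0].r);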

So my main question is: is it technically possible to render an offscreen image in a native extension using OpenGL ES?

Or maybe the black triangle I did manage to get was rendered accidentally, and the whole thing should not work at all?


2 replies

October 8, 2014

Hi there,

I'm a total OpenGL newbie, but after quite some struggle I managed to bend OpenGL from an AIR extension to do what I wanted in my FlashyWrappers library for iOS (for superfast video capturing from AIR apps).

Basically you're right in the assumption that the OpenGL context is already initialized by AIR. I'm not sure about your code, because like I said I'm practically a newbie in that field, but because you share a context with AIR's OpenGL, it's entirely possible you're messing with its own rendering pipeline if you don't properly unbind / reset state, and that might cause the screen to go crazy.

I was able to do a slightly different thing: create a texture-backed FBO, bind it on every frame, and let AIR render into it (instead of the screen). I was then able to manipulate what AIR rendered (you can call glReadPixels on the contents too, by the way).

So I bet that if I can let AIR render its stuff into my texture-backed FBO, you can render your own things into it as well. I just bound it and waited for AIR to finish rendering into it; you could instead render into it yourself, then rebind AIR's original FBO, so that AIR renders to the screen again on the next frame.
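Something like this is what I mean (a rough sketch from memory, not the exact FlashyWrappers code; myFBO stands for whatever texture-backed FBO you created):

// Remember which framebuffer AIR is currently rendering into.
GLint airFBO = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING_OES, &airFBO);

// Bind your texture-backed FBO and do your own drawing there
// (or leave it bound and let AIR render the next frame into it).
glBindFramebufferOES(GL_FRAMEBUFFER_OES, myFBO);
// ... draw, or glReadPixels(...) to inspect the contents ...

// Put AIR's framebuffer back so its next frame hits the screen again.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, airFBO);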

I partially used this tutorial: OpenGL ES 2.0 for iPhone Tutorial Part 2: Textures, by Ray Wenderlich.

Oh, and I would like to add that playing with this works only in GPU or Direct render mode, not CPU.

February 5, 2014

Hi, did you ever find a solution for this problem? Did you get it working?