
Quick way to grab stage screenshot/framebuffer

Contributor ,
Sep 25, 2014


Hey, I'm working on a video encoding ANE. Everything works great on iOS except for one HUGE bottleneck.

Grabbing a single frame of an application composed of MovieClips, layers, etc. is very slow, because the only way I know of is BitmapData.draw, which of course redraws everything, including the effects. That is really slow on, for example, an iPad 3.

In my case grabbing the frame as fast as possible is critical, since I'm trying to achieve real-time video encoding. On the ANE side everything is optimized (AVFoundation with pixel buffer pools, etc.), but AIR is killing it with the draw method.


While poking around in the ANE, I made a list of the views AIR uses on iOS and saw that the whole app runs in a GL view. Could AIR provide a method to grab that GL view somehow? I tried hacking around it from the ANE with glReadPixels, which I know is slow, but it doesn't work anyway because the framebuffer is already empty by the time I call it from the ANE. I believe there's also a way to attach a texture to a framebuffer and have the content always available there, which would be great, but I'm not sure I can bind such a texture to AIR's framebuffer (I haven't tested this yet, and it seems unlikely).

I even managed to grab a screenshot from the ANE using a new iOS 7 method that can snapshot all the UIViews, but unfortunately that is even slower than AIR's draw.


I'm an OpenGL newbie. It would be cool if AIR took advantage of running in a GL layer and provided a clean method for accessing a memory location holding a texture/image from GL, which could then be used as a screenshot, movie frame, etc.




TOPICS
Air beta


Contributor ,
Oct 08, 2014


Answering myself: I managed to solve this after reading tons of tutorials and Stack Overflow questions/answers.

Basically, you need to create your own texture-backed FBO inside the ANE. Inside the ANE you have access to AIR's OpenGL context, which is a good start.
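A minimal sketch of that texture-backed FBO setup (OpenGL ES 2.0, plain C; the function and variable names here are my own illustration, not from the post, and error handling is left as a stub):

```c
#include <OpenGLES/ES2/gl.h>

static GLuint captureTexture = 0;
static GLuint captureFBO     = 0;

// Hypothetical helper: create an FBO whose color attachment is a texture.
// Call once, on the thread that owns AIR's EAGL context.
static void createCaptureFBO(GLsizei width, GLsizei height)
{
    // The texture that will receive AIR's rendered output.
    glGenTextures(1, &captureTexture);
    glBindTexture(GL_TEXTURE_2D, captureTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // The FBO with that texture as its color attachment.
    glGenFramebuffers(1, &captureFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, captureFBO);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, captureTexture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        // Handle the error; capture cannot work without a complete FBO.
    }
}
```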

Whenever you want to capture frames directly from AIR, you need to intercept AIR's rendering: call into the ANE on the ENTER_FRAME event and bind your texture-backed FBO. Then let AIR render into it (luckily, AIR WON'T rebind its own FBO once you have presented any FBO at least once; I don't know exactly why). So AIR happily renders into your texture-backed FBO, and on the next frame you can access the data with glReadPixels, or do even funkier things if you need to.
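The per-frame flow described above might look roughly like this (a sketch only; it assumes these two functions are exposed as FREFunctions that the ActionScript side calls from its ENTER_FRAME handler, which is my assumption about the wiring):

```c
#include <OpenGLES/ES2/gl.h>

// Sketch: called once per ENTER_FRAME from ActionScript, before AIR
// presents. AIR then renders the frame into captureFBO instead of its
// own framebuffer.
void bindCaptureFBO(GLuint captureFBO)
{
    glBindFramebuffer(GL_FRAMEBUFFER, captureFBO);
    // Return to AIR; it renders this frame into the texture-backed FBO.
}

// Sketch: on the NEXT frame, the previous frame's pixels are ready.
// glReadPixels on ES 2.0 returns RGBA bytes, bottom row first.
void readCapturedFrame(GLuint captureFBO, GLsizei width, GLsizei height,
                       unsigned char *outRGBA /* width * height * 4 bytes */)
{
    glBindFramebuffer(GL_FRAMEBUFFER, captureFBO);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, outRGBA);
}
```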

It is also possible to render this texture back to the screen to "emulate" AIR's rendering, because you lose it for a while when capturing: just render a quad textured with it. It's also a good idea to remember AIR's FBO ID (before you bind your own) and, when done capturing, bind AIR's FBO back so it can resume normal rendering.
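Remembering and restoring AIR's FBO can be sketched like this (the quad-drawing step is omitted for brevity; names are illustrative):

```c
#include <OpenGLES/ES2/gl.h>

static GLint airFBO = 0;  // AIR's own framebuffer, saved before we take over

// Before binding the capture FBO, record whatever AIR currently has bound.
void saveAirFBO(void)
{
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &airFBO);
}

// When capture is finished, hand the framebuffer back to AIR so it can
// resume rendering to screen normally.
void restoreAirFBO(void)
{
    glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)airFBO);
}
```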

This is much faster than relying on BitmapData.draw for delivering pixels to your ANE in general. It's especially fast for me because I'm also using texture caches, so I don't even need to call glReadPixels, which in itself isn't the fastest method around...
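The "texture caches" mentioned here are presumably CoreVideo's CVOpenGLESTextureCache, which lets you back the capture texture with a CVPixelBuffer so rendered pixels land directly in a buffer AVFoundation can encode, with no glReadPixels copy. A hedged sketch under that assumption (error handling omitted; the pixel buffer is assumed to be kCVPixelFormatType_32BGRA):

```c
#include <CoreVideo/CoreVideo.h>
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>  // for GL_BGRA on iOS

// Sketch: create a GL texture that shares memory with a CVPixelBuffer.
// Attach it to the capture FBO and AIR renders straight into the pixel
// buffer you then feed to an AVAssetWriterInputPixelBufferAdaptor.
GLuint textureForPixelBuffer(CVEAGLContext eaglContext,
                             CVPixelBufferRef pixelBuffer,
                             GLsizei width, GLsizei height)
{
    CVOpenGLESTextureCacheRef cache = NULL;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                 eaglContext, NULL, &cache);

    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA, width, height,
        GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

    return CVOpenGLESTextureGetName(texture);  // bindable GL texture id
}
```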
