chris.campbell
Legend
August 6, 2014
Question

Using StageVideo and Stage3D in AIR - BETA

  • 35 replies
  • 25858 views

We're planning on introducing a new ActionScript feature that will allow hardware-accelerated video to be used as a source texture in a Stage3D environment. Currently, using video with Stage3D requires the non-accelerated Video object and manipulation of bitmap representations of the video frames. The planned feature, called VideoTexture, will allow direct access to a texture object that is sourced from a NetStream or Camera object.

The following sample code demonstrates the use of the VideoTexture object.

1) Create a VideoTexture object and attach a NetStream (or Camera) object to the VideoTexture object:

var ns:NetStream;
var context3D:Context3D;
var texture:VideoTexture;

texture = context3D.createVideoTexture();
texture.attachNetStream(ns);
ns.play("video.3gp");
texture.addEventListener(VideoTexture.RENDER_STATE, renderFrame);

2) A texture representation of the current video frame can be retrieved in the callback function for the VideoTexture.RENDER_STATE event.

function renderFrame(e:Event):void
{
    // Render on stage3D with VideoTexture
}
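To make step 2 concrete, here is a sketch of what the callback body might look like once a shader program and geometry are in place. The buffer names (quadVerts, quadIndices) are hypothetical, as is the assumption that a program sampling texture register fs0 has already been uploaded:

```actionscript
function renderFrame(e:Event):void
{
    context3D.clear(0, 0, 0, 1);
    // Bind the video frame like any other Stage3D texture (sampler fs0)
    context3D.setTextureAt(0, texture);
    // quadVerts/quadIndices are hypothetical buffers for a full-screen quad
    context3D.setVertexBufferAt(0, quadVerts, 0, Context3DVertexBufferFormat.FLOAT_2);
    context3D.drawTriangles(quadIndices);
    context3D.present();
}
```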

Please note that this will be an "extended beta" feature.  The initial implementation, available in our AIR 15 beta, will be for AIR on Windows only.  We are committed to expanding this feature to AIR mobile and Mac in a subsequent release.  We'll also consider an implementation in Flash Player if there is sufficient demand.

Getting your feedback will be critical to making sure we're providing a solution that you can use.  Please let us know what you think and keep an eye out for updated beta release notes in the next couple of weeks.

This topic has been closed for replies.


itlancer
Inspiring
September 14, 2014

We just created a few applications (a video panorama player, a media gallery, and some experiments with the Starling framework) with the new AIR 15 beta release. I'm glad to report that this is an amazing and powerful feature!

For example, our 360° video panorama previously drew the Video object to a BitmapData every frame and recreated the texture, which gave 8-12 FPS for 4K video. With VideoTexture it holds a stable 60 FPS and leaves plenty of time for other application logic (profiled with Scout).
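For comparison, the pre-VideoTexture path described above looks roughly like this (a sketch; video, bmd, and bitmapTexture are hypothetical names for the Video object, a scratch BitmapData, and a regular Stage3D Texture):

```actionscript
// The old, slow path: a CPU copy and a GPU re-upload on every frame
function onEnterFrame(e:Event):void
{
    bmd.draw(video);                          // read the frame back on the CPU
    bitmapTexture.uploadFromBitmapData(bmd);  // re-upload it to the GPU
}
```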

We can also now create awesome effects using GPU-accelerated FLV video with an alpha channel. It works great too.

But I found one crash and two bugs:

Bug#3824231 - [Windows] Closing NetStream attached to VideoTexture causes crash

Bug#3824236 - [Windows] VideoTexture green color blink before playback

Bug#3824242 - [Windows] VideoTexture black color blink after playback

Please fix these and continue the implementation. We really need this on mobile platforms too, as soon as possible.

Another thing we need is some way to get the current frame's BitmapData from GPU-accelerated video without drawing a Video object. Maybe it could be fetched directly from the NetStream, or maybe from the VideoTexture.

It would be very useful for taking a screenshot of the video, or for editing the texture and drawing something over it. Right now we have no way to get BitmapData from video attached to a VideoTexture.
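One partial workaround, assuming the frame has already been rendered to the Stage3D back buffer, is Context3D.drawToBitmapData, which copies the current render contents into a BitmapData (a sketch; the 1024×1024 size is hypothetical):

```actionscript
// Capture whatever was just rendered, including the video quad
var snapshot:BitmapData = new BitmapData(1024, 1024, false, 0x000000);
context3D.drawToBitmapData(snapshot);  // copy the back buffer into the BitmapData
// snapshot can now be saved or edited on the CPU
```

This captures the composited scene rather than the raw video frame, so it is only an approximation of a true NetStream-level readback.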

By the way, Media Encoder CC 2014 doesn't support FLV video any more. But we often use video with an alpha channel in our AIR applications for cool effects, and VideoTexture gives us even more reason to use it. What should we do if the new Media Encoder no longer supports it?

Is there another video format with an alpha channel that we can use in AIR applications?

I know uncompressed MOV supports an alpha channel, but the output files are extremely large.

Participant
February 7, 2015

itlancer wrote:

Also now we can create awesome effects using GPU accelerated FLV video with alpha channel. It works cool too.

Can you share an example project with VideoTexture created from FLV video with alpha channel?

itlancer
Inspiring
February 7, 2015

Here is an example using a VideoTexture from FLV video with an alpha channel: itlancer/VideoTextureFLV · GitHub

Actually, it's additive blending: Context3DBlendFactor - Adobe ActionScript® 3 (AS3) API Reference

context3D.setBlendFactors(Context3DBlendFactor.ONE, Context3DBlendFactor.ONE);

And the FLV video with an alpha channel looks partially transparent, as mentioned here: Why is VideoTexture displaying my camera and flv streams as partially transparent?

There may be a better way to stream FLV with an alpha channel as a VideoTexture, but I don't know of one yet.
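For standard "over" compositing instead of the additive blending shown above, the usual Stage3D factors would be the following (a sketch, assuming the fragment shader outputs non-premultiplied alpha):

```actionscript
// Conventional alpha blending: src.rgb * src.a + dst.rgb * (1 - src.a)
context3D.setBlendFactors(Context3DBlendFactor.SOURCE_ALPHA,
                          Context3DBlendFactor.ONE_MINUS_SOURCE_ALPHA);
```

With premultiplied-alpha content, ONE / ONE_MINUS_SOURCE_ALPHA would be used instead.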

About bug 3935044: confirmed, it's fixed in AIR 17 beta.

Known Participant
September 6, 2014

Alright.

Maybe I'll wait and see what Adobe comes up with before I throw good ol' Flash Builder away.

You never know...  :-)

Known Participant
September 6, 2014

I FINALLY GOT IT WORKING ON MOBILE USING HTML5, CSS, JAVASCRIPT AND WEBGL.


• 24+ fps (solid)

• H.264 mp4

• 3000 kbps

• 2048 x 1024


WOOOOOOOOW!! WOOOOOOOOW!! WOOOOOOOOW!!


Flash Builder is going in the trash. Sorry.

Participating Frequently
September 6, 2014

LOL! Who let you in?

Known Participant
September 5, 2014

For what it's worth...

I've been able to accomplish H.264 GPU rendering (on a 3D object) using HTML5 and WebGL. Works out of the box. Also, now that WebGL is being implemented on mobile (iOS8 + Android), I'll be able to package the same web project material (HTML5, JS, CSS) into mobile apps.

Here's a working demonstration: http://www.lucid.it/next

(obviously this does not yet work on mobile)

Colin Holgate
Inspiring
September 5, 2014

The wireframe rotates nicely on my iOS 8 iPad. What would the reason be for the video not showing?

Known Participant
September 5, 2014

@ Colin,

It could be any of a myriad of reasons: the video format, large video dimensions, or unrefined code (on my part). I believe that with HTML5 and WebGL I'll (eventually) get this working nicely on mobile. The best part is that I can use the same code base and files! This was one of the primary reasons I enjoyed Flash and AIR - the whole 'write once, deploy anywhere' idea.

Known Participant
September 1, 2014

√ AIR for Windows

X AIR for Android

X AIR for iOS

X AIR for Windows OS

X AIR for Mac OS

Participant
September 1, 2014

I see no reason whatsoever to limit this to AIR. This is clearly not a security risk.

Also, the API here is unclear. Does the VideoTexture class extend from flash.display3D.textures.Texture? Either way, what does the event actually do? Tell us that there is a new video frame decoded? If so, do we have to listen for it? Will the texture remain available once the event propagation completes?

Participating Frequently
August 27, 2014

H.264 in Stage3D on AIR 'mobile' is simply "Awesome news"

Participant
August 19, 2014

This is terrific news!

One vote here for support on Flash Player.

Would love to be able to use AGAL on Stage3D to manipulate VideoTextures in realtime like PixelBender used to allow.

Pixel Bender + Video = Killer Runtime Effects | Brooks Andrus

Participant
August 17, 2014

This is fantastic news! This would be very valuable on iOS and Android. There are existing workarounds for getting video into Stage3D, which work well on powerful desktops, but the more limited mobile platforms would benefit the most.

Known Participant
August 14, 2014

Hmm. Seeing how I'm using a Mac and primarily want to make mobile apps, it looks like I'm going to be waiting a bit until I can start working with this.

I want to repeat one question here as well - Will this feature allow us to use an H.264 video file as a texture on an Away3D object? (in AIR Mobile)

Thanks again for moving in this direction, I'm very hopeful!

Anyone interested in this can also take a look here: https://bugbase.adobe.com/index.cfm?event=bug&id=3744843

Thanks Chris & Adobe!

Jason

chris.campbell
Legend
August 19, 2014

@Jason - Yes, the plan is to allow for H.264 in Stage3D on AIR mobile.

Participant
August 19, 2014

HUGE!!!