Inspiring
April 19, 2013
Question

How can I apply live camera filters?

  • April 19, 2013
  • 2 replies
  • 1427 views

I want to apply live filters to a smartphone camera (Android and iOS), so that you can see the filtered images on the display in real time.

To get the camera images onto the screen, I've read I can use this code:

var cam:Camera = Camera.getCamera();
var vid:Video = new Video();
vid.attachCamera(cam);
addChild(vid);

But how can I manipulate the camera pictures (rotate them, change their colors, ...) in real time? Do I have to manipulate the Camera object before attaching it to the Video object, like this:

class cam_fx extends Camera
{
    public function filter1()
    {
        //manipulate image
    }

    public function filter2()
    {
        //manipulate image
    }
}

var cam:Camera = Camera.getCamera();
var cam_fx:Camera = Camera.getCamera();
var vid:Video = new Video();
cam_fx = cam.filter1();
vid.attachCamera(cam_fx);
addChild(vid);

And how can I then access the camera images in my filter routines?

This topic has been closed for replies.

2 replies

April 21, 2013

Hello. Did you succeed after all?

BaCbDc (Author)
Inspiring
April 22, 2013

> Hello. Did you succeed after all?

No, I haven't written a single line of code so far. I'm still only considering what would be the best approach for writing a multi-device photo app.

sinious
Brainiac
April 20, 2013

To do this, you should look into getting the stream into something you can manipulate on a pixel-by-pixel basis to run filters, like a BitmapData object. You'd draw the stream into the object, then run filters, then display. This will be a very intensive task, however. An example of drawing a camera into a BitmapData can be found here:

http://stackoverflow.com/questions/9888799/resizing-bitmapdata-in-actionscript-3

That example is really just resizing video, but you get the general idea. Once you have the video frame in your BitmapData, the sky is the limit on the filters you can apply to the data.
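
Here is a minimal sketch of that approach, assuming a grayscale ColorMatrixFilter as the example effect; the 320x240 resolution, 30 fps capture mode, and ENTER_FRAME redraw loop are illustrative choices, not something prescribed by the thread:

import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.events.Event;
import flash.filters.ColorMatrixFilter;
import flash.geom.Point;
import flash.media.Camera;
import flash.media.Video;

var cam:Camera = Camera.getCamera();
if (cam != null)
{
    cam.setMode(320, 240, 30);

    // The Video object never goes on the display list; it only serves
    // as the draw source for the BitmapData.
    var vid:Video = new Video(cam.width, cam.height);
    vid.attachCamera(cam);

    var frame:BitmapData = new BitmapData(cam.width, cam.height, false, 0);
    addChild(new Bitmap(frame)); // the Bitmap updates whenever frame changes

    // Standard luminance weights arranged as a grayscale color matrix.
    var gray:ColorMatrixFilter = new ColorMatrixFilter([
        0.3, 0.59, 0.11, 0, 0,
        0.3, 0.59, 0.11, 0, 0,
        0.3, 0.59, 0.11, 0, 0,
        0,   0,    0,    1, 0
    ]);

    addEventListener(Event.ENTER_FRAME, function(e:Event):void
    {
        // Copy the current camera frame; passing a Matrix here could
        // also rotate or scale it.
        frame.draw(vid);
        // Run the filter in place on the copied frame.
        frame.applyFilter(frame, frame.rect, new Point(), gray);
    });
}

Swapping the ColorMatrixFilter for a BlurFilter, a ConvolutionFilter, or per-pixel getPixel/setPixel loops works the same way, just at different performance costs.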

The less intensive but far more complex approach would be to use a Stage3D-accelerated framework that accepts a NetStream source as the basis for a material. The calculations on the video could then be wrapped into pixel shaders. This is quite a bit more complex, but it would at least use the GPU. Note, however, that using the Camera class with Stage3D on iOS devices requires direct (not GPU) render mode (just a side note).
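
For a sense of what such a pixel shader looks like, here is a hypothetical AGAL fragment program for the same grayscale effect, written as the source string one might feed to com.adobe.utils.AGALMiniAssembler. The register layout is an assumption for illustration: v0 carries the interpolated UVs, fs0 is the texture holding the current video frame, and fc0 holds the luminance weights.

var fragmentSrc:String =
    "tex ft0, v0, fs0 <2d, linear, clamp> \n" + // sample the video texture at the UVs
    "dp3 ft1.x, ft0.xyz, fc0.xyz \n" +          // luma = dot(rgb, weights in fc0)
    "mov ft0.xyz, ft1.xxx \n" +                 // spread the luma value across RGB
    "mov oc, ft0";                              // write the final output color

Getting each video frame into that texture (for example by drawing it into a BitmapData and calling Texture.uploadFromBitmapData per frame) is the part that ties this back to the Camera/Video pipeline above.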

BaCbDc (Author)
Inspiring
April 20, 2013