I want to apply live filters to the smartphone camera
(Android and iOS), so that the filtered image is shown
on the display in real time.
To get the camera image onto the screen,
I've read that I can use this code:
var cam:Camera = Camera.getCamera(); // default device camera (may be null)
var vid:Video = new Video();
vid.attachCamera(cam);               // feed the camera stream into the Video
addChild(vid);
But how can I manipulate the camera picture
(rotate it, change colors, ...) in real time?
Do I have to manipulate the Camera object before
attaching it to the Video object, like this:
class cam_fx extends Camera
{
    public function filter1()
    {
        // manipulate image
    }

    public function filter2()
    {
        // manipulate image
    }
}
var cam:Camera = Camera.getCamera();
var cam_fx:Camera = Camera.getCamera();
var vid:Video = new Video();
cam_fx = cam.filter1();
vid.attachCamera(cam_fx);
addChild(vid);
And how can I then access the camera images in my filter routines?
To do this you should look into getting the stream into something you can manipulate on a pixel-by-pixel basis to run filters, such as a BitmapData object. You'd draw the stream into the object, run your filters, then display the result. Be aware, however, that this is a very CPU-intensive task. An example of drawing a camera into a BitmapData can be found here:
http://stackoverflow.com/questions/9888799/resizing-bitmapdata-in-actionscript-3
That example is really just resizing video, but you get the general idea. Once you have the video frame in your BitmapData, the sky is the limit on the filters you can apply to the data.
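To make that concrete, here is a minimal sketch of the approach (the onFrame handler name and the grayscale matrix are just placeholders of mine; assumes this runs in a document class or frame script):

import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.events.Event;
import flash.filters.ColorMatrixFilter;
import flash.geom.Point;
import flash.media.Camera;
import flash.media.Video;

var cam:Camera = Camera.getCamera(); // may be null if no camera is available
var vid:Video = new Video(cam.width, cam.height);
vid.attachCamera(cam); // the Video itself stays off the display list

var frame:BitmapData = new BitmapData(cam.width, cam.height, false, 0);
addChild(new Bitmap(frame)); // display the filtered copy instead

// Placeholder filter: a simple grayscale color matrix.
var gray:ColorMatrixFilter = new ColorMatrixFilter([
    0.3, 0.59, 0.11, 0, 0,
    0.3, 0.59, 0.11, 0, 0,
    0.3, 0.59, 0.11, 0, 0,
    0,   0,    0,    1, 0
]);

addEventListener(Event.ENTER_FRAME, onFrame);
function onFrame(e:Event):void
{
    frame.draw(vid); // copy the current camera frame into the BitmapData
    frame.applyFilter(frame, frame.rect, new Point(), gray); // filter in place
}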
The less intensive but far more complex route would be to use a Stage3D-accelerated framework that accepts a NetStream source as the basis for a material. The calculations on the video could then be wrapped into pixel shaders. This is quite a bit more complex but would at least use the GPU. However, note that the Camera class and Stage3D require using direct (not GPU) acceleration on iOS devices (just a side note).
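If you go the Stage3D route, one common way to bridge the camera in is to upload each frame from a BitmapData into a texture. A rough sketch of just that step, assuming a context3D you already obtained via stage.stage3Ds[0].requestContext3D() and the frame/vid objects from the sketch above (the shader and quad rendering are omitted):

import flash.display3D.Context3DTextureFormat;
import flash.display3D.textures.Texture;

// Stage3D textures need power-of-two sizes, so frame would have to be
// e.g. a 512x512 BitmapData to match this texture.
var camTexture:Texture = context3D.createTexture(
    512, 512, Context3DTextureFormat.BGRA, false);

function uploadFrame():void
{
    frame.draw(vid);                        // grab the current camera frame
    camTexture.uploadFromBitmapData(frame); // push it to the GPU
    // ...then render a textured quad with your pixel shader...
}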
Thank you. I also found this example that applies filters to a webcam: http://ohnit.wordpress.com/2009/11/17/webcam-3/
and the built-in filter classes in the flash.filters package: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/filters/package-detail.html
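For what it's worth, those built-in filters can also be assigned straight to a display object's filters array, which is the simpler CPU-rendered route (a minimal sketch, reusing the vid object from the first post):

import flash.filters.BlurFilter;

// The runtime re-applies the filter every time the Video is rendered.
vid.filters = [new BlurFilter(8, 8)];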
> the Camera class and Stage3D require using direct (not GPU) acceleration
> on iOS devices
What does that mean in practice when programming for iOS devices?
Can I write something like "if os == iOS, use direct acceleration, else use GPU acceleration" in my program? Or does the user have to change a general setting
("use direct acceleration") for all apps on his iOS device?
You choose the acceleration mode when compiling your app. Direct is usually the best choice anyhow: it offers both software and GPU paths, and the device/app decides which to use. If you specify GPU, you force GPU mode, which may leave some older devices unable to run your app, although very few these days and nothing modern that I know of. I use direct on every single project I do, because I use Stage3D as much as possible.
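For reference, the mode is set with the renderMode element in the AIR application descriptor, e.g.:

<initialWindow>
    <!-- valid values: auto | cpu | direct | gpu -->
    <renderMode>direct</renderMode>
</initialWindow>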
Hello, did you succeed after all?
> Hello, did you succeed after all?
No, I haven't written a single line of code so far. I'm still considering
what the best approach would be for writing a photo app for multiple devices.