Hey,
I'm new to AIR mobile development, although I have around a decade's experience with Flash. With AIR for mobile I'm a bit lost, so I was hoping someone could answer these questions for me...
1. Is there currently no way to use a phone's vibration with AIR? I remember seeing people ask this on these forums last year, and it still looks as if this extremely simple thing cannot be done? And if it can't, are there plans to have it in AIR 2.7?
2. When I select GPU acceleration, does it cache all bitmaps automatically, or do I still have to manually select "Cache as Bitmap" for everything I want cached? (I'm using Flash Pro CS5.5.) It seems to have no effect on frame rates whatever I do here, as long as GPU render mode is selected.
3. Would I get better performance using vectors or bitmaps for graphics?
4. What's the texture size limit for Android and iOS before it can no longer be cached?
5. Does it matter performance-wise if JPGs and PNGs are used for graphics instead of bitmaps? (When I've not selected Cache as Bitmap.)
6. What are the differences between AIR for Android and AIR for iOS? I've noticed that Android supports use of the camera and iOS doesn't. Is there anything else?
7. How can I embed an HD video in an AIR for Android app? I've seen this thread: http://forums.adobe.com/thread/849395?tstart=0
but it's for AIR on iOS. I wouldn't think it's any different for Android, but I'm getting a blank screen when I use the code from that thread (and I'm testing on an Android phone too, not in Flash > Test Movie).
Or if anyone has ANY tips for performance at all, I'd love to hear them.
1. I don't think there is a way to trigger the vibrate.
2. There are two types of caching: cacheAsBitmap and cacheAsBitmapMatrix. With cacheAsBitmap, whatever you are caching is turned into a bitmap version that can be read more quickly when composing the final scene. Bitmaps don't need that, as they are bitmaps already. cacheAsBitmapMatrix is the part that makes the bitmap be handled by the GPU. If you have set cacheAsBitmap and cacheAsBitmapMatrix on a DisplayObject, then in most cases it's residing on the GPU, and doing any translation, rotation, or scaling on it should be fast.
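A minimal sketch of what that looks like in code (the sprite and its shape are just placeholders for your own vector art):

```actionscript
import flash.display.Sprite;
import flash.geom.Matrix;

var hero:Sprite = new Sprite();          // hypothetical vector-drawn object
hero.graphics.beginFill(0xFF0000);
hero.graphics.drawRect(0, 0, 100, 100);
hero.graphics.endFill();
addChild(hero);

// cacheAsBitmap must be true for cacheAsBitmapMatrix to take effect
hero.cacheAsBitmap = true;

// The matrix controls the resolution the object is rasterized at;
// an identity matrix caches it at its current size
hero.cacheAsBitmapMatrix = new Matrix();
```

With both set, moving, rotating, or scaling `hero` should be handled by the GPU in GPU render mode.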
3. If you are using GPU and are applying cacheAsBitmap/cacheAsBitmapMatrix correctly, it shouldn't matter whether it was initially vectors or bitmaps. If you're using CPU, then bitmaps would probably be faster.
4. I don't know the limits for Android, but for the iPhone 4 and iPad it's 2,048 pixels; earlier devices are 1,024 pixels. If a texture is bigger than that, it won't be cached and will constantly be transferred into GPU memory.
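Assuming those limits, one way to avoid an oversized cache is to check the object's bounds before enabling caching (the 2,048 figure and the helper function name are just illustrative):

```actionscript
import flash.display.DisplayObject;
import flash.geom.Matrix;

// 2,048 for iPhone 4 / iPad, 1,024 for earlier devices (see above)
const MAX_TEXTURE_SIZE:int = 2048;

function cacheIfItFits(obj:DisplayObject):void {
    // Only cache if the rendered size fits in a single GPU texture;
    // anything larger would be uploaded to GPU memory over and over
    if (obj.width <= MAX_TEXTURE_SIZE && obj.height <= MAX_TEXTURE_SIZE) {
        obj.cacheAsBitmap = true;
        obj.cacheAsBitmapMatrix = new Matrix();
    }
}
```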
5. Historically, I think bitmaps are decompressed before they are used. If that's still the case, it shouldn't matter whether it was JPEG or PNG. If the bitmap is only decompressed when it's needed, then maybe there would be a difference in memory usage or performance, even between different levels of compression.
6. In CS5 there was a long list of differences between Android and iOS abilities. With CS5.5 there are only a few differences, and it isn't always Android that does more: iOS lets you use either of the two cameras, for example.
7. On an Android tablet I get whiteness too. StageWebView works differently on Android, and I can't quite see how to make it work. I have a friend who wrote a book on developing for AIR on Android; I'll ask her!
Hello,
I am the friend and I have been paying attention 🙂
I think I can answer all your questions.
I will try to reply a little later today when I get home.
While figuring out the whiteness, you could let me try on my test file to confirm that it fixes it for me.
I left your book at work, otherwise I would have checked in there already!
Cheers Colin! Very helpful.
Two last questions, I promise...
With cacheAsBitmapMatrix, would it improve performance if it was used on objects that move around the stage but don't change shape in any way or get rotated (like a square moving from X to Y)? I'm guessing it would.
And does CS5.5 use cacheAsBitmapMatrix automatically in any cases when GPU render mode is enabled, or do I always have to set it myself on whatever I want it used on? So far it's making no difference for me in tests... The only thing that's making a clear difference is switching between CPU and GPU render modes. It's as if GPU mode is using cacheAsBitmapMatrix automatically... but isn't that what GPU render mode in CS5.5 is meant to do anyway? What else would it use the GPU for otherwise?
@Veronique thanks! Looking forward to your reply.
Although cacheAsBitmapMatrix is implied to be needed only for scaling and rotation changes, I think it helps in general, especially if there are other things on top of the object in question.
I too would think that GPU render mode meant it automatically used the GPU, but it never behaves as if that is the case. It takes a significant amount of time for things to cache onto the GPU, and they can use a lot of memory, so it's probably best if you handle that yourself.
There isn't a cacheAsBitmapMatrix checkbox, so I think it always has to be set in code.
I still don't know a release date for AIR 2.7, but the performance the AIR team have implied for it when speaking at conferences makes it worth getting your whole app finished now and coming back to performance issues at the end. If 2.7 is out by then, you could save yourself weeks of effort trying to make it work well under 2.0 or even 2.6.
Thanks again. I'll have to do more testing and see if I can get cacheAsBitmapMatrix to actually make a difference to frame rates. I think part of the problem is the hardware I'm using to test: a Samsung Galaxy S2, the fastest phone around at the moment, faster than most tablets too, and it just runs all of my games and quick tests perfectly at a constant 60 fps in GPU mode. I'll have to try something more demanding...
Do you have any links about the performance of AIR 2.7? I can't find much about it.
When you do try lesser devices, don't think success means it must be 60 fps. Even 10 fps can seem smooth enough for scrolling lists. For gameplay it's nice to be smoother, but 30 fps ought to be fine. Just about every movie you've ever seen is 24 fps, and those seem smooth enough.
Read this thread:
http://forums.adobe.com/message/3661417#3661417
There are a few pointers about AIR 2.7 in there.
Thanks for the link!
And film is different from computer-generated graphics: film blurs fast movement within a single frame, which fools the brain into making it look smoother than 24 fps really is. Games, software, and everything else don't blur frames, so for games you have to use around 40+ fps to get roughly the same smooth feel as film at 24p. Most console games aim for 60 fps, and not a single game runs at 24 fps, because it would look appalling.
It might be more than is needed for most Flash games, but I think it gives a higher-quality feel, because it's easy to see the difference between 30 and 60 fps in Flash.
Hello,
I think Colin may have answered a lot of your questions, but here it is:
1. There is currently no way to make the phone vibrate from within AIR. You can read this post on how to create an Android/AIR hybrid application, which gives you extension access to some native functionality. Warning: it is somewhat advanced and not supported by Adobe:
http://www.jamesward.com/2011/05/11/extending-air-for-android/?replytocom=163323
You can expect to have this functionality in AIR at some point in the future, though perhaps not in 2.7.
2. In AIR for Android, there is no need to set GPU if you are only using bitmaps. Cache As Bitmap is only needed for vector art.
3. Bitmaps would be better, but performance is not the only consideration. Remember that memory is limited, and bitmaps can use it all, so vectors may be a good option if you have a lot of art. A great advantage of using AIR is that you have the choice between the two. My recommendation: if your art is going to be animated, create it as a bitmap, or use cacheAsBitmap if it only moves on the x and y axes and cacheAsBitmapMatrix if it transforms in other ways (rotation, scaling).
4. The limit on Android was 1,024×1,024 last time I checked.
5. JPGs are smaller in size, but PNGs look better. As a side note, because the PPI is so high on devices, everything looks great, so you can get away with lower-quality assets. Test all options.
6. The differences between the two platforms are minimal. The camera is the first, but Android also lets you access the media library (where all the images taken by the device are stored) where iOS does not. In general Android is more open.
7. Video is a very large topic. If you embed the video, you will have a black screen until it is fully loaded. I should say that I did a project with a 1 GB movie and it loaded right away. I have not used StageWebView to display the movie. Keep in mind that the way the video is encoded matters a great deal (use the baseline profile).
Here is some sample code:
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
import flash.events.NetStatusEvent;

var connection:NetConnection;
var video:Video;

video = new Video();
video.width = 480;
video.height = 320;

connection = new NetConnection();
connection.addEventListener(NetStatusEvent.NET_STATUS, netConnectionEvent);
connection.connect(null);

function netConnectionEvent(event:NetStatusEvent):void {
    event.target.removeEventListener(NetStatusEvent.NET_STATUS, netConnectionEvent);
    if (event.info.code == "NetConnection.Connect.Success") {
        var stream:NetStream = new NetStream(connection);
        stream.addEventListener(NetStatusEvent.NET_STATUS, netStreamEvent);
        var client:Object = new Object();
        client.onMetaData = onMetaData;
        stream.client = client;
        // attach the stream to the video to display
        video.attachNetStream(stream);
        stream.play("someVideo.flv");
        addChild(video);
    }
}

// NetStream status handler (watch for codes like "NetStream.Play.Start")
function netStreamEvent(event:NetStatusEvent):void {
}

function onMetaData(info:Object):void {}
I would recommend looking at my book; I cover a lot of these topics in detail (and went through the same hurdles as you):
http://oreilly.com/catalog/0636920013884
(you can get it on Amazon too).
Nice post, Veronique!
For the video code, I was getting "Access of undefined property netStreamEvent", but when I changed it to netConnectionEvent it worked. Although it's not running very well... the device is capable of playing full 1080p, but even a 720p video (baseline) is choppy. It's like it's not using the GPU for the video.
Is it possible to use Stage Video with AIR for Android with Flash Pro CS5.5? (http://www.adobe.com/devnet/flashplayer/articles/stage_video.html)
You can use the StageWebView class for better performance. Give it a try:
import flash.geom.Rectangle;
import flash.media.StageWebView;
import flash.filesystem.File;

var webView:StageWebView = new StageWebView();
webView.stage = this.stage;
webView.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);

// resolve the packaged video to a file URL the web view can load
var fPath:String = new File(new File("app:/MyMovie.mp4").nativePath).url;
webView.loadURL(fPath);
I've already tried that; apparently it works on iOS for people, but it's not working for me on Android.
One of the main advantages of using StageWebView just for playing video is that it uses hardware acceleration to play H.264 video. The way Véronique showed is the standard way of playing On2 VP6 FLV, which, as you've noticed, doesn't use hardware for decompressing.
I think I read a post here talking about StageVideo being in a future AIR release, so one day we can cut out the StageWebView middleman!
My method uses H.264 and takes advantage of hardware acceleration.
For video, do not set your application to GPU. Hardware acceleration for video is handled automatically by the device; you don't need to do anything. In fact, it is recommended not to set it. Note that this is very much device-specific (not something AIR can dictate). I have done most of my testing on a Samsung Galaxy Tab with very good results.
For video too, you can take advantage of the high PPI.
For the Star Trek video in the link below, I made the Video object 960 × 405, but the video footage is actually 640 × 272:
http://blog.everythingflex.com/2010/10/27/2010-max-sneak-peak-10-videos/
(scroll down to Video Tapestries)
StageVideo is not yet available for Android but may not be far away.
I'm using a 720p H.264 baseline video, and hardware acceleration using your method doesn't seem to be working on my Android device (with the latest Flash Player 10.3 and AIR 2.6 installed). The video is way too choppy to watch, yet the hardware in the Galaxy S2 phone I'm testing on is considerably faster than something like the Galaxy Tab you used. It can play back 1080p no problem, and 720p on YouTube with Flash is great.
Another issue is that the video will play before a game, so I need to use GPU render mode for the game, but you say not to use it for video?