Hi.
I am producing a large amount of graphics and sounds for an online elearning course.
Can I just confirm one thing please?
Should I put the graphics that are going to be shared between swfs into their own dedicated swf?
As it is cached, I suppose that would be the best idea.
BUT
Doesn't it also take longer for a game or screen to load if it has to import the particular mc from inside the large resources.swf?
Also, how exactly does it get to that mc?
For example I want mountain mc (which resides inside resources.swf) in my flyingGame.swf.
Does flyingGame have to import resources.swf and then "look inside" for mountain mc EVERY TIME?
Is this good or bad?
I get confused as to the best way to go about this. Somebody mentioned RSLs - is that better, or am I in fact doing that already as the libraries are cached and shared anyway?
How complex are the graphics assets? Any reason not to load them off the server as regular graphics and audio rather than in a SWF? The most efficient way is going to be literally only loading what you need and nothing more. If there are tons of little graphics in the games then you'll optimize even more with sprite sheets.
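If it helps, here's a rough sketch of what loading plain assets at runtime can look like - a PNG and an MP3 pulled straight off the server instead of compiled into a SWF. The file paths are placeholders for whatever the course actually hosts.
package
{
    import flash.display.Bitmap;
    import flash.display.Loader;
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.media.Sound;
    import flash.net.URLRequest;

    public class ExternalAssetDemo extends Sprite
    {
        public function ExternalAssetDemo()
        {
            // load a plain image file at runtime
            var imageLoader:Loader = new Loader();
            imageLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onImageLoaded);
            imageLoader.load(new URLRequest("assets/images/apple.png"));

            // load a plain mp3 at runtime; it starts playing as soon as
            // enough data has streamed in
            var word:Sound = new Sound(new URLRequest("assets/audio/apple.mp3"));
            word.play();
        }

        private function onImageLoaded(e:Event):void
        {
            // the loaded content is a Bitmap we can place anywhere
            addChild(Bitmap(e.target.content));
        }
    }
}
Each file is cached by the browser individually, so changing one picture only invalidates that one file rather than a whole library swf.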
Hi.
Well, I should say that they are not pictures as such but vocabulary.
Food: has orange, apple etc...
These mcs are used hundreds of times by different games so are imported as you comment at the end.
The other example is more for larger game apps such as a crossword I have. I import the graphics because I thought I would use them in other games and I certainly will but for now I'm thinking that there is a lag because of what I am doing.
The game has to import another swf and look for the assets it needs etc... So in that respect would it be better to put those graphics back into the game fla that requires them?
I just thought it would be a great idea especially for music because some tracks can get up to 500k. I am looping them but they are still big.
btw: when I put sounds inside a swf they get compressed even more, so that should be good, shouldn't it?
Nesting libraries inside others will do exactly as you've said: cause you to load unnecessary things. It also creates a dependency that may be completely unnecessary. If one game that uses a shared asset is updated, all the other games will update for nothing.
To take an example for speedy web design (other than sprite sheets), it's always been highly suggested to limit the number of HTTP requests. If you need to load 50 images, 30 audio files, etc, this isn't going to work terribly efficiently as external assets.
If you can manage to get the assets on sprite sheets, I feel these are the most flexible overall, especially with an atlas. Keep in mind external files are easy to edit, don't have the SWF bloat and are also cached.
The only thing SWF gives you is development speed and perhaps a speed advantage if you're loading dozens and dozens of external assets. If you went external and found you needed to load 50-100+ assets then wrapping it in a single SWF is far more advantageous.
Audio is an ideal target to bundle in a SWF if you can, because each file must otherwise be a separate download. That said, there's nothing SWF does to make the audio any smaller than you could make it in any other audio encoder at similar settings. MP3 is MP3, etc.
Your answer could lie somewhere in between, with multiple per-game strategies: SWFs for some games, external assets where possible elsewhere, and a combination of both. It really depends on your game.
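If you do go the sprite sheet route, here's a rough sketch of the idea - only a sketch, with the sheet URL and frame rectangles made up; a real atlas would come from the XML/JSON your packing tool exports.
package
{
    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.display.Loader;
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.geom.Point;
    import flash.geom.Rectangle;
    import flash.net.URLRequest;

    public class SpriteSheetDemo extends Sprite
    {
        // hypothetical atlas: where each frame sits inside the sheet
        private var atlas:Object = {
            apple:  new Rectangle(0,   0, 64, 64),
            orange: new Rectangle(64,  0, 64, 64),
            banana: new Rectangle(128, 0, 64, 64)
        };
        private var sheet:BitmapData;

        public function SpriteSheetDemo()
        {
            var loader:Loader = new Loader();
            loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onSheetLoaded);
            loader.load(new URLRequest("assets/food-sheet.png"));
        }

        private function onSheetLoaded(e:Event):void
        {
            sheet = Bitmap(e.target.content).bitmapData;
            addChild(getSprite("orange"));
        }

        private function getSprite(name:String):Bitmap
        {
            var frame:Rectangle = atlas[name];
            var bmd:BitmapData = new BitmapData(frame.width, frame.height, true, 0x00000000);
            // copy just this frame out of the big sheet
            bmd.copyPixels(sheet, frame, new Point(0, 0));
            return new Bitmap(bmd);
        }
    }
}
One HTTP request brings down every frame, and copyPixels hands each game just the regions it actually needs.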
There is a lot for me to look into here so I am going to dedicate the next few days to this. Thanks a lot.
With all due respect I highly disagree with the statement that loading individual assets is less efficient than stuffing them into swfs.
Numerous benchmark tests have demonstrated that loading smaller individual assets results in a faster aggregate load time than loading larger files that contain those same assets.
It is a matter of personal preference, of course, but I find it highly efficient (based on both empirical evidence and the inherent logic of request flow) to never include anything but pure AS code in SWFs and to load everything else (images, fonts, videos, text files, etc.) at runtime. Not only does this approach make assets available to the application faster, it also allows for more intelligent asset consumption on an "as needed" basis.
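As a minimal sketch of the "as needed" idea (the class and method names are mine, purely for illustration): fetch an asset only when something asks for it and hand it back through a callback; repeat requests for the same URL are normally served straight from the browser cache.
package
{
    import flash.display.DisplayObject;
    import flash.display.Loader;
    import flash.events.Event;
    import flash.net.URLRequest;

    public class OnDemandAssets
    {
        // loads one asset when it is first needed; the browser cache
        // makes any later request for the same URL essentially free
        public function getAsset(url:String, onReady:Function):void
        {
            var loader:Loader = new Loader();
            loader.contentLoaderInfo.addEventListener(Event.COMPLETE, function(e:Event):void
            {
                onReady(e.target.content as DisplayObject);
            });
            loader.load(new URLRequest(url));
        }
    }
}
Usage would be along the lines of: assets.getAsset("vocab/orange.png", function(img:DisplayObject):void { addChild(img); });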
Wow this post has been one of the most important for me as it's an elearning website supporting hundreds of kids to learn English in Valencia and a lot more coming on board next year. I have created this along with some freelancers for a "few bucks". Quotes for something like this can go from 100,000 euros upwards. For just one professional flash game I was quoted 5000 euros+ and from a lot of people. Well, we have created 100's so you can imagine what we have saved.
So yeah, I have s*** blood over the last two years.
BUT one thing is a little flash game and it's a whole different story when you talk about DESIGN PATTERNS and creating a whole virtual world with an MVC setup.
Thanks to Kglad + Sinious + especially Andrei you guys have guided me through a minefield.
I am not in the industry but have been studying my butt off for the last year and have come a long way. It's impossible however to grow professionally if you are not in the industry; there are too many areas which you just don't find in books, and I mean loads. I can't pay tens of thousands, but my job is important and I have risen to the top of English teaching in Valencia.
This area was driving me mad, as in an elearning system you are always loading resources. So yeah, it's difficult, especially when nobody really agrees - or maybe it's just that there are a hundred ways to skin a cat.
Anyway...
BENCHMARKING
1. The big swfs with 80 graphics + 80 sounds - big but they load fine. (I think I should break them into 4 swfs, as if I change just one graphic the whole swf has to be downloaded again, AND I only use 10 of those graphics per game anyway.)
2. The massive bottlenecks have been located and it looks like it is when I import the graphics and sounds for a screen.
I thought it would be a good idea to share big graphics and sound, BUT I pretty much only use them in one major app anyway.
I can see that 3 libraries are used along with xmls etc... and heavy graphics at that. Well, over the weekend I will optimize those areas ie: put them all into the actual fla instead of importing them.
THANK YOU ALL VERY MUCH
Of course there are a hundred ways to skin a cat, but it is difficult to argue with math.
The class below performs a simple benchmark test that clearly demonstrates that loading multiple assets is faster than loading a single one (it is meant to be used as a document class).
I just found a 2MB image on the Internet and a smaller image with a file size of 79KB. The code downloads the 2MB file once, and the 79KB file as many times (26, cache busted) as it takes to load approximately 2MB total.
This is a rudimentary test - it does not take into account network or server conditions. Even then, on average it ends up 25-30% faster to load 26 images than a single file with the same final size.
If you observe traffic (using Fiddler, Charles, a browser dev tool, etc.) you will see more detailed request dynamics (the Chrome dev tool has a nice requests timeline and graphical representation of each request). When latency, dns resolution, blocking and waiting are filtered out you will end up with a clear picture of pure load time.
Even if it were just about comparing file sizes, this observation prompts one to host any non-script files externally. The fact that individual smaller images are available for application consumption a hundred times faster than if they were delivered in a single file presents us with a huge opportunity in terms of improving user experience.
In addition, with optimization goals in mind it is counter-productive to treat all mime types uniformly. Although in the case of bitmaps, texts and swfs you need to wait for a complete load, as far as streaming assets (sounds, videos, etc.) go, the latency between the time you request them and the moment they become useful for the end user is WAY shorter than with images.
All of a sudden this is not a matter of opinion but rather of how one wants to deal with these statistics. As usual, business process and development resource availability intervene with application optimization. But there is no question in my mind that in user-centric applications asset management is one of the cornerstones of success. With an Internet-based application it is just not very prudent not to take advantage of this inherent capacity for granular content distribution and consumption.
package
{
    import flash.display.Loader;
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.net.URLRequest;
    import flash.text.TextField;
    import flash.text.TextFormat;
    import flash.utils.getTimer;

    public class LoadBenchmarks extends Sprite
    {
        private var largeSTime:int;
        private var largeTime:int;
        private var smallSTime:int;
        private var smallTime:int;
        private var numToLoad:int;
        private var loadCounter:int;
        private var feedback:TextField;

        // large: http://reciperhapsody.files.wordpress.com/2010/06/watermelon-tomato-salad-5-31-10.jpg - 2MB
        // small: http://www.finecooking.com/CMS/uploadedimages/Images/Cooking/Articles/Issues_111-120/051118020-02-mi... - 79kb

        public function LoadBenchmarks()
        {
            init();
        }

        private function init():void
        {
            drawFeedback();
            loadLarge();
            loadSmall();
        }

        private function drawFeedback():void
        {
            feedback = new TextField();
            feedback.defaultTextFormat = new TextFormat("Arial", 12);
            feedback.multiline = feedback.wordWrap = true;
            feedback.width = stage.stageWidth;
            feedback.height = stage.stageHeight;
            addChild(feedback);
        }

        private function loadSmall():void
        {
            // the large file is 2MB - work out how many times the 79KB file
            // needs to be loaded to reach roughly the same total size
            var i:int = loadCounter = numToLoad = Math.ceil(2055333 / 80798);
            var loader:Loader;
            smallSTime = getTimer();
            while (i--)
            {
                loader = new Loader();
                loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onSmallLoad);
                // the random query string busts the cache so every request hits the server
                loader.load(new URLRequest("http://www.finecooking.com/CMS/uploadedimages/Images/Cooking/Articles/Issues_111-120/051118020-02-mi...?" + Math.random()));
            }
        }

        private function loadLarge():void
        {
            var loader:Loader = new Loader();
            loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLargeLoad);
            largeSTime = getTimer();
            loader.load(new URLRequest("http://reciperhapsody.files.wordpress.com/2010/06/watermelon-tomato-salad-5-31-10.jpg?" + Math.random()));
        }

        private function onLargeLoad(e:Event):void
        {
            largeTime = getTimer() - largeSTime;
            writeResult();
        }

        private function onSmallLoad(e:Event):void
        {
            loadCounter--;
            if (!loadCounter)
            {
                smallTime = getTimer() - smallSTime;
                writeResult();
            }
        }

        private function writeResult():void
        {
            feedback.text = "large file loaded in " + largeTime + " ms\n\n";
            feedback.appendText(numToLoad + " small files loaded in " + smallTime + " ms\n\n");
            // only compare once both tests have finished
            if (largeTime && smallTime)
            {
                feedback.appendText("small files load time difference: " + int((1 - (smallTime / largeTime)) * 100) + "%");
            }
        }
    }
}
One more thing that I, personally, always forget.
Based on your descriptions it feels like you load many MBs of data. Do you know how many MB worth of files your application loads?
Browsers have limits on how much cache is allocated to them. For example, if a browser has only 100MB of cache allotted and those 100MB are used, the browser will clear the cache (as I understand it, on a first-in-first-cleared basis).
This means that if the application is 100MB combined, practically speaking it will never be cached, and each time the user reloads it all the files will be fetched from the server again.
WOW - a massive thanks again.
I should be spending today optimizing certain areas of the site according to your guidelines. I can't wait to get down to the school tomorrow and load the site on 25 computers at the same time. I know I won't get it perfect straight away, but things should get a lot better. I especially want to lighten the home screen and menu screens as they are loaded so many times. I didn't know about the 100MB cache limit for browsers. If that is the case, users like me must be well over that limit. If you play facebook games you can imagine how much is stuffed into the cache. However I don't seem to have my cache emptied, ie: the games load quickly, but I suppose that sites like farmville are a lot more optimized than my site.
In conclusion
a. I will lighten the main screens
b. I will break down the larger swfs, ie: if a game uses 10 images then I will use 10 images. Down the line I will take a look at spritesheets for that, although that is out of my comfort range at the moment.
c. re-read these posts to try and glean anything that may have escaped me.
Cheers
btw: IMPORTANT: VECTOR vs BITMAP
I am developing for multiple platforms
1. Mobile: Casual games: Kids and adults tend to carry their mobile at all times, tablets a lot less (apart from at home)
2. PC: At schools (a lot) + at home (less so but still quite a bit)
So is it fair to say that:
1. PC: Over the internet: Develop as many vector graphics as possible, ie: they weigh a lot less than pngs and jpegs. They take more CPU to render, but who cares on a modern PC?
ie: I could get 500kb down to 70kb if I use vectors for my menu screens etc... (in actual fact they are a mixture of vectors and pngs, and the vectors weigh hardly anything).
2. Mobiles and tablets: Try and use starling for GPU and definitely use pngs and as little vector art as possible.
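A quick sketch of the bitmap side of that trade-off - drawing vector art into a BitmapData once so the player (or a Starling texture built from it) works with pixels instead of re-rendering vectors every frame. The circle here just stands in for any vector artwork.
package
{
    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.display.PixelSnapping;
    import flash.display.Sprite;

    public class RasterizeDemo extends Sprite
    {
        public function RasterizeDemo()
        {
            // any vector art works here; a simple filled circle keeps the
            // sketch self-contained
            var art:Sprite = new Sprite();
            art.graphics.beginFill(0x336699);
            art.graphics.drawCircle(50, 50, 50);
            art.graphics.endFill();

            // rasterize once, then only the bitmap is displayed from now on
            var bmd:BitmapData = new BitmapData(100, 100, true, 0x00000000);
            bmd.draw(art);
            addChild(new Bitmap(bmd, PixelSnapping.AUTO, true));
        }
    }
}
The vector source stays tiny on the wire; the rasterization cost is paid once at runtime instead of on every frame.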
OH - and sorry to be a pain in the butt - but now that I am testing stuff for tomorrow:
Just re-read a little.
For an image like a background for a game, do I:
a. Import it as an external swf
b. Import it as an external image
c. Keep it in the original game swf where it is required.
Same question with a sound loop (a bg music so quite long...)
btw: I have some music loops which are around 500kb. That's quite a lot, ie: if I didn't have them then the swf would be half as light or more. Are there any decent music loops out there that weigh a lot less?
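For reference, a minimal sketch of what option (b)-style background music could look like - keeping the 500kb loop out of the game swf, streaming it at runtime and looping it. The MP3 path is just a placeholder.
package
{
    import flash.display.Sprite;
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.net.URLRequest;

    public class BackgroundMusic extends Sprite
    {
        private var channel:SoundChannel;

        public function BackgroundMusic()
        {
            // the loop lives on the server, not inside the game swf
            var loop:Sound = new Sound(new URLRequest("audio/menu-loop.mp3"));
            // playback can begin while the file is still streaming in;
            // int.MAX_VALUE effectively loops it forever
            channel = loop.play(0, int.MAX_VALUE);
        }
    }
}
That keeps each game swf light, and the browser cache means the same loop shared across games is only downloaded once.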
Gerry,
I feel like you are trying to find a single correct approach that will fit all. If my impression is correct - I am afraid there are unrealistic expectations in finding an authority that will provide answers leading to perfection. There are no hard and fast rules. And not everything is under your control.
Without knowing business logic, objectives, and application architecture it is impossible to define optimal approaches.
To summarize what I already stated, based on my experience and realities of Internet based applications here are some of my PERSONAL approaches (which may be sub-optimal or even totally wrong under some circumstances):
I don't have anything to say about complex animations, as my line of work does not require dealing with them. But still, if you have an opportunity to use the GPU, an obvious choice would be to use bitmaps and manipulate BitmapData.
Unfortunately, without a good server side there is always a ceiling to performance. This speaks to your observations regarding Facebook games' performance. I am definitely spoiled in the sense that I have the best CDN and Cloud capabilities at my fingertips. I don't have to agonize over server-side performance. For you it is probably a different story, because you may not have enough financial resources to employ such monsters as AKAMAI and Amazon Cloud (although Amazon is getting more and more agreeable to small-business pockets).
As far as platforms go - cross-platform uniformity and compatibility is still a dream. You will have to cater to each particular platform to a large extent. It just helps to have an architecture that allows for efficient replacement of parts.
Again, all this is theoretical in my opinion. If something is decent - screw perfection! What is an optimal balance between quantity and quality in your business? How long are your users willing to wait for an experience? How badly does waiting time affect quality? As soon as benchmarks hit tolerable levels, that may be a good enough excuse for moving on.
Thanks Andrei.
You have given me a lot of information to look through and obviously this is a steep learning curve for me. I have a lot of time at the moment so I'd like to try out all your techniques above and see which ones come out on top.
Cheers
btw: I load a food.swf with the 80 food movieclips. First of all I was going to split them into 4 - 8 smaller swfs as you said, BUT my question is that you said to keep resources external.
1. They are external, ie: game.swf will load them.
2. Or do you mean outside of any swf? But they are movieclips drawn inside the fla; they are not bitmaps or anything like that.
So it's fine what I'm doing right? The sounds are also in that swf although I read you said it would be better to keep them out for streaming purposes. But they are only one word mp3s.
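For reference, this is roughly what "game.swf looks inside food.swf" boils down to - a sketch only, where "Apple" stands in for whatever linkage name the food.fla library actually exports.
package
{
    import flash.display.Loader;
    import flash.display.LoaderInfo;
    import flash.display.MovieClip;
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.net.URLRequest;
    import flash.system.ApplicationDomain;

    public class SharedLibraryDemo extends Sprite
    {
        private var libDomain:ApplicationDomain;

        public function SharedLibraryDemo()
        {
            // the library swf is downloaded (or pulled from cache) once
            var loader:Loader = new Loader();
            loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLibraryLoaded);
            loader.load(new URLRequest("food.swf"));
        }

        private function onLibraryLoaded(e:Event):void
        {
            // the loaded swf's class definitions live in its ApplicationDomain
            libDomain = LoaderInfo(e.target).applicationDomain;

            // instantiating a symbol after that is just an in-memory lookup -
            // nothing is downloaded again (assuming "Apple" is exported for
            // ActionScript with MovieClip as its base class)
            var AppleClass:Class = Class(libDomain.getDefinition("Apple"));
            addChild(MovieClip(new AppleClass()));
        }
    }
}
So the "look inside" step only costs a download the first time; every later use of a symbol from the same loaded library is just a lookup.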
I do like the idea however of streaming longer sentences.
You mentioned we can load off the hard disc - now that would be perfect. You see, I sell to schools, so those schools could load the resources onto their hard discs and voila, all the bandwidth problems would go away. If somebody wants to play at home then they could also load the resources onto their hard disc. I would need a program on the CD to set up the files where they are supposed to go so my flash program can find them, though.
Me thinks me runs the risk of trying your patience.
I'll stick to my guns for this application. Rather than the overhead of (as mentioned) 160 image+sound requests, given slim development budgets and, as Andrei1 mentioned, limited cache, I'd still stick to a single SWF.
The benefit you get from multiple HTTP requests is concurrency on the download. This is the approach all those old "super speed downloader" apps used to take: multiple HTTP requests at various 'resume' points in a single file, knowing that a single request can be throttled. But if your network isn't top notch or distributed, this can slow you down and overwhelm you very quickly.
Here's a helpful link on some browser download concurrency per domain if you have the servers to handle the concurrency load:
http://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser
Do note the user's mention of multiple subdomains. Often a base feature of a quality content delivery platform, your requests can be split across subdomains to multiply that concurrency, e.g. images.yourdomain.com, video.yourdomain.com, sounds.yourdomain.com. Just make sure you can handle it.
You'll get diminishing returns on all the request overhead if the files you load aren't large enough however. 1MB would be a good limit. If you're loading sheets that large then I'd keep them external. If you're loading 80 small images, the server load, HTTP overhead and lack of throttling on an average network isn't worth it IMO.
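As a tiny sketch of that subdomain split (the host names are hypothetical, and each one needs a server or CDN alias behind it that can actually take the extra load):
package
{
    public class AssetUrls
    {
        // hypothetical subdomains, one per asset type, to raise the
        // per-domain connection limit in the browser
        private static var hosts:Object = {
            image: "http://images.yourdomain.com/",
            sound: "http://sounds.yourdomain.com/",
            video: "http://video.yourdomain.com/"
        };

        public static function forType(type:String, file:String):String
        {
            return hosts[type] + file;
        }
    }
}
Usage would be something like: new URLRequest(AssetUrls.forType("image", "menu/background.png"));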
Hi.
There is a lot of information in this post now. I will try out all the methods and report back. All this knowledge has helped me a lot to understand how to do things better and how they work.
BUT
I tried out a few tests this morning and to my horror the "problems" seem to be somewhere else. So all these comments will help greatly but the problem is in our code.
TEST
I uploaded a new menu.swf that weighed only 270kb to replace the other one of 1MB+, thinking that should do the trick. To my horror it loaded just as slowly (uncached, as I empty the cache in Firefox for testing).
It took 13 seconds - what a disaster.
So on the good side my asset loading doesn't seem to be too bad (although it can get a lot better).
Bad side: my menu screen is doing some requests to the server database with some xml in there etc... and that is where I am completely lost.
That has nothing to do with this post, should I try and create another post?
It doesn't have much to do with this post but it's still on the general track of helping you optimize your project.
Have you load tested your server? 270KB would load instantaneously on just about any system with decent net. A rusty old DSL connection at 1.0Mbit would load that in ~2.1 seconds at full speed and most people have been well beyond that for a decade or more. The issue must be hardware, poorly performing software or an overwhelmed network. 13 seconds is a very, very long time.
My internet connection
download speed: 2.6MB
upload speed: 0.27
As I say, I tested the 250kb + 570kb + 1MB+ versions - all at around 11 seconds.
1. I have a hosting company. I am on a shared server. Is there a way of testing them, or have I just done that above?
2. btw: I think the problem could be in the code, ie: I am loading a menu with 20 padlocks (like angry birds). The padlocks get their names from an xml and each padlock has between 1-5 stars. The number of stars per padlock is held in a mysql database, so we query the database as well. However, somebody told me that requests to a database take less than a second anyway.
OK - because I like torturing myself.
Hit a record.
The 500kb menu loaded after 28 seconds. Now I think you know how I feel. I could also ask the hosting company if they know anything about all this and, as I say, look at my code and what exactly is going on. Where is this friggin bottleneck?
Off to bed. Will investigate over the next two days.
Hi guys.
Just a quick update.
I have decided to interrogate the hosting company. I am on a shared server, so I asked if loading times would be better on a dedicated server and they said, of course. OK, but please have a look at my situation and tell me if I am getting my money's worth. I am fine about paying ten times the fee if I get ten times the results. So hopefully tomorrow I will get some good answers.
Akamai and Amazon cloud etc... Well, I go on the site and understand nothing, but here in Spain there is a large company called Arsys which has monthly rents of around 200-250 euros. If I see the load times where they should be then I'm there next week.
btw: reading about CLOUD vs DEDICATED server. Mixed reviews. If I have an elearning site with 1MB here 1MB there and 100 concurrent users I need it to work fast, so I suppose I should have a dedicated server anyway. I suppose if I give them my details they can advise me which plan to take out. Cloud instead of dedicated server - is that best?
Anyway, I'll keep you guys posted as I know you can't sleep at night till you find out.
Cheers
Didn't know what a CDN was, so wikipedia...
WOW - looks good. We don't know if the problem is the network or our code yet but as I haven't got a lot of time to waste and seeing as there will be a lot of traffic next year I am going to switch hosting companies.
Now I am faced with the choice of CLOUD or CDNs which seem to be a hybrid of dedicated servers and other "things".
Andrei mentioned Amazon and Akamai.
Just read about telcos' advantages, so that looks like a good choice too. Perhaps Telefonica Spain should have some good deals going, and their sales support is great.
Because they own the networks over which video content is transmitted, telco CDNs have advantages over traditional CDNs.
They own the last mile and can deliver content closer to the end user because it can be cached deep in their networks.[18] This deep caching minimizes the distance that video data travels over the general Internet and delivers it more quickly and reliably.
Telco CDNs also have a built-in cost advantage since traditional CDNs must lease bandwidth from them and build the operator’s margin into their own cost structures.
Please note that you need to be aware of your role in any dedicated server choices. Some come with software, hardware and uptime monitoring, some come with nothing more than hardware failure monitoring. In the US it's typical that all they monitor is hardware failure. They will hand the keys of a box over to you and then it's your responsibility to make sure everything is running, including the web server, mysql server, etc etc. In that case, if your entire site goes down and you don't notice, they aren't responsible for telling you. If you don't really know how to configure server software it isn't even an option you should explore unless you pay them to maintain the software and uptime IMHO. Otherwise you can fail to patch any aspect or just set the servers up insecurely and get exploited.
Think into the future (500+ concurrent users, 1000+, etc) and plan for that. Try to find a host that can scale with you at minimal cost. Often they charge enormous prices for what you'd consider dirt cheap, like $500 for a 1TB HD you could get for $50. You might save yourself a lot of money working with a local host that will take a server you buy yourself, avoiding their extreme costs and saving monthly on the rented hardware, but then you're responsible for any failures out of your own pocket. It still works to your advantage.
100 concurrent user traffic is very minimal to a server, but the site needs to be developed properly. If you're running 90 logic based join queries without proper indexing on any database per user your site will slow down to a crawl. Otherwise a basic modern quad xeon with 32GB RAM and a RAID 1+0 will easily serve 1000s of people without breaking a sweat. You can start with as little as a hex core AMD, 16GB RAM, single enterprise HD and easily serve hundreds.
CDNs are great but their cache can be a disadvantage as well. A CDN is nothing more than globally distributed serving. The user gets served from the geographically closest "mirror" of your site. However keep the "mirror" in mind. Sometimes CDNs have slow mirror schedules. Changes you do sometimes aren't reflected on the mirrors for hours or even a day. You shouldn't be rolling out rapidly anyhow, unless it's a vital fix. In that scenario you're typically required to have a local sandbox to develop on (you should have one anyhow).
Either dedicated or CDN starts to get into the "limited" bandwidth/space category. When you leave the cheaper shared hosting you really need to understand how much bandwidth will cost you. That's when the question of 512KB nav becomes how can you reduce it to 52KB ;P.
The DB query can take less than a second, but you can easily verify this by making the request yourself without flash. You should already have a sandbox script that tests the query outside flash from when you were developing. See how fast it's returned. On even the worst shared hosts I've used, a single query that joined no more than 10 tables with no stored procedures and only basic indexing took milliseconds.
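While that sandbox script gets set up, a rough complement from the Flash side is to time just the data call with getTimer(), so the query round trip can be separated from asset loading. The endpoint name here is made up for illustration.
package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.net.URLLoader;
    import flash.net.URLRequest;
    import flash.utils.getTimer;

    public class QueryTimer extends Sprite
    {
        private var start:int;

        public function QueryTimer()
        {
            var loader:URLLoader = new URLLoader();
            loader.addEventListener(Event.COMPLETE, onData);
            start = getTimer();
            // cache-busted so the request always hits the server
            loader.load(new URLRequest("getPadlockStars.php?" + Math.random()));
        }

        private function onData(e:Event):void
        {
            trace("menu data returned in " + (getTimer() - start) + " ms");
        }
    }
}
If that number is anywhere near the 13-30 seconds you're seeing, the bottleneck is the PHP/database side rather than the swf.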
TEST: Menu screen swf (no db calls) Just uploaded
Size: 1.2MB
Load speed: 4-5 seconds
Yesterday TEST: Menu screen with php sql calls to db
Size: 250kb then 520 kb
Load speed: varied from 13 to 30 seconds (the latter late at night)
So this shows there is a massive lag with the database calls. However, when cached the screens load in 1 second. How is db info cached? Perhaps in a shared object? ie: Even though the page was cached the db query results aren't cached are they?
So as you say, it would be a case of carrying out the queries outside of flash. BUT I don't know how to do that and I don't quite know what a sandbox is - will read up now. Have you got any code snippets or a web page to read on that, please?
Thank you Sinious
Excellent info as always. I think you're right with regards to staying with my local hosting company.
1. There are only 2 technicians who know my name and I can see physically whenever I want although I have to pay which I can understand.
2. They are php + mysql + everything-internet pros (well, compared to me, just imagine). I think I can get them to have a look at the php sql query code and get it optimized at least.
3. I should start with them with a dedicated server perhaps and let's see where it goes from there till I have more experience and cash to take it further.
4. I don't mind paying more up front just to get things set up properly, and then scaling should be easier. They charge 30 euros an hour, which is very good, and they can optimize my stuff for their servers or in general.
LAST TEST + it's definitely the php query code.
I stripped all graphics and sounds out of the menu screen, ie: "naked", and it still took 30 seconds to load. WOW - now I have to spend my time checking out the php code.
I had little doubt the hardware was stressed. Now you'd need to figure out what part.
If you have a typical web cpanel there should be a phpMyAdmin in there somewhere to manage your database. Take a second to jump in there and use the tools available to analyze the server's performance. Once inside, with your DB selected, hit the Status tab and then sift through the various sub-tabs. Each tab gives you valuable information like Server, Monitor, Query Statistics, Advisor, etc.
My site on a shared cloud is running pretty much at max users (meaning adding more would affect overall performance).
If you're NOT using a shared host you definitely don't want to see your CPU peg out (1.0) like that, nor your memory and traffic spike to max. In my case, I get what I pay for, cheap unlimited cloud hosting.
As for your DB, take a moment to make sure your tables are indexed properly and it can even help to go around and run the maintenance tools on each table. They get crufty after a while:
Analyze, Repair and Optimize for life