I've been profiling a Flash project and have noticed that the time spent "decompressing images" increases as the Flash Player window gets bigger.
The project's stage is 1920x1080, and it contains a sequence of full-screen (1920x1080) PNG graphics that are embedded in the project. I've set the compression on each image to "lossless", and in the publish settings I've disabled compression.
I don't really understand what is being decompressed when the SWF runs, or why the window size would affect how quickly it decompresses.
Does anyone know why "decompressing images" increases when the window size increases?
While the IE window was full screen (1920x1080):

While the IE window was small (approximately 400x800):
Intuitively, the bigger the window, the more pixels have to be drawn, and the more pixels you have to process, the longer it takes. A full-screen 1920x1080 window is about 2.07 million pixels, versus roughly 0.32 million at 400x800, so you'd expect something like a 6.5x increase in raw rasterization work. That said, what you're seeing looks like a worse-than-linear increase in CPU time, and I don't have enough information to give you a definitive answer.
My guess is that Scout is being a little sloppy about how it buckets this metric. If you're compiling PNGs into the SWF (instead of loading them over the network), they get compressed when the SWF is packaged and have to be decompressed at runtime. If you were loading the PNGs over the network, I'd expect you to take the decompression hit once, have the decoded result cached, and then get it for free afterwards. (I actually don't understand why that wouldn't also happen with assets baked into the SWF, but I don't know anything about your application.)
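One thing worth trying is forcing the decode to happen once up front and holding on to the decoded pixels yourself. Here's a minimal AS3 sketch of that idea; the `[Embed]` class names and file paths are placeholders for your own embedded assets, and this assumes you're building with the Flex/Flash `[Embed]` metadata, which wraps each PNG in a `Bitmap` subclass:

```actionscript
// Hypothetical sketch: decode each embedded PNG once, keep the decoded
// BitmapData alive, and swap it into a single on-stage Bitmap. The
// decompression cost is then paid once at startup instead of per frame.

[Embed(source="frames/frame0.png")]
private var Frame0:Class; // placeholder asset class

private var frames:Vector.<BitmapData> = new Vector.<BitmapData>();
private var display:Bitmap = new Bitmap();

private function preDecodeFrames():void {
    // Instantiating the embedded asset forces the PNG to be decoded
    // into an uncompressed BitmapData, which we retain.
    var asset:Bitmap = new Frame0() as Bitmap;
    frames.push(asset.bitmapData);
    addChild(display);
}

private function showFrame(i:int):void {
    // Reuses the already-decoded pixels; no further decode should be
    // needed when stepping through the sequence.
    display.bitmapData = frames[i];
}
```

If the "decompressing images" time drops to a one-time startup cost after a change like this, that would confirm the per-frame numbers you're seeing are repeated decodes rather than rasterization.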
There's a lot of good writing on optimizing Flash content for game development. That's not my personal area of focus, so others will be better sources of pointers on that front. It's probably also worth mentioning that there are good 2D and 3D frameworks for GPU-accelerated rendering in Flash; if you're doing everything on the CPU at the moment and not getting adequate performance, that might be worth a look.