Participating Frequently
February 2, 2012
Question

pixel bender in flash - CPU v. GPU?

  • 2 replies
  • 3665 views

I'm trying to figure out how much processing power a Pixel Bender filter with dynamic values (modified via Flash) requires from my computer. I understand Pixel Bender should run on the GPU, and my video card (NVIDIA GeForce GT 120 on a Mac) is supported.

Looking at this Flash video implementation http://www.brooksandrus.com/blog/2009/01/19/pixel-bender-effects-video-killer-runtime-effects/ I get really high CPU usage in the Mac Activity Monitor as soon as I apply the filter. I tried using Google Chrome's task manager and resource usage looked just as bad there. I tried several implementations of other filters myself, and none of them had minimal CPU usage.

Could anyone shed some light here? Is Pixel Bender in Flash supposed to eat so much CPU power when it's meant to be using the GPU (if it's not using it already)? Or is there maybe a better way to check what's happening? The funny thing is that in my tests, even when CPU usage was high, the effect still ran fairly smoothly.
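For reference, this is roughly how I'm applying the filter; it's simplified, and "filter.pbj" and the "amount" parameter are placeholders for my actual kernel:

```actionscript
import flash.display.DisplayObject;
import flash.display.Shader;
import flash.events.Event;
import flash.filters.ShaderFilter;
import flash.utils.ByteArray;
import flash.utils.getTimer;

// Embed the compiled Pixel Bender bytecode.
[Embed(source="filter.pbj", mimeType="application/octet-stream")]
private var KernelBytes:Class;

private var shader:Shader = new Shader(new KernelBytes() as ByteArray);

private function applyFilter(target:DisplayObject, amount:Number):void
{
    // Shader parameter values are always set as arrays, even for scalars.
    shader.data.amount.value = [amount];
    target.filters = [new ShaderFilter(shader)];
}

// "Dynamic values" means updating the parameter every frame:
addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    applyFilter(video, 0.5 + 0.5 * Math.sin(getTimer() / 1000));
});
```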

many thanks

This topic has been closed for replies.

2 replies

December 11, 2013

Pixel Bender just tricked me: it got my hopes up only to let me down after all.

GPU acceleration works in the debug environment of a Flash Builder desktop application, but it does not work in a release build.

OMG...

Thank you for the temporary and short-lived happiness.

I hope it will be handled properly in the near future.

Participant
December 11, 2013

From what I understand, Pixel Bender support has been removed from Flash Player 11.7 onwards; someone from Adobe can correct me if I am wrong.

Participating Frequently
December 14, 2012

Somewhat disappointingly, Pixel Bender still does not (as far as I know) execute on the GPU when running in Flash, which is really quite strange when you consider that Flash can do GPU-accelerated 3D animation. For some reason a 2D shader remains in the too-hard basket.

But reading the marketing on Pixel Bender, I'm not surprised you assumed Flash ran it on the GPU. It's not until you've invested time in it that you discover it doesn't. You naturally think that Flash + Pixel Bender = GPU, and nobody tells you otherwise, so it's easy to jump to that conclusion. Go back over what you've read and you'll see how they did it: clever use of words.

That said, there are some slight benefits over ActionScript to do with threading and background processing. It will be awesome when Flash can eventually run Pixel Bender scripts on the GPU.
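The background-processing benefit I mean is ShaderJob, which runs a kernel asynchronously off the main timeline (still on the CPU, not the GPU). A minimal sketch, assuming `shader` is an already-loaded Pixel Bender Shader:

```actionscript
import flash.display.BitmapData;
import flash.display.Shader;
import flash.display.ShaderJob;
import flash.events.ShaderEvent;

// Run the kernel asynchronously, writing the result into a BitmapData.
var output:BitmapData = new BitmapData(1024, 1024);
var job:ShaderJob = new ShaderJob(shader, output);

job.addEventListener(ShaderEvent.COMPLETE, function(e:ShaderEvent):void {
    // The main timeline keeps animating while the kernel number-crunches.
    trace("shader job finished");
});

job.start(); // pass true to force synchronous execution instead
```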

In Photoshop, however, Pixelbender does run on the GPU.

Carl

sinious
Legend
December 14, 2012

As a reminder on two fronts, please remember a GPU is not a CPU. They are designed for different purposes: the GPU is specialized, the CPU is general. While in the future filters may be coded to somehow use the GPU's shaders, testing first is the best approach. I wouldn't expect Pixel Bender to instantly jump to 100% GPU usage; in fact, I'd never assume anything until it's tested.

Participating Frequently
December 14, 2012

By definition a GPU shader is a shader that runs on the GPU. Or to put it another way, a GPU shader that doesn't run on a GPU is not a GPU shader.

The Pixel Bender language is obviously designed for compilation into GPU instructions. It is based on the OpenGL Shading Language (GLSL) and its relatives. It's not based on that language because someone felt it would be a pleasant language in which to bend pixels; it's a lot easier to bend pixels in ActionScript, or Java, or C. That said, there is some slight performance improvement in going through Pixel Bender: the same constraints that make it easy to compile Pixel Bender to GPU instructions also help it compile more efficiently for the CPU. That's certainly a plus.
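For anyone who hasn't written one, even a minimal kernel makes the GLSL heritage obvious. This brightness kernel is my own illustrative example, not from any shipped product:

```
<languageVersion : 1.0;>
kernel Brightness
<   namespace : "example";
    vendor : "example";
    version : 1;
>
{
    parameter float amount
    <
        minValue: 0.0;
        maxValue: 2.0;
        defaultValue: 1.0;
    >;

    input image4 src;
    output pixel4 dst;

    // Per-pixel function, just like a GLSL fragment shader.
    void evaluatePixel()
    {
        dst = sampleNearest(src, outCoord()) * amount;
    }
}
```

Per-pixel evaluation with no loops over the image, typed vector pixels, and sampling functions: all constraints chosen so the kernel can map straight onto GPU fragment hardware, even though Flash never takes advantage of that.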

Carl