Richard Rosenman
Inspiring
March 25, 2019
Question

PF_PixelFormat Help


Hi gang;

I believe I am checking the PF_PixelFormat correctly, yet I am getting incorrect results.

My plugin should use 8 and 16 bits per channel color depths only.

So in the GlobalSetup I use the flag:

PF_OutFlag_DEEP_COLOR_AWARE;

Then, in my render function I declare the following:

AEGP_SuiteHandler suites(in_data->pica_basicP);

PF_WorldSuite2  *wsP    = NULL;
PF_PixelFormat  format  = PF_PixelFormat_INVALID;

ERR(suites.Pica()->AcquireSuite(kPFWorldSuite, kPFWorldSuiteVersion2, (const void**)&wsP));
ERR(wsP->PF_GetPixelFormat(output, &format));

And finally, I can check my color depth mode by using:

if (format == PF_PixelFormat_ARGB32)   // 8 bit
if (format == PF_PixelFormat_ARGB64)   // 16 bit
if (format == PF_PixelFormat_ARGB128)  // 32 bit (I am not using this color depth)
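(For what it's worth, the snippet above never releases the world suite. A minimal sketch of the whole pattern might look like the following; the names follow the AE SDK, but treat the error-handling choice for the unsupported case as an assumption, not the one true way:)

```cpp
AEGP_SuiteHandler suites(in_data->pica_basicP);
PF_WorldSuite2  *wsP    = NULL;
PF_PixelFormat  format  = PF_PixelFormat_INVALID;

ERR(suites.Pica()->AcquireSuite(kPFWorldSuite, kPFWorldSuiteVersion2, (const void**)&wsP));
ERR(wsP->PF_GetPixelFormat(output, &format));

switch (format) {
    case PF_PixelFormat_ARGB32:   /* 8 bpc path  */ break;
    case PF_PixelFormat_ARGB64:   /* 16 bpc path */ break;
    default:                      /* unsupported depth: fail cleanly rather than crash */
        err = PF_Err_UNRECOGNIZED_PARAM_TYPE;
        break;
}

// Release the suite when done with it:
ERR2(suites.Pica()->ReleaseSuite(kPFWorldSuite, kPFWorldSuiteVersion2));
```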

The problem is that when I test this in After Effects, it does not differentiate between 16-bit and 32-bit color mode. It tries to generate the effect even if I switch to 32-bit color mode, and then crashes. Between 8 and 16, it certainly differentiates.

I should mention I am checking out another layer with this plugin, but I don't think that could be the issue. What could I be doing wrong here?

Thanks in advance,

-Rich

This topic has been closed for replies.

2 replies

March 25, 2019

James is right. Since you claim to receive layer pixels even in 32bpc mode, you are most likely developing a SmartFX plugin. A SmartFX plugin (designated by setting the PF_OutFlag2_SUPPORTS_SMART_RENDER flag) always has to support 32bpc!

Here is the matrix as far as I know:

1. PF_OutFlag2_SUPPORTS_SMART_RENDER is not set:

- and PF_OutFlag_DEEP_COLOR_AWARE is not set: you have to implement 8bpc in PF_Cmd_RENDER

- and PF_OutFlag_DEEP_COLOR_AWARE is set: you have to implement 8bpc and 16bpc in PF_Cmd_RENDER

2. PF_OutFlag2_SUPPORTS_SMART_RENDER is set:

- and PF_OutFlag_DEEP_COLOR_AWARE is not set: you have to implement 8bpc and 32bpc in PF_Cmd_SMART_RENDER

- and PF_OutFlag_DEEP_COLOR_AWARE is set: you have to implement 8bpc, 16bpc and 32bpc in PF_Cmd_SMART_RENDER

For efficiency reasons, it is strongly recommended to support the newer SmartFX API and support 32bpc!

Richard Rosenman
Inspiring
March 25, 2019

Hi James and Toby;

Thanks for your replies.

Nope, I'm not interested in making a SmartFX plugin. As a matter of fact, I'm not interested in 32 bit at all, just 8 and 16, but it looks like it cannot differentiate between 16 and 32. This means if I set it to 32 bit, it still understands it as 16 (for some reason) and still performs the effect (which it shouldn't). I am not using the smart render flag you mentioned above anywhere, Toby.

So even though I am performing my effects only under the following conditions:

if (format == PF_PixelFormat_ARGB32) // 8 bit

if (format == PF_PixelFormat_ARGB64) // 16 bit

It still performs under 32 bit and since it's not made for that, it crashes.

Perhaps I should be using a different method to check color depth?

For instance, in this thread: Re: Need help processing raw image data, they recommended using extra->input->bitdepth, which seems to return 8, 16 or 32.
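(Worth noting: that field lives on the SmartFX render input, so it is only available if the plugin takes the SmartRender path. A rough sketch of how it would be used there, assuming a PF_Cmd_SMART_RENDER handler with the extra parameter named extraP:)

```cpp
// Inside the PF_Cmd_SMART_RENDER handler; extraP is the PF_SmartRenderExtra*.
// extraP->input->bitdepth reports 8, 16, or 32.
switch (extraP->input->bitdepth) {
    case 8:  /* 8 bpc path  */ break;
    case 16: /* 16 bpc path */ break;
    case 32: /* 32 bpc path (or convert, or refuse cleanly) */ break;
}
```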

Any thoughts?

Thanks,

-Rich

March 25, 2019

Even if you don't have a SmartFX plugin, it can of course still be applied to a layer in 32bpc mode in AE.

So putting a non-SmartFX plugin on a layer in 32bpc mode will (as far as I know) cause AE to convert the layer pixel data from 32bpc to 16bpc (if deep-aware) or 8bpc (if not deep-aware) and then send that to your plugin.

What is the specific reason for only supporting the older interface?

You could easily make a SmartFX plugin and convert incoming 8bpc or 32bpc pixels into 16bpc and then apply your 16bpc algorithm on that, if you want.

James Whiffin
Braniac
March 25, 2019

Hi Richard

One thing to check: are you using SmartFX? If not, your effect can't run at 32 bits, just 8 (or 16 if PF_OutFlag_DEEP_COLOR_AWARE is set, as you have).