Hello,
I’m new to After Effects plug-in development.
In my plugin, I need to perform strict RGB value comparisons in an 8-bit environment, so I rely on AE automatically converting the input down to 8-bit when the project is set to 16-bit or 32-bit color depth.
However, I’ve noticed that when AE automatically converts 16-bit or 32-bit images down to 8-bit for an effect that only supports 8-bit, it introduces subtle noise or dithering artifacts.
I tested several of AE’s built-in “8-bit only” effects and saw the same issue — the colors contain subtle noise, and sometimes the noise even evolves, like in this GIF.
My plugin needs to calculate and output precise HEX color values, and several internal steps depend on exact 8-bit operations and strict RGB comparisons.
So my question is: is there any way to avoid these noise artifacts during AE’s automatic 16/32-to-8-bit conversion?
hmmm... i've never encountered that problem. you could report it as a bug.
i would suggest you let your plug-in accept 16 and 32 bpc inputs as well, and do the conversion to 8 bit yourself.
the users will benefit from having the output be back in 16/32 bpc, and you'll benefit from not having the warning sign next to your plugin...
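To sketch that suggestion: if the plug-in declares deep-color support (via flags like PF_OutFlag_DEEP_COLOR_AWARE / PF_OutFlag2_FLOAT_COLOR_AWARE in its global setup), it receives the pixels at the project's native depth and can quantize each channel to 8-bit itself with plain round-to-nearest, which is deterministic and never dithers. The helper names below are illustrative, not AE SDK calls; they only assume the SDK's documented channel ranges (8 bpc: 0..255, 16 bpc: 0..32768, 32 bpc: floats around 0.0..1.0):

```cpp
#include <cstdint>
#include <algorithm>
#include <cmath>

// Deterministic 16 bpc -> 8 bpc. AE's 16-bit white point is 32768
// (PF_MAX_CHAN16), so scale by 255/32768 with round-to-nearest.
// No dithering: equal inputs always map to equal outputs.
inline uint8_t chan16_to_8(uint16_t v16) {
    uint32_t v = std::min<uint32_t>(v16, 32768u);
    return static_cast<uint8_t>((v * 255u + 16384u) >> 15);
}

// Deterministic 32 bpc (float) -> 8 bpc: clamp out-of-range HDR
// values to [0, 1], then round-to-nearest.
inline uint8_t chanF_to_8(float vf) {
    float c = std::min(std::max(vf, 0.0f), 1.0f);
    return static_cast<uint8_t>(std::lround(c * 255.0f));
}
```

Running the comparisons on these quantized values gives you stable HEX output regardless of project depth, and you can still write the result back at full depth (e.g. the inverse scaling) so users keep a 16/32 bpc pipeline.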