If my After Effects project is set to 8-bit but I choose 16-bit in the render settings, will that result in an actual increase in color depth, or does it simply render the existing 8-bit data in a 16-bit format? Will it still utilize the original 16-bit data from the footage and effects during rendering?
Blurs, blend modes, 16-bit effects, color corrections, anti-aliasing, and everything else get a boost when you move from 8-bit color (256 levels per channel, about 16.7 million possible colors, and only 256 levels for the alpha channel) to 16-bit, which gives you roughly 281 trillion possible colors. That's a massive difference in the quality of everything from gradients to motion blur. Even if you start with 8-bit images or footage, the final render will be significantly better if you add any effect or animation. I create maybe one 8-bit comp for every hundred built at 16- or 32-bit. Almost all of my production master renders also have a higher bit depth. I only render 8-bit H.264 files as quick tests for client approval; all delivered production masters are at least 10-bit color, and most are higher.
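If you're wondering where those figures come from, here's a quick back-of-the-envelope sketch in Python. It simply counts RGB combinations per bit depth; note that After Effects' internal 16-bpc mode actually works with 0–32768 values, so the 281 trillion number assumes full 16-bit channels, as in the rough comparison above.

```python
# Illustrative arithmetic only: how many RGB colors each bit depth can represent.
levels_8bit = 2 ** 8      # 256 levels per channel at 8 bpc
levels_16bit = 2 ** 16    # 65,536 levels per channel at 16 bpc

colors_8bit = levels_8bit ** 3    # all R*G*B combinations at 8 bpc
colors_16bit = levels_16bit ** 3  # all R*G*B combinations at 16 bpc

print(f"8 bpc:  {colors_8bit:,} colors")   # 16,777,216  (~16.7 million)
print(f"16 bpc: {colors_16bit:,} colors")  # 281,474,976,710,656 (~281 trillion)
```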
Thank you for your answer. If I set my project to 8-bit (at the bottom of the Project panel) but choose 16-bit in my render settings, will it render properly at 16-bit, or is it messed up because the project is 8-bit while the render settings are 16-bit?
Your project needs to be set to 16 bpc (File > Project Settings, or Alt/Option-click the bit-depth button at the bottom of the Project panel). Choosing 16-bit only in the render settings just writes the 8-bit result into a 16-bit file.