Hi everyone. I have some 16-bit RGB photos with visible banding in the gradients. The goal is to convert them to 8-bit CMYK images.
I'd do RGB 16-bit > CMYK 16-bit > CMYK 8-bit.
Since banding is already visible in RGB, I'd add some noise (around 0.5 to 0.8) before converting to 8 bit.
So the remaining question is: should I add the noise in 16-bit RGB or in 16-bit CMYK?
(Perhaps, intuitively, I'd say in 16-bit CMYK, since CMYK has fewer colors than RGB, and therefore banding could be increased. Is that right?)
Thanks in advance.
Hold on a bit.
If you're working with 16 bit data, any banding you see is in your display system. Remember that the display pipeline from video card to monitor operates at 8 bit depth. That's where the banding happens!
Furthermore, this banding is cumulative. Calibration tables in the video card, a suboptimal monitor profile, the panel itself (many of which are actually 6 bit + dithering) - they all go on top of the basic 8 bit banding. The result can be highly irregular and colored banding. But it's all in the display system, not in the data.
Obviously, converting to 8 bit depth for output will reintroduce banding - but that's not what you're seeing now.
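To see why a little noise before the final conversion helps, here is a minimal NumPy sketch (not Photoshop's actual Add Noise filter, just an illustration of the principle). A shallow 16-bit gradient that spans only a few 8-bit levels produces hard bands when truncated, while adding roughly one output level of noise first turns the band edges into a fine mix of adjacent levels:

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow 16-bit gradient: it spans only a handful of 8-bit levels,
# so naive truncation produces visible bands.
grad16 = np.linspace(1000, 2000, 4096).astype(np.uint16)

# Naive conversion: 16-bit -> 8-bit by dropping the low byte.
banded = (grad16 >> 8).astype(np.uint8)

# Dithered conversion: add up to +/- 1 output level of noise
# (256 in 16-bit units) before quantizing.
noise = rng.uniform(-256, 256, grad16.shape)
dithered = np.clip(np.round((grad16 + noise) / 256), 0, 255).astype(np.uint8)

# Both versions use the same few levels, but the banded one has long
# flat runs (hard band edges) while the dithered one mixes neighbors.
print(np.unique(banded))  # [3 4 5 6 7]
print(np.unique(dithered))
```

The dithered array changes value thousands of times along the gradient instead of four times, which is exactly what breaks up the visible bands.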
If this is a major concern, you may want to invest in a 10 bit capable monitor and video card.
"...since CMYK has fewer colors than RGB, and therefore banding could be increased. Is that right?"
Yes and no. The banding you'd see in 8-bit RGB occurs because each channel has only 256 levels. CMYK has four channels of 256 levels each, so you can potentially see less banding even though the gamut is smaller.
So in your workflow, converting directly from 16-bit RGB to 8-bit CMYK will yield the best results. Add as much or as little noise as you see fit.
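The same noise-then-quantize step works on any number of channels, so it makes no practical difference whether the image is RGB or CMYK at that point. A small sketch with a hypothetical helper (`dither_to_8bit` is my own name, not a Photoshop or library function), where `amount` plays the role of the 0.5 to 0.8 noise setting mentioned above:

```python
import numpy as np

def dither_to_8bit(img16, amount=0.7, seed=0):
    """Quantize a 16-bit image array (any channel count, e.g. 4 for CMYK)
    to 8 bits, adding 'amount' output levels of uniform noise first to
    break up banding. Hypothetical helper for illustration only."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-amount, amount, img16.shape) * 256.0
    scaled = (img16.astype(np.float64) + noise) / 256.0
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

# Works identically whether the last axis holds 3 (RGB) or 4 (CMYK) channels.
cmyk16 = np.random.default_rng(1).integers(0, 65536, (64, 64, 4), dtype=np.uint16)
cmyk8 = dither_to_8bit(cmyk16)
print(cmyk8.shape, cmyk8.dtype)  # (64, 64, 4) uint8
```

With `amount` below 1.0, each pixel lands at most one 8-bit level away from the plain rounded value, so the noise stays invisible at print resolution while still randomizing the band edges.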
In most cases, if you are printing CMYK on paper, even the best coated paper has surface irregularities that help to hide banding.