Question
16-bit vs. 8-bit when sharpening images?
Which would give the better quality:
1) sharpen a flattened 16-bit file and then convert it to 8-bit, or
2) convert a flattened 16-bit file to 8-bit and then sharpen it?
In theory, which way should yield the better quality? (Even if in the real world the difference may not be visible.)
Thanks,
Bob
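
Below is a minimal sketch (not from the original post) of the two orderings the question describes, using a synthetic 16-bit grayscale image and a simple unsharp mask. It assumes NumPy and SciPy's gaussian_filter are available; the image, the sharpening amount, and the 257.0 scaling factor for the 16-to-8-bit conversion are illustrative choices, and real-world results will depend on the image and the tool used.

```python
# Sketch: compare "sharpen in 16-bit, then convert" vs. "convert to 8-bit, then sharpen".
# Assumes NumPy + SciPy; the unsharp-mask parameters are arbitrary examples.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp(img, amount=1.0, radius=2.0):
    """Basic unsharp mask: original + amount * (original - blurred)."""
    blurred = gaussian_filter(img.astype(np.float64), sigma=radius)
    return img.astype(np.float64) + amount * (img - blurred)

# Synthetic 16-bit test image: a horizontal gradient plus a little noise.
rng = np.random.default_rng(0)
img16 = (np.linspace(0, 65535, 256)[None, :]
         + rng.normal(0, 256, (256, 256))).clip(0, 65535)

# Option 1: sharpen at 16-bit precision, then quantize to 8-bit once at the end.
sharp16 = unsharp(img16).clip(0, 65535)
out1 = np.round(sharp16 / 257.0).astype(np.uint8)   # 65535 / 257 = 255

# Option 2: quantize to 8-bit first, then sharpen the 8-bit data.
img8 = np.round(img16 / 257.0)
out2 = np.round(unsharp(img8).clip(0, 255)).astype(np.uint8)

# Mean absolute difference between the two pipelines. It is usually small but
# nonzero: option 1 does its arithmetic on 16-bit data and rounds only once.
print(np.abs(out1.astype(int) - out2.astype(int)).mean())
```

The sketch only illustrates where the rounding happens in each ordering; it is not meant to settle which workflow a given editor should use.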
