Inspiring
Open for Voting

P: Add 10-bit/channel color display support

  • August 5, 2011
  • 67 replies
  • 4970 views

Lightroom: Add 10-bit support for an end-to-end 10-bit workflow, that is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, the application is all that's missing. CS5 supports 10-bit color, so why not Lightroom?
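
For illustration only: the application's part of that chain boils down to asking the driver for a 30-bit (R10G10B10A2) framebuffer and drawing through it. Here is a minimal sketch of that request, assuming the generic pyGLFW and PyOpenGL bindings (nothing Adobe-specific) and a GPU/driver/monitor chain that will actually honour it:

```python
# Sketch: request a 10-bit-per-channel framebuffer from the driver.
# Assumes "pip install glfw PyOpenGL"; on an 8-bit-only chain the driver
# silently falls back to 8 bits per channel.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS

if not glfw.init():
    raise RuntimeError("GLFW init failed")

# Ask for 10 bits per colour channel plus 2 bits of alpha (R10G10B10A2).
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(800, 600, "10-bit framebuffer test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Window creation failed")
glfw.make_context_current(window)

# The driver decides what we actually get; query the granted depth back.
print("Red bits granted:", glGetIntegerv(GL_RED_BITS))

glfw.terminate()
```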

67 replies

proligde
Participant
April 13, 2017
After reading this thread, I summarize it as:

Even with a pro 10-bit capable display AND a pro 10-bit graphics card like an NVIDIA Quadro or ATI FirePro, you can only use 8-bit color depth in Lightroom, at least on Windows.

Is that correct? And if yes, WHY? I know I might not be able to tell the difference between two nearly identical colors, no matter whether I have 8 or 10 bits per channel, BUT I do see the difference across a wide, smooth gradient. Compare a 256-shade (8-bit) grayscale ramp to a 1024-shade (10-bit) one: that's a massive (visible) difference.
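
Just to put numbers on that, here is a tiny numpy illustration (nothing Lightroom-specific, purely the quantisation arithmetic):

```python
# Quantise the same smooth grey ramp to 8 and to 10 bits per channel
# and count how many distinct shades survive.
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)                     # smooth grey gradient

shades_8bit = np.unique(np.round(ramp * 255)).size     # -> 256
shades_10bit = np.unique(np.round(ramp * 1023)).size   # -> 1024

print("8-bit shades: ", shades_8bit)
print("10-bit shades:", shades_10bit)
print(f"step size: 1/255 = {1/255:.5f} vs 1/1023 = {1/1023:.5f}")
```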
Inspiring
January 5, 2017
There is no 12-bit panel. Some TVs may be able to accept 12-bit data, but actually displaying 12-bit color is another story.
Inspiring
January 5, 2017
Please add 10- and 12-bit per channel output to the driver.
My monitor can handle 12 bits per channel, and only Photoshop can use it, via its 30-bit option.
Inspiring
December 22, 2016
The application is the only thing missing from a true 10-bit workflow now.
Inspiring
December 12, 2016
Thanks for the info. It's a shame that NVIDIA refuses to support 10-bit OpenGL on mainstream cards. I get the feeling that Intel, Microsoft, NVIDIA, and ATI are the ones holding the PC back from progress.
Inspiring
December 9, 2016
Nope. You do need a Quadro.
You can enable the color depth option on your GTX 670, but you will still see banding in Photoshop on your 10-bit capable monitor connected via DisplayPort (check it with a 10-bit test ramp file; a quick way to make one is sketched at the end of this reply).
GeForce cards can output 10-bit color, but only in applications that use DirectX. Quadro cards support 10-bit color via OpenGL. Photoshop uses OpenGL, so you need a Quadro card for 10-bit with Photoshop.
More info here:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce...
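
If you don't have a test ramp file handy, a rough substitute is easy to generate. A sketch assuming numpy and tifffile are installed (this is not an official Adobe or NVIDIA test file): open the resulting 16-bit TIFF in Photoshop with 30-bit display enabled and look for banding.

```python
# Write a smooth horizontal grey ramp as a 16-bit TIFF. On an 8-bit display
# path the gradient shows visible banding; on a working 10-bit path it should
# look smooth. Requires "pip install numpy tifffile".
import numpy as np
import tifffile

width, height = 3840, 400
ramp = np.linspace(0, 65535, width)                  # one row, full tonal range
image = np.tile(ramp, (height, 1)).astype(np.uint16)

tifffile.imwrite("grey_ramp_16bit.tif", image)
```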
Inspiring
December 9, 2016
No need for a Quadro anymore. Mine is an older GTX 670, and it supports 10-bit now. Go to the NVIDIA Control Panel and there is an option to select color depth. You need a 10-bit capable monitor and, most importantly, a DisplayPort cable connection.
Inspiring
September 7, 2016
With the current version of LR, there is no chance of getting a 10-bit display. With Photoshop, an NVIDIA Quadro K1200 is more than enough. Only workstation cards like the Quadro enable 10 bits/channel through OpenGL (which is what Photoshop supports). 10-bit display in LR would be very welcome; they lag considerably behind the Photoshop team.
PhilBurton
Inspiring
May 12, 2016
So if I have LR 6.5.1 on Windows, will I get 10-bit display output if I have the right graphics card? If so, can anyone recommend a suitable card? I don't want to spend "too little", but I don't need the crazy-fast performance that gamers crave.

Thanks.
ssprengel
Inspiring
May 10, 2016
OK, that's good. I had incorrectly interpreted Simon's statement, "But the result image data of the final monitor color transform (for display) is currently limited to 8-bit", to mean that the monitor profile conversion itself happened in 8-bit.
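
To make that distinction concrete, here is a toy numerical example (a plain gamma curve standing in for the monitor transform, not Lightroom's actual pipeline): running the conversion on already-quantised 8-bit data loses more shades in the darks than running it in floating point and only quantising the final display output to 8 bits.

```python
# Toy comparison: monitor-style transform applied to 8-bit data vs. applied
# in float with only the final output quantised to 8 bits.
import numpy as np

dark_gradient = np.linspace(0.0, 0.25, 2048)   # shadows, where banding shows first
inv_gamma = 1 / 2.2                            # stand-in for the monitor transform

# (a) conversion itself done on 8-bit data
in_8bit = np.round(dark_gradient * 255) / 255
out_a = np.round(in_8bit ** inv_gamma * 255)

# (b) conversion done in float, only the display output limited to 8 bits
out_b = np.round(dark_gradient ** inv_gamma * 255)

print("distinct output levels, 8-bit conversion:", np.unique(out_a).size)
print("distinct output levels, float conversion:", np.unique(out_b).size)
```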