Inspiring
August 5, 2011
Open for Voting

P: Add 10-bit/channel color display support

  • August 5, 2011
  • 67 replies
  • 4971 views

Lightroom: Add 10-bit support for end-to-end 10-bit workflow. That is, Application->Driver->Graphics Card->Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, why not Lightroom?
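For context, the gap between 8- and 10-bit output is easy to quantify, since each extra bit doubles the tonal steps per channel. A quick illustrative snippet (plain arithmetic, not Lightroom code):

```python
# Illustrative arithmetic only (not Lightroom code): levels per channel
# and total displayable colors at common bit depths.
def channel_levels(bits: int) -> int:
    """Distinct tonal steps per color channel at a given bit depth."""
    return 2 ** bits

for bits in (8, 10, 14, 16):
    levels = channel_levels(bits)
    print(f"{bits}-bit: {levels:,} levels/channel, "
          f"{levels ** 3:,} total RGB colors")
```

So 8-bit gives 256 steps per channel (about 16.7 million colors), while 10-bit gives 1,024 steps (about 1.07 billion colors).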

67 replies

Adobe Employee
May 9, 2016
Steve is correct.

On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.

On Windows: It also uses the same 16-bit ProPhotoRGB math for internal image renders, but the image data resulting from the final monitor color transform (for display) is currently limited to 8 bits.
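The Windows limitation described in this post can be sketched as a simple quantization step. This is an illustrative model only (the function name and ramp values are made up, not Adobe's code): a subtle 16-bit gradient collapses to far fewer distinct steps when the final display transform is 8-bit than when it is 10-bit.

```python
def to_display(value16: int, display_bits: int) -> int:
    """Map a 16-bit internal value (0..65535) to the display's code range."""
    max_out = (1 << display_bits) - 1
    return round(value16 * max_out / 65535)

# 60 distinct 16-bit values from a subtle midtone gradient:
ramp16 = range(32000, 32600, 10)
steps8 = len({to_display(v, 8) for v in ramp16})
steps10 = len({to_display(v, 10) for v in ramp16})
print(steps8, steps10)  # the 10-bit path preserves far more distinct steps
```

Here the 60 internal values survive as only 3 distinct codes on an 8-bit display path but 10 distinct codes on a 10-bit path, which is exactly where gradient banding comes from.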
ssprengel
Inspiring
May 8, 2016
I'm pretty sure that LR uses a 16-bit internal working space, while the display output is only 8 bits, so the term "8-bit processing" isn't quite right.
PhilBurton
Inspiring
May 8, 2016
My Nikon D3 produces RAWs with 14-bit color depth.  With 8-bit processing in LR, I'm throwing away 6 bits of color depth smoothness and gradation.  As others have pointed out, Photoshop has 10-bit color support, so it's not something that Adobe can't do. And nowadays, both Windows and OS X support 10-bit color.
Inspiring
April 9, 2016
Having just tested 10-bit support in video media with a good (but not ridiculous) rig and seen the difference it makes, I am very keen to get 10-bit support in ALL my photo apps.

Windows 10 - Nvidia 980 Ti - Dell U2713H. I know this monitor is doing slightly fake 10-bit (as many laptops do fake 8-bit), but the improvement is significant.
January 13, 2016

We photographers need Lightroom to support 10-bit display output, so that advanced displays like models from Eizo or NEC can produce their full billion-color (10-bit) depth.
Inspiring
January 13, 2016
I am a very satisfied Lightroom user, but the lack of 30-bit color display capability is a major shortcoming for pro-grade software.
I hope this feature will be added soon.
Inspiring
May 22, 2014


I fully understand that you can't give Lightroom too many of the attributes that Photoshop has, but a 10-bit monitor is just superb [using PSCC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 having the same facility. It would also give the two applications greater synergy when viewing/editing.
Known Participant
June 7, 2012
Thanks... yes, you already mentioned that here, too: http://feedback.photoshop.com/photosh... ... But I seem to have overlooked it, or simply forgotten ;-)

So... is it safe to assume that because of this, LR does not really need 10-bit output? I mean, if the dithering works so well on artificial images, it should work nearly perfectly on most real-world photos. The only case where 10-bit would have an advantage is when there are subtle tonal *patterns* (e.g. fabrics, etc.) of small size (a few pixels), which cannot be reproduced using dithering. But I doubt that such patterns can really be seen even with 10 bits.
dorin_nicolaescu
Inspiring
June 7, 2012
Yes, Develop and Export (to 8 bpp) dithers.
http://forums.adobe.com/message/3655226
Known Participant
June 7, 2012
Additional question: Does LR use dithering in the Develop module? I just loaded the 10-bit test ramp from http://www.imagescience.com.au/kb/que... into Photoshop and LR (8-bit output). In Photoshop I see the 8-bit banding, as well as in the LR Loupe (because previews are 8-bit JPEGs). But in LR Develop, there is no visible banding!

This would actually be great, because it is the next best thing to 10-bit support.

Edit: Yes, LR Develop really does seem to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the Develop module, contrast amplified in PS):
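For anyone curious how dithering hides the banding, here is a minimal sketch using simple random (noise) dithering. The thread doesn't say which dithering method Lightroom actually uses, so treat this purely as an illustration: a shallow ramp quantized straight to 8 bits produces a couple of hard bands, while the dithered version trades them for fine noise whose local average tracks the true value.

```python
import random

def quantize(x: float, bits: int) -> int:
    """Round a 0..1 value to the nearest integer code at the given bit depth."""
    levels = (1 << bits) - 1
    return min(int(x * levels + 0.5), levels)

def quantize_dithered(x: float, bits: int, rng: random.Random) -> int:
    """Add up to half a quantization step of random noise before rounding."""
    step = 1.0 / ((1 << bits) - 1)
    noisy = min(max(x + (rng.random() - 0.5) * step, 0.0), 1.0)
    return quantize(noisy, bits)

def transitions(codes):
    """Count positions where the output code changes between neighbors."""
    return sum(1 for a, b in zip(codes, codes[1:]) if a != b)

rng = random.Random(42)
# A very shallow ramp: 1000 samples spanning only ~2 of the 256 8-bit steps.
ramp = [0.5 + i * (2 / 255) / 1000 for i in range(1000)]
plain = [quantize(v, 8) for v in ramp]
dith = [quantize_dithered(v, 8, rng) for v in ramp]

# Straight quantization: a couple of hard band edges. Dithered: hundreds of
# tiny flips (noise) instead of visible banding.
print(transitions(plain), transitions(dith))
```

With contrast amplified (as in the screenshot above), the plain version shows its few band edges clearly, while the dithered version looks like grain, which matches what the Develop module test shows.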