Inspiring
August 5, 2011
Open for Voting

P: Add 10-bit/channel color display support


Lightroom: Add 10-bit support for an end-to-end 10-bit workflow. That is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, all that's missing is the application. Photoshop CS5 supports 10-bit color; why not Lightroom?
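For readers wondering what the missing "Application" link in that chain involves, here is a minimal, hedged sketch of how an application can ask the driver for a 10-bit-per-channel framebuffer and then check what was actually granted. It uses the cross-platform GLFW library and PyOpenGL purely for illustration; this is not how Lightroom or Photoshop implement it, and whether the request is honored depends on the OS, driver, GPU and monitor.

```python
# Hedged sketch: request a 10-bit-per-channel (30-bit) framebuffer and see what
# the driver actually granted. Uses the glfw and PyOpenGL packages; the driver
# may silently fall back to 8 bits per channel.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

if not glfw.init():
    raise RuntimeError("GLFW initialization failed")

# Request 10 bits per color channel plus 2-bit alpha (a common 30-bit layout).
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10-bit framebuffer test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Could not create a window with the requested format")
glfw.make_context_current(window)

# Window hints are only a request, so query what the driver really provided.
for name, enum in (("red", GL_RED_BITS), ("green", GL_GREEN_BITS), ("blue", GL_BLUE_BITS)):
    print(f"{name} bits granted:", glGetIntegerv(enum))

glfw.terminate()
```

Historically, Windows drivers tended to honor such a request only on workstation (Quadro/FirePro) cards, which ties into the support-burden point raised later in this thread.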

67 replies

athelas64
Known Participant
March 4, 2019
First of all, I'm glad the conversation on this topic is still going. 10-bit display is close to my heart and I would love to see it just as much as you do, if not more, because I have had a 10-bit workflow on my desk for over three years now.

@1567043> I was referring to the printer because making prints significantly improves your photography. Nobody prints every picture they take, but spending time finessing this very last step (OK, the very last step is framing and display) will get you thinking about your pictures a lot more.

Based on my experience (and the experience of friends who are gallery specialists, scanning artworks priced at six figures and up and saving them as 16-bit uncompressed TIFF files), to put it in layman's terms: for general photography, 10-bit output is p0rn only YOUR MIND can appreciate. With the LUTs, high-precision internal processing and a properly calibrated screen, the difference between FRC dithering and true 10-bit output is indiscernible to your vision. Yes, you can tell in artificial monochromatic gradients. And (maybe) a highly skilled radiologist would be able to tell the monitors apart. But for the rest of the cases...
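To make the "artificial monochromatic gradients" point concrete, here is a small illustrative Python sketch (assuming NumPy and Pillow are available; file name and dimensions are arbitrary) that writes a 16-bit PNG containing the same gray ramp at 10-bit and at 8-bit precision. 8 bits per channel gives 256 gray levels, 10 bits gives 1024, and a synthetic ramp like this is about the only content where the extra levels are readily visible.

```python
# Sketch: a synthetic grayscale ramp is roughly the only image where 8-bit vs
# 10-bit output is readily visible. Stored as a 16-bit PNG so the file format
# itself is not the bottleneck; the display path still decides what you see.
import numpy as np
from PIL import Image

width, height = 1024, 256

# Horizontal ramp from black to white with 1024 distinct steps (10-bit),
# placed in the upper bits of a 16-bit sample.
ramp10 = np.linspace(0, 1023, width).astype(np.uint16) << 6
# The same ramp quantized to 8-bit precision (256 steps), rescaled to 16-bit.
ramp8 = (ramp10 >> 8) * 257

img = np.vstack([np.tile(ramp10, (height, 1)),   # top half: 10-bit steps
                 np.tile(ramp8,  (height, 1))])  # bottom half: 8-bit steps

# Pillow infers a 16-bit grayscale ("I;16") image from the uint16 array.
Image.fromarray(img).save("gradient_8_vs_10bit.png")
print("8-bit steps:", len(np.unique(ramp8)), "| 10-bit steps:", len(np.unique(ramp10)))
```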

@1654796> You are misinterpreting the facts. Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines they also have true 10-bit LCD panels, where the controller actually sends the 10-bit value to each subpixel and can display all the discrete brightness levels per subpixel without requiring any dithering.

Actually, going 10-bit is NOT that hard. What is hard (expensive) is achieving a more or less linear wide gamut across a useful spectrum of colors. What is even harder and even more expensive is achieving uniform backlighting and acceptable black levels on LCD screens. But without these, 10-bit would make even less sense than it does now.
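For readers unfamiliar with the FRC (frame rate control) dithering contrasted with "true" 10-bit panels in the posts above: the idea is that an 8-bit panel approximates an in-between 10-bit level by alternating the two nearest 8-bit levels over successive frames, so the time average matches the target. The sketch below is illustrative only; real panels use proprietary spatial and temporal patterns.

```python
# Conceptual sketch of FRC on an 8-bit panel: a 10-bit target level is
# approximated by alternating the two nearest 8-bit levels so the average over
# a few frames matches. Real panel algorithms are proprietary.
import numpy as np

def frc_frames(level10, n_frames=4):
    """Return n_frames of 8-bit values whose mean approximates level10 / 4."""
    target = level10 / 4.0                     # ideal 8-bit value (fractional)
    low, high = int(np.floor(target)), int(np.ceil(target))
    n_high = round((target - low) * n_frames)  # frames that show the higher level
    return [high] * n_high + [low] * (n_frames - n_high)

for level10 in (512, 513, 514, 515):           # four adjacent 10-bit levels
    frames = frc_frames(level10)
    print(f"10-bit {level10} -> frames {frames}, "
          f"average {np.mean(frames):.2f} (ideal {level10 / 4:.2f})")
```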
Todd Shaner
Legend
March 3, 2019
Very few manufacturers promote true 10-bit monitors, mostly Eizo, NEC and BenQ.
All of these manufacturers use 8-bit/color LCD panels with FRC dithering to achieve 10 bits/color, plus a 16-bit/color internal LUT for calibration. This sets them apart from lesser displays that need to be calibrated using the monitor's external controls and profiled using the 8-bit/color LUT in the graphics card, which reduces the display's effective bit depth.
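A rough, illustrative way to see why calibrating through the 8-bit graphics-card LUT costs tonal resolution (the numbers below are an assumed example curve, not measurements of any particular display): push the 256 possible 8-bit levels through a typical calibration adjustment, quantize back to 8 bits as the card's LUT must, and count how many distinct output levels survive. A monitor that applies the same correction in its high-bit-depth internal LUT keeps all of the panel's levels instead.

```python
# Illustrative sketch (assumed correction curve, not real profiling data):
# applying a calibration correction inside an 8-bit video-card LUT merges and
# clips tonal levels, whereas a high-bit-depth internal monitor LUT can apply
# the same curve without losing any of the panel's input levels.
import numpy as np

levels_in = np.arange(256)                       # 8-bit signal from the application

# Example correction: shift native gamma 2.2 toward 2.4 and lower the white
# level slightly, e.g. to match a print-viewing brightness target.
corrected = 235.0 * (levels_in / 255.0) ** (2.4 / 2.2)

lut_8bit = np.round(corrected).astype(np.uint8)  # what an 8-bit card LUT can store
print("input levels: 256, distinct levels after the 8-bit LUT:",
      len(np.unique(lut_8bit)))
# The count comes out below 256; those missing levels are the banding risk that
# hardware-calibrated displays with internal high-bit LUTs avoid.
```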

More here on the subject:

https://forums.adobe.com/message/10958873#10958873



photo7734
Known Participant
March 3, 2019
I was referencing Skippi's hierarchy of importance, where he saw purchasing a custom-calibrated printer as something that should be done before purchasing a 10-bit-capable wide-gamut monitor. That would only hold if you printed every photo you edited. For most people, it seems to me, the monitor would be used to edit a much larger number of photos than would ever be printed. Also, it is not unusual to farm out the printing to a specialist.

There's no doubt that the vast majority of monitors in use are not 10-bit wide-gamut displays. But 10-bit wide-gamut monitors are considered the best for a variety of reasons, and I am convinced by those reasons. It seems obvious that if Adobe sees its products as the best editing software, it needs to support the best monitors. I am not a programmer, but I assume supporting 10-bit monitors would not prevent folks with standard monitors from using the programs.
Participant
March 3, 2019
Very few manufacturers promote true 10-bit monitors, mostly Eizo, NEC and BenQ. I don't understand what you mean by not putting a printer in front of 10-bit colour.
photo7734
Known Participant
March 3, 2019
I agree that a wide-gamut monitor is important, but as near as I can tell they all promote 10-bit color as an integral part of their improved display. The number of photos I would edit would be much greater than the number I would print. I would not put a printer in front of 10-bit color.
athelas64
Known Participant
February 11, 2019
I think reason one is that only a teeny-tiny portion of LR users are actually capable of running a 10-bit display workflow.
Reason two: they have not really made it work properly even in Photoshop yet (it works only at particular zoom levels, and only when no tools, including the selection/marquee tool, are in use).
Reason three: human vision: on a photograph it would be a challenge to see a difference between dithering and true 10-bit at viewing distance, unless in a specific use case (black-and-white photography, or monochromatic photography in whatever color).
Reason four: as mentioned, internally all the calculations are 16/32-bit, so the display depth does not limit the internal processing.
Reason five: a possible performance tax on GPU acceleration. Essentially, GPU acceleration would not be able to share the same code path between Radeon/GeForce and FirePro/Quadro cards.

And last but not least: you would have to hire extra support staff just to troubleshoot people's buggy 10-bit workflows, over which you have very little control, yet in the end the trouble lands in Lightroom's lap.

Do I want 10bit in Lightroom? Hell yes!
Do I need it? Hm....
Would I be able to see the difference in my pictures? Oh, I wish...
Are there other, more important issues I would benefit from more if Lightroom addressed them? DON'T GET ME STARTED.... 😄

Get a high-quality wide-gamut display with hardware calibration and guaranteed uniform backlighting (SpectraView or equivalent).
Make sure you have the means for monthly hardware calibration.
Secure a high-quality photo-printing workflow with a custom-calibrated printer for all the media you plan to use.
Now you have done a lot for high-quality output. 99 percent of it.
The remaining one percent is your 10-bit display in Lightroom (and believe me, there are far better apps with much deeper control to print from than Lightroom).
Participant
February 9, 2019
What would be interesting to know is WHY Adobe has limited Lightroom in this way. There must be a justifiable reason; otherwise I am sure 10-bit would have been enabled by now. However, it is definitely a desirable capability, and I would add my voice to the request for full 10-bit monitor output.
Inspiring
October 29, 2018
Sad, indeed. I'm looking to move away from LR/Adobe anyway, ever since that whole Lightroom Classic / Lightroom CC / Lightroom bitch move.
athelas64
Known Participant
October 28, 2018
When Adobe runs out of ideas for LR, and when the whole (often misunderstood) HDR hype takes over the world, then and only then will this "nobody needs this" feature become an "essential must-have". Many modern image formats, including HEIF, support more than 8 bits per channel, unlike JPEG. Once your phone can display that but your mega rig with the latest Lightroom cannot, we might see some change.
That being said, my latest Win10 machine with a Quadro card and a 10-bit display just will not get 10-bit through in any application, no matter what. It worked before, but not anymore.
Inspiring
December 21, 2017

I'm in the process of considering upgrading my computer, monitor(s) and printer to enable a full 30-bit workflow. I am currently using a PC with Windows 7 SP1, a Radeon R9 390 graphics card and two 8-bit (6+2 in reality) 23" Dell monitors, as well as Lightroom 6. I am relatively new to Lightroom 6 and still learning, but I really like it so far - very easy and intuitive software. Having just spent $150 on my single, perpetual license, I am reluctant to change to new photo editing software again.

So, it is clear that Lightroom 6 will not support 10-bit display output on Windows, not even the latest standalone version 6.14. However, after reading this thread I am left with the impression (rightly or wrongly) that Lightroom 6 will support a 10-bit final monitor color transform (for display) on a Mac with a 4K or 5K 10-bit display and the latest OS. Is this impression correct?

What I am really trying to understand is whether there is a distinct advantage to changing to a Mac with a 4K/5K 10-bit display as opposed to upgrading my existing Windows-based system. Will I unlock the potential for a true 30-bit end-to-end workflow with Lightroom 6, or do I need to change to Photoshop even on a Mac?

Looking at this from a different angle, will I be able to perceive differences in photos generated by Lightroom 6 on a true end-to-end 30-bit system compared to a 24-bit system? How much will this in practice affect color range (i.e., the ability to utilize and display sRGB vs. Adobe RGB spaces), banding in softly graduated tones, and so on? Put another way, does Lightroom 6 already somehow compensate for these issues if it is in fact restricted to 8-bit output even on a Mac?
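As a rough, back-of-the-envelope way to think about the banding part of that question (the sRGB vs. Adobe RGB gamut question is a separate axis from bit depth): a soft gradient that spans only a fraction of the tonal range gets correspondingly few code values. The numbers below are assumptions chosen for illustration, not measurements.

```python
# Rough arithmetic for the banding question: how many distinct tonal steps fall
# inside a soft gradient that spans only part of the tonal range, at 8 vs 10
# bits per channel. Purely illustrative; visibility also depends on noise,
# dithering in the output chain, and viewing conditions.
span = 0.10                        # gradient covers 10% of the channel range (e.g. a clear sky)
width_px = 3000                    # width of the gradient on screen or in print, in pixels

for bits in (8, 10):
    levels = 2 ** bits
    steps = span * (levels - 1)    # distinct code values available inside the gradient
    band_width = width_px / steps  # how wide each visible band would be, in pixels
    print(f"{bits}-bit: ~{steps:.0f} steps across the gradient, "
          f"bands ~{band_width:.0f} px wide")
```

In practice, image noise and any dithering in the output chain often mask these steps on real photographs, which is part of why several posters above doubt the visible benefit outside of synthetic gradients.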