photo7734
Known Participant
March 2, 2019
Question

Does LR Classic 8.2 support 10-bit color output or otherwise provide for use of a wide gamut monitor?

  • March 2, 2019
  • 4 replies
  • 8606 views

It appears to me that wide gamut monitors require the use of 10-bit color. An answer to a question about an earlier version of Lightroom says that it does not support 10-bit color. Has LR Classic 8.2 been updated to support 10-bit color? Am I wrong in thinking that 10-bit color is necessary for displaying a wide gamut color space? What is the hardware recommendation for using a wide gamut monitor in Lightroom?

4 replies

TheDigitalDog
Inspiring
March 3, 2019

photo7734  wrote

It appears to me that wide gamut monitors require the use of 10-bit color. An answer to a question about an earlier version of Lightroom says that it does not support 10-bit color. Has LR Classic 8.2 been updated to support 10-bit color? Am I wrong in thinking that 10-bit color is necessary for displaying a wide gamut color space? What is the hardware recommendation for using a wide gamut monitor in Lightroom?

There are two separate areas of bit depth here: the document's encoding and the processing of that data, and the bit depth of the display system, which is entirely independent of the first (document/processing).

LR can handle high-bit data and processes it in high bit. That isn't necessarily 16-bit, 12-bit, or 10-bit as such; it depends on the data for one part and on the internal processing for the other (much like Photoshop, which actually uses 15+1 bits for processing).

For video out to the display, you need a fully high-bit path for this to be truly high bit. Otherwise there's a conversion from high bit down to 8 bits per color and back again; not ideal. For a fully high-bit video path, the display, the video card, the application, and the OS must all support it. This is absolutely the case in Photoshop on both Mac and Windows with modern OSes. Your video card and display are the iffy part; you'll have to check.

ONLY the Develop module would support high-bit previews, and it should do so if everything else above is in place. You can test this by downloading and unzipping the document below; it MUST be viewed at 1:1 (100%). It should show NO banding whatsoever. If it shows none, you have a high-bit video path:

http://digitaldog.net/files/10-bit-test-ramp.zip
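If you want to experiment with a ramp of your own, here is a minimal sketch (assuming Python with NumPy and imageio installed, and that imageio's PNG writer handles 16-bit data, which recent versions do; the file name and dimensions are arbitrary). The linked ZIP above remains the proper test file.

# Build a 1024-step grayscale ramp and save it as a 16-bit PNG so a true
# high-bit display path has more levels to show than an 8-bit one can.
import numpy as np
import imageio.v3 as iio

width, height = 1024, 256
levels = np.arange(width, dtype=np.uint16)      # 0..1023: one 10-bit level per column
ramp = np.tile(levels * 64, (height, 1))        # scale 10-bit values into the 16-bit range

iio.imwrite("10bit_test_ramp.png", ramp)

# Viewed at 1:1, a genuine high-bit path renders all 1024 levels as a smooth ramp;
# an 8-bit path collapses them to 256 levels, showing 4-pixel-wide vertical bands.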

Gamut and bit depth are mostly unrelated. Gamut defines the range of colors that are possible within a color space.

Bit depth is the math: the encoding used to divide that range into a number of possible values. This PDF outlines the vast differences between color, color gamut and bit depth:

http://digitaldog.net/files/ColorNumbersColorGamut.pdf

Author, "Color Management for Photographers" & "Photoshop CC Color Management" (Pluralsight)
Known Participant
October 14, 2019
I'm stepping into this discussion because I'd like to know how to connect my old NEC PA271W to a new 2018 i7 Mini in order to get the most out of it in terms of gamut and bit depth. The choice is between a DisplayPort to Thunderbolt 3 cable and a DVI to Thunderbolt 3 adapter. Will I get the same result with either connection?
Community Expert
March 2, 2019

Another thing most 10-bit-capable monitors do that genuinely helps with Lightroom's display, even though Lightroom does not support 10 bits directly (though, as already noted, it supports wide gamut just fine), is use an internal hardware LUT in the monitor itself to reproduce the desired gamma curve exactly. This minimizes the banding caused by 8-bit gamma-correction math in the graphics card, and it makes a 10-bit monitor worthwhile for Lightroom even without direct support. Do note that to get this advantage you absolutely have to calibrate with compatible calibration hardware that can write to the monitor's internal LUT.
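To make that concrete, here is a small numeric sketch (Python/NumPy assumed; the gamma adjustment value is an arbitrary example, not a real calibration). Applying the same correction curve through an 8-bit video-card LUT throws away levels, while a high-bit monitor LUT keeps all 256 input levels distinct.

import numpy as np

gamma_fix = 2.2 / 1.8                      # arbitrary example correction, not a real profile
signal = np.linspace(0.0, 1.0, 256)        # the 256 levels an 8-bit signal can carry
curve = signal ** gamma_fix

lut8 = np.round(curve * 255).astype(np.uint8)        # correction applied in the video card's 8-bit LUT
lut16 = np.round(curve * 65535).astype(np.uint16)    # correction applied in a 16-bit monitor LUT

print("distinct output levels via 8-bit LUT :", len(np.unique(lut8)))    # fewer than 256 -> banding
print("distinct output levels via 16-bit LUT:", len(np.unique(lut16)))   # all 256 preserved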

D Fosse
Community Expert
March 2, 2019

Yes, absolutely. This is exactly what I meant to imply by "other factors are more important".

If you're considering the CG277, a large part of what sets it apart is the integrated ColorNavigator software, which communicates directly with the unit's onboard circuitry. Don't even think about using any other calibrator. Banding is simply not an issue with these monitors, even in an 8 bit pipeline.

In any case, Lightroom doesn't really need 10 bit support. On-screen banding is rarely visible with photographs as long as you have a fairly decent display system - photographs always contain just enough noise to break up any 8 bit banding. It's more visible with synthetic gradients in Photoshop.
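A quick numerical way to see that last point (an illustration only, in Python/NumPy with made-up values): a perfectly clean shallow gradient quantized to 8 bits turns into a handful of hard-edged plateaus, while the same gradient with a little photographic-style noise keeps flickering between adjacent levels, so the eye never sees steps.

import numpy as np

rng = np.random.default_rng(0)
grad = np.linspace(0.40, 0.45, 4096)       # a very shallow gradient, only ~13 8-bit levels wide

clean = np.round(grad * 255).astype(int)
noisy = np.round(np.clip(grad + rng.normal(0, 0.002, grad.size), 0, 1) * 255).astype(int)

# The clean version changes level only about a dozen times (hard band edges);
# the noisy version changes thousands of times, dithering the bands away.
print("level transitions, clean:", np.count_nonzero(np.diff(clean)))
print("level transitions, noisy:", np.count_nonzero(np.diff(noisy)))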

photo7734
Author
Known Participant
March 3, 2019

I am not truly considering purchasing the CG277; it's outside of my price range. The language I quoted was a stepping-off point for my search for a photo-editing monitor generally. I am convinced that a wide gamut display is essential, and I have been considering a variety of sizes and resolutions. I was starting to settle on a BenQ SW271.

This discussion grew out of my search for a GPU that meets the Lightroom recommendation of 4GB of VRAM for a 4K monitor and would provide the 10-bit output that wide gamut monitors promote. Photoshop will support 10-bit color, and I was working through some cognitive dissonance over LR operating in the ProPhoto RGB color space in the Develop module yet not providing a 10-bit color display. I am struggling to accept the helpful and gracious advice I am getting here, and I am telling myself that I must accept the idea that 8-bit color will still afford me a 99% Adobe RGB color space.

The question now seems to be: if I choose to edit in PS from LR, can I expect the GPU to display 8-bit color in LR and 10-bit in PS? I expect my workflow to start in LR. If I choose to edit in PS, is there an automatic transition to 10-bit in PS and then back to 8-bit when I go back to LR?

Thank you all. I welcome your thoughts and advice.

D Fosse
Community Expert
March 2, 2019

Yes, you're reading it wrong. Bit depth and gamut are different and unrelated things. You can have one and not the other, both ways.

With a standard 8 bit display system, you have 256 discrete steps per channel. With 10 bit, you have 1024. Whether your monitor is wide or standard gamut, the full range of colors will be represented in that number of discrete steps.

The advantage of 10 bit color is less visible banding. That's all.

Theoretically, a wider color space increases the distance between those fixed steps and could increase the risk of visible banding. In practice, this isn't significant for monitors. Other factors are much more important. But as a color space gets really big, like ProPhoto, it becomes a consideration.
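A rough way to put numbers on that, as an illustrative sketch only (the 1.4x gamut factor is an assumption, not a measured value): the same number of steps simply gets stretched across whatever range the display covers.

# 8 bit vs 10 bit: how many steps per channel, and how far apart those steps
# sit if the range they must span grows by an assumed 40%.
steps_8bit = 2 ** 8 - 1      # 255 intervals between 256 levels
steps_10bit = 2 ** 10 - 1    # 1023 intervals between 1024 levels

for gamut_name, span in [("standard gamut", 1.0), ("wide gamut (assumed 1.4x)", 1.4)]:
    print(f"{gamut_name}: 8-bit step = {span / steps_8bit:.5f}, 10-bit step = {span / steps_10bit:.5f}")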

dj_paige
Legend
March 2, 2019

No version of Lightroom supports 10 bit color.

You can use a wide gamut monitor with LR; it will show you a wider selection of colors than a normal monitor.

photo7734
Author
Known Participant
March 2, 2019

Here is a quote from Eizo in regards to its CG277 monitor:

"Using the DisplayPort input, the monitor offers 10-bit simultaneous color display from a 16-bit look-up table which means it can show more than one billion colors simultaneously. This is 64 times as many colors as you get with an 8-bit display which results in even smoother color gradations and reduced Delta-E between two adjacent colors. A graphics board and software which supports 10-bit output are also necessary for 10-bit displays."

I read this to say 10-bit output is necessary to enable wide gamut color display, i.e., a wider selection of colors. Am I reading this wrong? I believe every wide gamut monitor I have considered touts 10-bit color, and I have assumed it is the key to their wide gamut display. Am I to understand LR can provide a wide gamut display with only 8-bit color?
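For reference, the "more than one billion" and "64 times" figures in that Eizo quote are pure bit-depth arithmetic and say nothing about gamut by themselves; a quick check (Python):

per_channel_8  = 2 ** 8       # 256 levels per channel
per_channel_10 = 2 ** 10      # 1024 levels per channel

print(per_channel_8 ** 3)                           # 16,777,216 colors at 8 bits/channel
print(per_channel_10 ** 3)                          # 1,073,741,824 colors ("more than one billion")
print(per_channel_10 ** 3 // per_channel_8 ** 3)    # 64 ("64 times as many colors")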