Open for Voting

P: Add 10-bit/channel color display support

  • August 5, 2011
  • 67 replies
  • 4970 views

Lightroom: Add 10-bit support for end-to-end 10-bit workflow. That is, Application->Driver->Graphics Card->Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, why not Lightroom?

67 replies

MadManChan2000
Adobe Employee
May 10, 2016
Steve, when saving 8-bit images to disk or opening them in Photoshop from ACR or Lightroom, the quantization to 8 bits is always done as the final step. Any color space conversions (e.g., from raw to Adobe RGB or sRGB) are done before that.

When displaying images to the screen inside of ACR or Lr's Develop module, the exact math and precision level will depend on a few things (including CPU vs GPU use), but the color space conversion to the display profile happens before any needed quantization to the display precision.  For example, for a "standard" 8-bit display pipeline we first convert to the display profile (with >= 16-bit math) and then quantize to 8 bits (with dither).
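As a rough illustration of that ordering (color-manage at high precision first, quantize to 8 bits with dither only as the last step), here is a minimal numpy sketch; the 3x3 matrix and the simple random dither are placeholders, not Adobe's actual transform or dithering method:

import numpy as np

def to_display_8bit(rgb_float, display_matrix):
    # rgb_float: values in [0, 1], shape (..., 3); display_matrix: placeholder 3x3
    converted = np.clip(rgb_float @ display_matrix.T, 0.0, 1.0)   # conversion at float precision
    dither = (np.random.random(converted.shape) - 0.5) / 255.0    # +/- half an 8-bit step
    return np.clip(np.round((converted + dither) * 255.0), 0, 255).astype(np.uint8)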
Inspiring
May 10, 2016
Also remember that Lightroom manages the color profiles on Windows (a good thing too, as Windows is - still - the village idiot when it comes to color management), so Lightroom *could* do the conversion in the best order to drive an 8-bit monitor. Having said that, even with dithering, there is a distinct (visible) quality improvement in going from
  • 10-bit source to 8-bit monitor interface
    -to-
  • 10-bit source to 10-bit monitor interface
Even the projector used by my camera club is 10-bit capable and is visibly better with a full 10-bit interface.

This seems to apply just as much to nature (trees, leaves, stone etc. - especially when distant) as it does to artificial patterns. I assume that our internal image processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.

There is also some interesting stuff happening around video as the extended video formats arrive. It is obvious that the eye-brain processing that goes on picks up on some remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.
ssprengel
Inspiring
May 10, 2016

This answer is less specific than the previous one, so let me be clear about what I'm saying. Yes, it is about the conversion of the 16-bit ProPhotoRGB colorspace to the 8-bit monitor colorspace on Windows, which would be close to either AdobeRGB for wide-gamut monitors or sRGB for non-wide-gamut monitors; neither is as wide as ProPhotoRGB.

It was the particular ordering of the color-space conversion and the 16-bit to 8-bit truncation that seemed wrong to me.  Let me explain in more detail so there is no confusion:

If I were doing the 16-bit ProPhotoRGB to 8-bit monitor-RGB conversion in PS, I would do it in two steps:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 16-bit sRGB (for example): 64K minus the colors that are out of gamut, which is fewer but still many more than 256 colors per channel.

2)  16-bit sRGB to 8-bit sRGB - 256 colors per channel


Now, from what you said happens in the CPU computation of internal LR colors to monitor colors, this is the order:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 8-bit ProPhotoRGB (256 colors per RGB channel).

2)  8-bit ProPhotoRGB (256 colors per RGB channel) to 8-bit sRGB (256 colors per RGB channel minus the out-of-gamut colors for sRGB, so less than 256 colors per RGB channel).

So in the optimal way we get 256 colors for each of the three RGB channels.

And in the non-optimal way we get fewer than 256 colors per channel--only the levels left after the PPRGB colors that are out of gamut in sRGB are thrown away.

Each level in each channel leads to a band on the screen.  In a smooth gradient such as the sky, those bands are visible.  With fewer than 256 colors per RGB channel there are fewer bands and they are more prominent.
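To make the effect concrete, here is a toy numpy sketch (a made-up per-channel gamut remapping, not the real PPRGB-to-sRGB transform) that counts the distinct 8-bit levels each ordering produces on a smooth gradient:

import numpy as np

# Toy model only: pretend the in-gamut portion of the wide space is [0.2, 0.8]
# and the narrow-space conversion stretches it onto [0, 1] with clipping.
def to_narrow(x):
    return np.clip((x - 0.2) / 0.6, 0.0, 1.0)

def quantize8(x):
    return np.round(np.clip(x, 0.0, 1.0) * 255.0)

gradient = np.linspace(0.2, 0.8, 1 << 16)                        # smooth "16-bit" ramp

optimal = quantize8(to_narrow(gradient))                         # convert first, quantize last
nonoptimal = quantize8(to_narrow(quantize8(gradient) / 255.0))   # quantize first, then convert

print(np.unique(optimal).size, "vs", np.unique(nonoptimal).size, "distinct levels")
# with this toy mapping: roughly 256 vs about 154 distinct levels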

So my question is: why is this non-optimal conversion of 16-bit PPRGB to 8-bit monitor-RGB being done?  Or is the analysis of it being non-optimal for the colors-per-channel not right?  Or is my interpretation of what you said not right, and the ordering is actually the optimal one--the colorspace conversion occurs in 16 bits in both the GPU and CPU, and the CPU path truncates to 8 bits afterwards?
---
As an appendix, here is a real-world example comparing the number of colors in a document converted optimally and non-optimally from 16-bit ProPhotoRGB to 8-bit sRGB. 

To create this example document, I drag-sized the PS color palette to maybe 1/6th of the screen, took a screenshot, copy-pasted that screenshot into a PS document, converted it to 16 bits, assigned my monitor profile, converted to ProPhotoRGB, bilinear-resampled up to 1024x1024 pixels, Gaussian-blurred it with an 8-pixel radius, increased the saturation by 20, and Gaussian-blurred it with an 8-pixel radius again, resulting in the starting document with the following properties:

1024x1024 dimensions = 1048576 total pixels, 1045410 total colors as counted by this plug-in: http://telegraphics.com.au/sw/product/CountColours

If I convert optimally from 16-bit PPRGB to 16-bit sRGB and then to 8 bits, there are 305598 colors.

If I convert non-optimally from 16-bit PPRGB to 8 bits and then convert to sRGB, there are 165603 colors, a bit more than half as many as doing it the optimal way.

I had dithering turned off for the ProPhotoRGB-to-sRGB color profile conversion, assuming this is what happens when converting to the monitor profile via the CPU to make it faster.  If not, then please enlighten me.
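Incidentally, the unique-color count itself is easy to reproduce; a rough numpy equivalent of what such a color-counting plug-in does (not that plug-in's actual code) is:

import numpy as np

def count_colors(img):
    # img: (height, width, 3) uint8 array; pack each RGB triple into a single integer
    flat = img.reshape(-1, 3).astype(np.uint32)
    packed = (flat[:, 0] << 16) | (flat[:, 1] << 8) | flat[:, 2]
    return np.unique(packed).size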

Adobe Employee
May 10, 2016
Maybe I did not make it clear. 

Lr always uses the ProPhotoRGB color space and 16-bit math internally for its processing (it will use floating-point math if the source is an HDR). This is true for both the CPU and GPU renders. In general, Lightroom is also responsible for the final color transform from 16-bit ProPhotoRGB to the destination monitor color space, to avoid the sub-optimal result you mentioned.

The only exception to this is CPU rendering on Mac, where Lightroom leaves it to Mac OS X to do the final monitor display color transform. In that case, Lightroom is passing the 16-bit ProPhotoRGB image data directly to Mac OS X, so there is no loss of fidelity there. On Windows, the image data resulting from the final monitor color transform is currently limited to 8-bit. On Mac, it is 16-bit.
ssprengel
Inspiring
May 9, 2016
So why is the CPU rendering non-optimal?
Adobe Employee
May 9, 2016
For the GPU rendering path, the display color transform is handled entirely within the internal OpenGL rendering pipeline, so the behavior there is just as you expected. The 8-bit display color transform path applies only to the legacy CPU rendering path, and only on Windows.
ssprengel
Inspiring
May 9, 2016

So on Windows LR truncates the 16-bit PPRGB to 8-bit PPRGB and then transforms to 8-bit "Monitor RGB"?  Isn't this non-optimal, and won't it lead to unnecessary banding and posterization?

If I was converting color spaces in Photoshop from 16-bit PPRGB to an 8-bit, narrower monitor colorspace, I'd transform to the monitor colorspace while still in 16-bit, then truncate to the monitor color depth.

Is this non-optimal transform being done for speed, or because the GPU can't do the transform in 16-bit?  Or am I wrong that the order of the color-space conversion and the bit-depth conversion makes a difference?

Adobe Employee
May 9, 2016
The internal image pipeline processes it in 16-bit on Windows also. Since the thread is about color display, my comment above is purely about the last stage of the display color transform for the monitor.
Known Participant
May 9, 2016
Simon,
I guess I don't understand. You mean LR is rendering in only 8-bit on Windows??? I'm not talking about the display, but the internal processing.
Adobe Employee
May 9, 2016
On Windows, it is still 8-bit ProPhotoRGB.