
P: Add 10-bit/channel color display support

LEGEND, Aug 04, 2011

Lightroom: Add 10-bit support for end-to-end 10-bit workflow. That is, Application->Driver->Graphics Card->Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, why not Lightroom?

Idea · No status
Topics: macOS, Windows
Views: 3.4K


66 Comments
Mentor, Aug 04, 2011

"At the moment, all that's missing is the application."

There are thousands of feature requests, so obviously that isn't all.

Contributor, Aug 04, 2011

He means that the driver, graphics card, and monitor support 10-bit color. Only the app is missing 10-bit support.

OS X doesn't support 10-bit, though. Windows does.

LEGEND, Aug 04, 2011

Also, we can only support 10-bit/channel through OpenGL, which requires a compatible card and driver version.
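For anyone curious what "through OpenGL" means in practice: the application has to ask the driver for a 10-bit-per-channel framebuffer when it creates its rendering context, and the driver is free to fall back to 8 bits if the card/driver/monitor chain can't deliver. Here is a minimal sketch of that request using the pyGLFW and PyOpenGL bindings; it is only an illustration of the mechanism, not anything Lightroom actually does.

```python
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

glfw.init()

# Ask for a 30-bit (10 bits per channel) color buffer; the driver may
# silently fall back to 8 bits per channel if the chain can't do it.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10-bit framebuffer test", None, None)
glfw.make_context_current(window)

# In a (default) compatibility context, GL_*_BITS reports what we actually got.
print("bits per channel:",
      glGetIntegerv(GL_RED_BITS),
      glGetIntegerv(GL_GREEN_BITS),
      glGetIntegerv(GL_BLUE_BITS))

glfw.terminate()
```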

LEGEND, Aug 05, 2011

Correct, yes, the compatible card and driver are required, as well as a monitor.

My point is really:
If a user has invested in a 10-bit monitor and a 10-bit compatible card/driver combination (e.g. a FirePro from ATI), they should be able to get applications that support this.

The prime example is that CS5 does support 10-bit (if all the other pieces are in place) through OpenGL, so as a related product it would be great if Lightroom did too.

LEGEND, Jun 06, 2012

DSLRs are increasingly pushing the 8-bit envelope. Why would Adobe be content letting their software limit my photography?

LEGEND, Jun 06, 2012

The bit depth of the data doesn't matter here -- Lightroom already processes images at the depth of the camera or in 32-bit floating-point format. This topic is about supporting newer video cards and displays that can use 10-bit/channel framebuffers.

Enthusiast, Jun 07, 2012

Furthermore, the bit depth of cameras (raw) and of monitor output are not comparable at all: camera bit depth refers to a linear tonal scale (gamma = 1), and monitor output to a non-linear scale (mostly gamma = 2.2).

The existing 8-bit output resolution will not really limit your photography (e.g. in terms of dynamic range). The advantage of 10-bit is "only" that it gives you more tonal resolution on the display. For critical photos, this can improve display accuracy by reducing banding/posterization in regions with subtle tonal or color changes. So the feature is of course desirable, but not essential.
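To put a rough number on "more tonal resolution": because the monitor signal is gamma-encoded, the extra two bits matter most in the darkest tones, where banding is usually noticed first. A back-of-the-envelope sketch, assuming a plain gamma-2.2 encode (this is only an illustration, not how Lightroom or any particular monitor actually encodes):

```python
import numpy as np

def codes_in_linear_range(bits, lin_lo, lin_hi, gamma=2.2):
    """Count encoded code values whose decoded linear light lands in [lin_lo, lin_hi)."""
    codes = np.arange(2 ** bits)
    linear = (codes / (2 ** bits - 1)) ** gamma   # decode gamma to linear light
    return int(np.count_nonzero((linear >= lin_lo) & (linear < lin_hi)))

# Tonal steps available in the darkest ~1% of linear light:
print("8-bit :", codes_in_linear_range(8, 0.0, 0.01))    # about 32 steps
print("10-bit:", codes_in_linear_range(10, 0.0, 0.01))   # about 127 steps
```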

Enthusiast, Jun 07, 2012

Additional question: Does LR use dithering in the Develop module? I just loaded the 10-bit test ramp from http://www.imagescience.com.au/kb/que... into Photoshop and LR (8-bit output). In Photoshop I see the 8-bit banding, as does the LR Loupe (because previews are 8-bit JPEGs). But in LR Develop, there is no visible banding!

This would actually be great, because it is the next best thing to 10-bit support.

Edit: Yes, LR Develop really does seem to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the Develop module, contrast amplified in PS):

LEGEND, Jun 07, 2012

Yes, Develop and Export (to 8 bpp) dithers.
http://forums.adobe.com/message/3655226
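For anyone wondering how dithering hides the banding: adding a small amount of noise before rounding to 8-bit breaks the hard band edges into a pixel-level pattern that the eye averages out. A toy sketch of the idea with simple random dither on a synthetic ramp (the actual dithering method Lightroom uses is not documented here):

```python
import numpy as np

ramp = np.linspace(0.20, 0.25, 2000)        # a subtle tonal ramp, values in 0..1

plain    = np.round(ramp * 255)              # straight quantization: ~14 wide bands
dithered = np.round(ramp * 255 + np.random.uniform(-0.5, 0.5, ramp.size))

print("levels, plain   :", len(np.unique(plain)))      # ~14 distinct values
print("levels, dithered:", len(np.unique(dithered)))   # roughly the same set of values,
# but neighbouring pixels now alternate between adjacent levels, so the local
# average tracks the original smooth ramp instead of stepping.
```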

Enthusiast, Jun 07, 2012

Thanks... yes, you already mentioned that here, too: http://feedback.photoshop.com/photosh... ... But I seem to have overlooked it, or simply forgotten ;-)

So... is it safe to assume that because of this, LR does not really need 10-bit output? I mean, if the dithering works so well on artificial images, it should work nearly perfectly on most real-world photos. The only case where 10-bit would have an advantage is when there are subtle tonal *patterns* (e.g. fabrics, etc.) just a few pixels in size, which cannot be reproduced using dithering. But I doubt that such patterns can really be seen even with 10-bit.

LEGEND, May 22, 2014

I fully understand that you can't give Lightroom too many of the attributes that Photoshop has, but a 10-bit monitor is just superb [using PS CC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 having the same facility. It would also give the two applications greater synergy when viewing/editing.

LEGEND, Jan 13, 2016

I am a very satisfied Lightroom user, but the lack of 30-bit (10-bit/channel) color display capability is a major shortcoming for pro-grade software.
I hope this feature will be added soon.

Guest, Jan 13, 2016

We photographers need Lightroom to support 10-bit display output, so that our advanced displays (models from Eizo or NEC, for example) can actually deliver their 10-bit, billion-color depth.
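For reference, the "billion colors" figure is simply the per-channel level count cubed:

```python
# Per-channel levels and total representable colors at each bit depth.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:4d} levels per channel, {levels ** 3:,} total colors")

# 8-bit :  256 levels per channel,    16,777,216 total colors
# 10-bit: 1024 levels per channel, 1,073,741,824 total colors
```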

Explorer, Apr 09, 2016

Having just tested 10-bit support in video media with a good (but not ridiculous) rig and seen the difference it makes, I am very keen to get 10-bit support in ALL my photo apps.

Windows 10, Nvidia 980 Ti, Dell U2713H. I know this monitor is doing slightly fake 10-bit (just as many laptops do fake 8-bit), but the improvement is significant.

Participant, May 08, 2016

My Nikon D3 produces RAWs with 14-bit color depth.  With 8-bit processing in LR, I'm throwing away 6 bits of color depth smoothness and gradation.  As others have pointed out, Photoshop has 10-bit color support, so it's not something that Adobe can't do. And nowadays, both Windows and OS X support 10-bit color.

LEGEND, May 08, 2016

I'm pretty sure that LR uses a 16-bit internal working space, while the display is only 8-bit, so the term "8-bit processing" isn't quite right.

Adobe Employee, May 09, 2016 (Correct answer)

Steve is correct.

On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.

On Windows: It also uses the same 16-bit ProPhotoRGB math for internal image renders, but the image data resulting from the final monitor color transform (for display) is currently limited to 8-bit.

Adobe Employee, May 09, 2016

On Windows, it is still 8-bit ProPhotoRGB.

Engaged, May 09, 2016

Simon,
I guess I don't understand. You mean LR is rendering in only 8-bit on Windows??? I'm not talking about the display, but about the internal processing.

Adobe Employee, May 09, 2016

The internal image pipeline processes in 16-bit on Windows as well. Since the thread is about color display, my comment above is purely about the last stage: the display color transform for the monitor.

LEGEND, May 09, 2016

So on Windows LR truncates the 16-bit PPRGB to 8-bit PPRGB then transforms to 8-bit "Monitor RGB"?    Isn't this non-optimal and will lead to unnecessary banding and posterization?

If I were converting color spaces in Photoshop from 16-bit PPRGB to an 8-bit, narrower monitor colorspace, I'd transform to the monitor colorspace while still in 16-bit and then truncate to the monitor color depth.

Is this non-optimal transform being done for speed purposes or because the GPU can't do the transform in 16-bit?  Or am I wrong that the order of the color-space conversion and bit-depth-conversion makes a difference?

Adobe Employee, May 09, 2016

For the GPU rendering path, the display color transform is handled entirely within the internal OpenGL rendering pipeline, so the behavior there is just as you expected. The 8-bit display color transform path applies only to the legacy CPU rendering path, and only on Windows.

LEGEND, May 09, 2016

So why is the CPU rendering non-optimal?

Adobe Employee, May 09, 2016

Maybe I did not make it clear. 

Lr always uses the ProPhotoRGB color space and 16-bit math internally for the processing (it will use floating-point math if the source is an HDR). This is true for both the CPU and GPU renders. In general, Lightroom is also responsible for the final color transform from 16-bit ProPhotoRGB to the destination monitor color space, to avoid the sub-optimal result you mentioned. The only exception to this is CPU rendering on Mac, where Lightroom leaves the final monitor display color transform to Mac OS X. In that case, Lightroom passes the 16-bit ProPhotoRGB image data directly to Mac OS X, so there is no loss of fidelity there. On Windows, the image data resulting from the final monitor color transform is currently limited to 8-bit. On Mac, it is 16-bit.

LEGEND, May 09, 2016

This answer is less specific than the previous one, so let me be clear about what I'm saying. Yes, it is about the conversion of the 16-bit ProPhotoRGB colorspace to the 8-bit monitor colorspace on Windows, which would be close to either AdobeRGB for wide-gamut monitors or sRGB for non-wide-gamut monitors; neither is as wide as ProPhotoRGB.

It was the particular ordering of the color-space conversion and the 16-bit to 8-bit truncation that seemed wrong to me.  Let me explain in more detail so there is no confusion:

If I was doing the 16-bit ProPhotoRGB to 8-bit monitor-RGB in PS I would do it in two steps:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 16-bit sRGB (for example): 64K minus the colors out of gamut, which is fewer but still many more than 256 colors per channel.

2)  16-bit sRGB to 8-bit sRGB - 256 colors per channel


Now, going by what you said happens in the CPU computation of internal LR colors to monitor colors, this is the order:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 8-bit ProPhotoRGB (256 colors per RGB channel).

2)  8-bit ProPhotoRGB (256 colors per RGB channel) to 8-bit sRGB (256 colors per RGB channel minus the out-of-gamut colors for sRGB, so less than 256 colors per RGB channel).

So in the optimal way we get 256 colors per each of the three RGB channels.

And in the non-optimal way we get less than 256 colors per channel--only the colors left after the PPRGB colors that are out-of-gamut in sRGB are thrown away.

Each level in each channel leads to a band on the screen.  In a smooth gradient such as the sky, those bands are visible.  With fewer than 256 colors per RGB channel there are fewer bands and they are more prominent.

So my question is: why is this non-optimal conversion of 16-bit PPRGB to 8-bit monitor RGB being done? Or is my analysis of it being non-optimal for the colors-per-channel not right? Or is my interpretation of what you said not right, and the ordering is actually the optimal one--the colorspace conversion occurs in 16-bit on both the GPU and CPU, and the CPU path only truncates to 8-bit afterwards?
---
As an appendix, here is a real-world example comparing the number of colors in a document converted optimally and non-optimally from 16-bit ProPhotoRGB to 8-bit sRGB. 

To create this example document, I drag-sized the PS color palette to maybe 1/6th of the screen, took a screenshot, copy-pasted that screenshot into a PS document, converted it to 16-bit, assigned my monitor profile, converted to ProPhotoRGB, bilinear-resampled up to 1024x1024 pixels, Gaussian-blurred it by 8 pixels, increased the saturation by 20, and Gaussian-blurred it by an 8-pixel radius again, resulting in a starting document with the following properties:

1024x1024 dimensions = 1048576 total pixels, 1045410 total colors as counted by this plug-in: http://telegraphics.com.au/sw/product/CountColours

If I convert optimally from 16-bit PPRGB to 16-bit sRGB then to 8-bits there are 305598 colors.

If I convert non-optimally from 16-bit PPRGB to 8-bits then convert to sRGB there are 165603 colors, which is a bit more than half as many colors compared to doing it the optimal way.

I had dithering turned off for the ProPhotoRGB-to-sRGB color profile conversion, assuming that is what happens when converting to the monitor profile via the CPU (to make it faster). If not, then please enlighten me.
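To make the ordering argument concrete without Photoshop, here is a toy numpy sketch of the same comparison. The "wide to narrow" conversion is faked as a simple stretch-and-clip of a sub-range of one channel (a real ProPhotoRGB-to-monitor ICC transform also involves a 3x3 matrix, gamma, and gamut mapping); the only point illustrated is where the 8-bit truncation happens in the chain:

```python
import numpy as np

# Fake "wide to narrow gamut" per-channel transform: pretend the narrow gamut
# occupies only the middle ~62% of the wide-gamut range, so converting
# stretches that sub-range to fill 0..1 and clips everything outside it.
def wide_to_narrow(x):
    return np.clip((x - 0.10) / 0.62, 0.0, 1.0)

to8 = lambda x: np.round(x * 255).astype(np.uint8)   # quantize 0..1 to 8-bit codes

ramp16 = np.linspace(0.0, 1.0, 65536)                # a smooth "16-bit" gray ramp

optimal    = to8(wide_to_narrow(ramp16))                 # convert first, truncate last
nonoptimal = to8(wide_to_narrow(to8(ramp16) / 255.0))    # truncate first, then convert

print("distinct output levels, optimal    :", len(np.unique(optimal)))     # 256
print("distinct output levels, non-optimal:", len(np.unique(nonoptimal)))  # ~160
```

Fewer distinct levels in the destination space means fewer, wider bands on a smooth gradient, which is the same kind of loss measured in the Photoshop color-count experiment above.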
