P: Add 10-bit/channel color display support

LEGEND, Aug 04, 2011

Lightroom: Add 10-bit support for an end-to-end 10-bit workflow, that is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, the application is all that's missing. CS5 supports 10-bit color; why not Lightroom?

Idea, no status
Topics: macOS, Windows
Views: 3.4K

1 Correct answer

Adobe Employee, May 09, 2016
Steve is correct.

On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhoto RGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.

On Windows: It also uses the same 16-bit ProPhoto RGB math for internal image renders, but the image data produced by the final monitor color transform (for display) is currently limited to 8 bits.
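To put that bit-depth difference in concrete terms, here is a minimal sketch (plain NumPy, not Adobe's code) of how many distinct output codes survive when a smooth high-precision render is quantized to 8 bits versus 10 bits:

```python
import numpy as np

# A smooth gradient standing in for Lightroom's high-precision internal render
# (the real pipeline uses 16-bit ProPhoto RGB; this ramp is only illustrative).
ramp = np.linspace(0.0, 1.0, 65536)

levels_8bit = np.unique(np.round(ramp * 255)).size    # distinct codes after 8-bit quantization
levels_10bit = np.unique(np.round(ramp * 1023)).size  # distinct codes after 10-bit quantization

print(levels_8bit, levels_10bit)  # 256 1024
```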

66 Comments
Explorer, May 10, 2016

Also worth remembering is that Lightroom manages the color profiles itself on Windows (a good thing, too, as Windows is still the village idiot when it comes to color management), so Lightroom *could* do the conversion in the best order to drive an 8-bit monitor. Having said that, even with dithering, there is a distinct (visible) quality improvement in going from
  • a 10-bit source to an 8-bit monitor interface
    -to-
  • a 10-bit source to a 10-bit monitor interface
Even the projector used by my camera club is 10-bit capable and is visibly better with a full 10-bit interface.

This seems to apply just as much to nature (trees, leaves, stone etc. - especially when distant) as it does to artificial patterns. I assume that our internal image processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.

There is also some interesting stuff happening around video as the extended video formats arrive. It is obvious that the eye-brain processing going on picks up some remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.

Adobe Employee, May 10, 2016

Steve, when saving 8-bit images to disk or opening them in Photoshop from ACR or Lightroom, the quantization to 8 bits is always done as the final step. Any color space conversions (e.g., from raw to Adobe RGB or sRGB) are done before that.

When displaying images to the screen inside of ACR or Lr's Develop module, the exact math and precision level will depend on a few things (including CPU vs GPU use), but the color space conversion to the display profile happens before any needed quantization to the display precision.  For example, for a "standard" 8-bit display pipeline we first convert to the display profile (with >= 16-bit math) and then quantize to 8 bits (with dither).
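To make that ordering concrete, here is a rough NumPy sketch of "convert at high precision first, then quantize with dither as the last step". The uniform sub-LSB dither below is an assumption for illustration, not Adobe's documented algorithm:

```python
import numpy as np

def quantize_with_dither(img, bits=8, rng=None):
    """Quantize a float image in [0, 1] to `bits` per channel, adding
    sub-LSB noise first so smooth gradients do not band up.
    Illustrative only; Lightroom's actual dither method is not specified here."""
    rng = rng or np.random.default_rng(0)
    max_code = (1 << bits) - 1
    noise = (rng.random(img.shape) - 0.5) / max_code      # +/- half an output step
    return np.clip(np.round((img + noise) * max_code), 0, max_code).astype(np.uint16)

# In the real pipeline, the high-precision conversion to the display profile
# happens *before* this final quantization; the gradient below just stands in
# for already-converted display-space data.
gradient = np.linspace(0.0, 1.0, 4096)[None, :].repeat(64, axis=0)
display_ready = quantize_with_dither(gradient, bits=8)
```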

LEGEND, May 10, 2016

OK, that's good. I had incorrectly interpreted Simon's statement, "the image data produced by the final monitor color transform (for display) is currently limited to 8 bits," to mean that the monitor profile conversion itself happened in 8-bit.

Participant, May 11, 2016

So if I have LR 6.5.1 on Windows, will I get 10-bit display output if I have the right graphics card? If yes, can anyone recommend a suitable card? I don't want to spend "too little", but I don't need the crazy-fast performance that gamers crave.

Thanks.

LEGEND, Sep 07, 2016

With the current version of LR, there is no chance of getting a 10-bit display. With Photoshop, an Nvidia Quadro K1200 is more than enough. Only workstation cards like the Quadro enable 10 bit/channel through OpenGL (which is what Photoshop supports). 10-bit display in LR would be very welcome; they lag considerably behind the Photoshop team.

LEGEND, Dec 08, 2016

No need for a Quadro anymore. Mine is an older GTX 670, and it supports 10-bit now. Go to the NVIDIA Control Panel and there is an option to select the color depth. You need a 10-bit-capable monitor and, most importantly, a DisplayPort cable connection.

LEGEND, Dec 09, 2016

Nope. You do need a Quadro.
You can enable the color depth option on your GTX 670, but you will still see banding in Photoshop on your 10-bit-capable monitor connected with DisplayPort (check it with a "10bit test ramp" file).
GeForce cards can output 10-bit color, but they only support it in applications that use DirectX. Quadro cards support 10-bit color via OpenGL. Photoshop uses OpenGL, so you need a Quadro card for 10-bit with Photoshop.
More info here:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce...
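For anyone who wants to run the same banding check, here is a hypothetical Python/Pillow sketch that writes a 16-bit grayscale ramp you can open in Photoshop; it is not the specific "10bit test ramp" file mentioned above, just a home-made equivalent:

```python
import numpy as np
from PIL import Image  # assumes Pillow is installed

# One long horizontal gradient: smooth on a true 10-bit path,
# visibly banded when the output is reduced to 8 bits.
width, height = 2048, 256
ramp = np.round(np.linspace(0, 65535, width)).astype(np.uint16)
img = np.tile(ramp, (height, 1))
Image.fromarray(img, mode="I;16").save("gray_ramp_16bit.png")
```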

LEGEND, Dec 11, 2016

Thanks for the info. It's a shame that NVIDIA refuses to support 10-bit OpenGL on mainstream cards. I get the feeling that Intel, Microsoft, NVIDIA, and ATI are the ones holding the PC back from progress.

LEGEND, Dec 22, 2016

The application is the only thing missing from a true 10-bit workflow now.

LEGEND, Jan 05, 2017

Please add 10- and 12-bit per channel output to the driver.
My monitor can handle 12 bits per channel, and only Photoshop can use it, via the 30-bit option.

LEGEND, Jan 05, 2017

There is no 12-bit panel. Some TVs may be able to accept 12-bit data, but displaying 12-bit color is another story.

New Here, Apr 13, 2017

After reading this thread, I summarize it as:

Even with a pro 10-bit-capable display AND a pro 10-bit graphics card like an NVIDIA Quadro or ATI FirePro, you can only use 8-bit color depth in Lightroom, at least on Windows.

Is that correct? And if yes, WHY? I know I might not be able to tell the difference between two nearly identical colors, whether I have 8 or 10 bits per channel, BUT I do see the difference when the dynamic range increases sharply. Compare a 256-shade (8-bit) greyscale to a 1024-shade (10-bit) greyscale; that's a massive (visible) difference.
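For what it's worth, the arithmetic behind that comparison (a quick back-of-the-envelope check in plain Python, nothing Lightroom-specific):

```python
# Size of one tonal step as a fraction of the full range, at 8 vs 10 bits per channel.
for bits in (8, 10):
    levels = 2 ** bits
    step_pct = 100.0 / (levels - 1)
    print(f"{bits}-bit: {levels} shades, one step = {step_pct:.3f}% of the range")
# 8-bit: 256 shades, one step = 0.392% of the range
# 10-bit: 1024 shades, one step = 0.098% of the range
```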

Participant, Dec 21, 2017

I'm in the process of considering upgrading my computer/monitor(s)/printer to enable a full 30-bit workflow. I am currently using a PC with Windows 7 SP1, a Radeon R9 390 graphics card, and two 8-bit (6+2 in reality) 23" Dell monitors, as well as Lightroom 6. I am relatively new to Lightroom 6 and still learning, but I really like it so far; it is very easy and intuitive software. Having just spent $150 on my single perpetual license, I am reluctant to change to new photo editing software again.

So, it is clear that Lightroom 6 will not support 10-bit display output on Windows, not even the latest standalone version 6.14. However, after reading this thread I am left with the impression (rightly or wrongly) that Lightroom 6 will support a 10-bit final monitor color transform (for display) on a Mac using a 4K or 5K 10-bit display with the latest OS. Is this impression correct?

What I am really trying to understand is whether there is a distinct advantage to changing to a Mac with a 4K/5K 10-bit display as opposed to upgrading my existing Windows-based system. Will I unlock the potential for a true 30-bit end-to-end workflow with Lightroom 6, or do I need to change to Photoshop even with a Mac?

Looking at this from a different angle, will I be able to perceive differences in photos generated by Lightroom 6 with a true end-to-end 30-bit system compared to a 24-bit system? How much will this affect, in practice, color range (i.e., the ability to utilize and display sRGB vs. Adobe RGB spaces), the elimination of banding in softly graduated tones, etc.? Put another way, does Lightroom 6 already somehow compensate for these issues in the event that it is in fact restricted to 8-bit output even on a Mac?

Explorer, Oct 28, 2018

When Adobe runs out of other ideas for LR, and when the whole (and often misunderstood) HDR hype takes over the world, then and only then will this now "nobody needs this" feature become an "essential must-have". Many modern image formats, including HEIF, do have more than 8-bit-per-channel support, unlike JPEG. In a situation where your phone can display it but your mega rig with the latest Lightroom cannot, we might see some change.
That being said, my latest Win10 machine with a Quadro card and a 10-bit display just will not get 10-bit through in any application, no matter what. It worked before, but not anymore.

LEGEND, Oct 29, 2018

Sad, indeed. I've been looking to move away from LR/Adobe anyway since that Lightroom Classic / Lightroom CC / Lightroom renaming stunt.

New Here, Feb 09, 2019

What would be interesting to know is WHY Adobe has limited Lightroom in this way. There must be a justifiable reason; otherwise, I am sure 10-bit would have been enabled. However, it is definitely a desirable attribute, and I would add my voice to the request for full 10-bit monitor output.

Explorer, Feb 10, 2019

I think reason one is that only a teeny-tiny portion of LR users is actually capable of running a 10-bit display workflow.
Reason two: they have not really made it work even in Photoshop yet (it works at particular zoom levels, and only while no tools, including the selection/marquee tool, are in use).
Reason three: human vision. On a photograph it would be a challenge to see the difference, from a viewing distance, between dither and true 10-bit, UNLESS in a specific use case (black-and-white photography, or monochromatic photography of whatever color).
Reason four: as was mentioned, internally all the calculations are 16/32-bit, so the display depth is not a limit on processing quality.
Reason five: a possible performance tax on GPU acceleration. Essentially, GPU acceleration would not be able to share the same code for Radeon/GeForce and FirePro/Quadro cards.

And last but not least: you would have to hire extra support staff just to troubleshoot people's buggy 10-bit workflows, over which you have very little control, but in the end the trouble would land on Lightroom.

Do I want 10-bit in Lightroom? Hell yes!
Do I need it? Hm....
Would I be able to see the difference in my pictures? Oh, I wish...
Are there other, more important issues I would benefit from more if Lightroom got them addressed? DON'T GET ME STARTED.... 😄

Get a high-quality wide-gamut display with hardware calibration and guaranteed uniform backlighting (SpectraView or equivalent).
Make sure you have the means for monthly hardware calibration.
Secure a high-quality photo-printing workflow, with the printer custom-calibrated for all the media you plan to use.
Now you have done a lot for high-quality output: 99 percent of it.
If you want that extra one percent, that is your 10-bit display in Lightroom (and believe me, there are far better apps with much deeper control to print from than Lightroom).

Community Beginner, Mar 02, 2019

I agree that a wide-gamut monitor is important, but as near as I can tell, they all promote 10-bit color as an integral part of their improved display. The number of photos I would edit would be much greater than the number I would print. I would not put a printer ahead of 10-bit color.

New Here, Mar 02, 2019

Very few manufacturers promote true 10-bit monitors, mostly Eizo, NEC, and BenQ. I don't understand what you mean by not putting a printer ahead of 10-bit colour.

Community Beginner, Mar 03, 2019

I was referencing Skippi's hierarchy of importance, where he saw purchasing a custom-calibrated printer as something that should be done prior to purchasing a 10-bit-capable wide-gamut monitor. That would only hold if you printed every photo you edited. For most, it seems to me, the monitor would be used for editing a much larger number of photos than would be printed. Also, it is not unusual to farm out the printing to a specialist.

There's no doubt that the vast majority of monitors in use are not 10-bit wide-gamut. But such monitors are considered the best for a variety of reasons, and I am convinced by those reasons. It seems obvious that if Adobe sees itself as making the best editing software, it needs to support the best monitors. I am not a programmer, but I assume supporting 10-bit color monitors would not prevent folks with standard monitors from using the programs.

LEGEND, Mar 03, 2019

"Very few manufacturers promote true 10-bit monitors. Mostly Eizo, NEC and BenQ."
All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color, plus a 16 bit/color internal LUT for calibration. This sets them apart from lesser displays that need to be calibrated using the monitor's external controls and profiled using the 8 bit/color LUT in the graphics card, which reduces the display's effective bit depth.

More here on the subject:

https://forums.adobe.com/message/10958873#10958873
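As a schematic illustration of how FRC gets 10-bit-like output from an 8-bit panel: the controller flickers each subpixel between the two nearest 8-bit codes so the time-average matches the 10-bit target. The sketch below uses random per-frame draws for simplicity; real panels use fixed spatio-temporal dither patterns.

```python
import numpy as np

def frc_frames(code10, n_frames=1000, seed=1):
    """Approximate one 10-bit code on an 8-bit panel by temporal dithering.
    Schematic only, not a real panel controller."""
    rng = np.random.default_rng(seed)
    target = code10 / 4.0                    # e.g. 10-bit code 513 -> 128.25 on the 8-bit scale
    low, frac = int(target), target - int(target)
    return np.where(rng.random(n_frames) < frac, low + 1, low)

frames = frc_frames(513)
print(frames.mean())  # ~128.25: the 8-bit panel averages out to the 10-bit level over time
```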



Explorer, Mar 03, 2019

First of all, I'm glad the conversation on this topic is going on. 10-bit display is dear to my heart, and I would love to see it just as much as you do, if not more, because I have had a 10-bit workflow on my desk for over three years now.

@dennis> I was referring to the printer because making prints significantly improves your photography. Nobody prints every picture they take. But spending time finessing this very last step (OK, the very last step is framing and display) will get you thinking about your picture a lot more.

Based on my experience (and the experience of my friends who are gallery specialists doing scans of artworks priced at six-plus figures and saving them in 16-bit uncompressed TIFF files), to put it in layman's terms: for general photography, 10-bit output is a luxury only YOUR MIND can appreciate. With the LUTs, high-precision internal processing, and a properly calibrated screen, the difference between FRC dither and true 10-bit output is indiscernible to your vision. Yes, you can tell in artificial monochromatic gradients. And (maybe) yes, a highly skilled radiologist would be able to tell the monitors apart. But for the rest of the cases...

todd@vpb> You are misinterpreting the facts. Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines they also have 10-bit LCD panels, with the microcontroller really sending the 10-bit value to the subpixel and really displaying all the discrete levels of brightness in each subpixel, without requiring any dithering.

Actually, going 10-bit is NOT that hard. What is hard (expensive) is achieving a more or less linear wide gamut across a useful spectrum of colors. What is even harder, and even more expensive, is achieving uniform backlighting and acceptable black levels on LCD screens. But without these, 10-bit would make even less sense than it does now.

Community Beginner, Mar 03, 2019

I'm not certain that is correct, Todd. I know that ViewSonic gives its color space support spec as 8-bit + A-FRC, with FRC being dithering that adds a 2-bit equivalent.

LEGEND, Mar 04, 2019

*Skippi said: "Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines they also have 10-bit LCD panels."
Interesting. Please provide links to specific models so we can give them consideration. Thank you, Skippi.

*Dennis Hyde said: "I'm not certain that is correct, Todd. I know that ViewSonic gives its color space support spec as 8-bit + A-FRC, with FRC being dithering that adds a 2-bit equivalent."
That's exactly what I said: "All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color."
What's not correct?

Explorer, Mar 04, 2019

todd@vpb

See the NEC manual: http://www.support.nec-display.com/dl_service/data/display_en/manual/pa271q/PA271Q_manual_EN_v4.pdf

The iMac Pro has 10-bit with dithering, but it is also advertised as such.

Edit:

I've found some conflicting reports. NEC states in this press release that the PA271Q features true 10-bit color: https://www.necdisplay.com/about/press-release/nec-display-updates-popular-professional-desktop-d/79...

However, this site states the panel is 8-bit + FRC: https://www.displayspecifications.com/en/model/73ae1389

This other model, though, is 10-bit:
https://www.displayspecifications.com/en/model/178b1656
