Lightroom: Add 10-bit support for end-to-end 10-bit workflow. That is, Application->Driver->Graphics Card->Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, why not Lightroom?
simonsaith • Adobe Employee, May 09, 2016
Steve is correct.
On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.
On Windows: It also uses the same 16-bit ProPhotoRGB math for internal image renders, but the resulting image data from the final monitor color transform (for display) is currently limited to 8-bit.
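To put a rough number on what that last 8-bit step costs, here is a small illustrative sketch (my own back-of-the-envelope, not anything from Lightroom's code) comparing how an 8-bit and a 10-bit final display transform quantize a smooth 16-bit ramp:

```python
import numpy as np

ramp16 = np.linspace(0.0, 1.0, 65536)        # idealized smooth 16-bit ramp, normalized to [0, 1]

to8  = np.round(ramp16 * 255)  / 255         # what an 8-bit display transform can keep
to10 = np.round(ramp16 * 1023) / 1023        # what a 10-bit display transform can keep

print("distinct 8-bit output levels: ", len(np.unique(to8)))    # 256
print("distinct 10-bit output levels:", len(np.unique(to10)))   # 1024
print("worst-case 8-bit rounding error: ", np.abs(ramp16 - to8).max())
print("worst-case 10-bit rounding error:", np.abs(ramp16 - to10).max())
```

Four times as many output levels means the steps between adjacent displayable values are a quarter of the size, which is exactly where visible banding in smooth gradients comes from.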
The NEC PA271Q and PA272W monitors use an 8-bit/color panel with FRC dithering to achieve 10-bit/color depth, which is 30-bit RGB, the same as the iMac Pro.
So we do have quite a few genuine 10-bit panels out there. From what I understand, it gives a truer rendition of the image without the gaps created by the inability to show intermediate colours, whether we can actually see them or not 🙂
It's debatable whether you can actually see the difference between 10-bit/color and 8-bit/color with FRC dithering. However, depending on how the dithering is implemented it may cause eyestrain for some users. I've been using my NEC PA272W monitor with 8-bit/color + FRC dithering for three years now with no issues.
Keep in mind that the FRC dithering is only used by the monitor with 10-bit/color enabled applications. Currently (for me) only PS has this capability, which is used for no more than 10% of my daily screen time (Web, Email, Editing, etc.).
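For anyone curious what FRC actually does, here is a purely conceptual sketch (my own illustration; real monitors do this in the panel electronics and the details vary by vendor): an 8-bit panel alternates between two adjacent 8-bit levels over a few frames so the time-average approximates an in-between 10-bit value.

```python
import numpy as np

def frc_frames(level10, n_frames=4):
    """Approximate one 10-bit level on an 8-bit panel by alternating frames (conceptual only)."""
    target = level10 / 1023.0                  # desired intensity in [0, 1]
    lo = int(np.floor(target * 255))           # nearest 8-bit level below the target
    hi = min(lo + 1, 255)                      # nearest 8-bit level above the target
    frac = target * 255 - lo                   # how far the target sits between the two
    n_hi = int(round(frac * n_frames))         # frames spent at the higher level
    return [hi] * n_hi + [lo] * (n_frames - n_hi)

frames = frc_frames(640)                       # a 10-bit level with no exact 8-bit match
print(frames, "time-averaged:", sum(frames) / len(frames) / 255)
print("requested:", 640 / 1023)
```

Whether that rapid alternation is visible (or tiring) is exactly the eyestrain question above; on a fast panel the average is all you are supposed to perceive.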
Todd, I meant to suggest with my Viewsonic comment that a reputable manufacturer would specify if it uses 8-bit + FRC to achieve a 10-bit equivalent. I am surprised that NEC and presumably others would not, and it is yet another complication. I am thankful to learn of the displayspecifications website and that knowledgeable folks rely on it. Tell me if I am naive to think that.
I'm finding conflicting reports about the NEC PA271Q's panel. I think it might actually be a true 10-bit panel and displayspecifications might need to update their site/verify with NEC.
Agreed. I don't know why it should be so hard to find the "real" panel specification rather than just the stated 10-bit/color, 1.07 billion colors. I found two search pages at TFT Central for panel model lookup by monitor model with specifications. Unfortunately the Panel Search pulls up incomplete panel part numbers such as LM270WQ3 for the NEC PA272W and the even less precise 27"WS LG.Display AH-IPS for the NEC PA271Q.
Yeah, this panel is quite frustrating to find info for... I suspect this new one might actually be a 10-bit panel. The "True 10 Bit Color" phrase in the press release leads me to believe that.
I'd love to know as I currently have a PA272W and it might be time to replace mine soon. It's starting to become less and less uniform. It lived a good life though.
Yes, happy anniversary, dear feature request. Ten years, that calls for a big fat cake, at least as big as the middle finger Adobe has been giving us all this time (not that I am surprised).
But on the other side, during those 10 years, has really anything changed?
AMD and NVIDIA are still unable to make up their minds about whether they should fully unlock 10-bit output on consumer cards (GTX/RTX/Radeon/Vega). They came up with half-assed solutions such as "OK, maybe we enable 10-bit for DirectX" (looking at you, NVIDIA), or "let's add this new checkbox that doesn't actually do anything, but you wouldn't really notice anyway" (AMD).
10-bit HDR movies and displays should be the driving force, but card manufacturers are still unwilling to give up their Quadro/RadeonPro exclusivity and are just making the whole situation even more confusing to all but expert users. I don't believe developers are happy about this either. Photoshop support of the 10-bit workflow also did not improve significantly, if at all.
I am not whining, I am just stating the facts as they are.
Again, happy 10-year anniversary everyone! Live long and prosper, so that one day...
GeForce cards have had 10-bit/channel output (also for OpenGL) for at least a year or so now. You don't need a Quadro for that anymore. Even my old GTX 970 can now output 10-bit/channel in Photoshop, which I can easily see and test. It's night and day compared to 8-bit/channel. My new RTX 3060 can do it as well.
But I support the request for 10-bit/channel output for LR and ACR, although the display dithering does a remarkable job.
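If anyone wants to see and test it on their own setup, a synthetic gradient makes the difference obvious. This is the kind of quick script I use to generate a 16-bit ramp to open in Photoshop (assumes Python with numpy and the tifffile package; any 16-bit-capable format would do): with the 30-bit display path active the ramp should look smooth, while in an 8-bit path you can usually count the banding steps.

```python
import numpy as np
import tifffile                                # pip install tifffile

width, height = 4096, 512
ramp = np.linspace(0, 65535, width).astype(np.uint16)   # one smooth 16-bit grayscale row
image = np.tile(ramp, (height, 1))                       # repeat it down the image

tifffile.imwrite("ramp16.tif", image)          # 16-bit grayscale TIFF to open in Photoshop
```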
What do you mean by "Photoshop support of the 10-bit workflow also did not improve significantly, if at all"? What should be improved?
Thanks for the correction. Indeed, it seems like 10-bit has been unlocked in the consumer drivers; that is great to see. We need at least one major raw image editor to take the leap, and I am sure the rest will follow soon. It would not surprise me all that much if the support is already there, just "hidden", waiting for wider adoption of 10-bit workflows.