Chetroni
Participant
October 22, 2016
Question

nVidia graphic card for 10 bit/color in Photoshop

  • 4 replies
  • 34376 views

Hello,

Please tell me what GPU/graphics card I need to use 10-bit/color in Photoshop?

I own a 10-bit/color photography display and I want to use it to its full potential.

I don't do other professional work (CAD, 3D, etc.), only photography in Photoshop.

Is there any GeForce card that can help me, or only Quadro cards?

I will appreciate your quick reply.

Best regards,

Daniel.

    This topic has been closed for replies.

    4 replies

    Participating Frequently
    July 29, 2019

    NVIDIA Studio Driver

    Finally: Adds support for 30-bit color on GeForce and TITAN GPUs

    D Fosse
    Community Expert
    August 12, 2017

    Just a small update here. I built a new Win 10 Pro machine, and migrated the Quadro K420 from the old system. This card is low-end as Quadros go and not particularly expensive, about the same as a GTX1050. It only has one DP and one DVI port, though.

    As far as I can tell from about a week's use, 30-bit display now works reliably and consistently in Photoshop. Using the gradient tool I can immediately see the difference between 8 and 16 bit files on the Eizo. You still need to be at 66.67% zoom or above, but other than that all I needed to do was tick the 30-bit box in PS preferences.

    On the old Win 7 system it was very erratic. It sometimes worked, but mostly not. I did download a fresh driver this time, 385.08, and installed only the base driver leaving everything else unchecked (in case that matters).

    Participant
    August 13, 2017

    Excellent point, D Fosse. You need not break the bank for a 30-bit GPU or monitor these days. You're also correct about Win 7 being a constant challenge with NVIDIA drivers.

    "Using the gradient tool I can immediately see the difference between 8 and 16 bit files on the Eizo."

    Amazing how obvious it becomes, and you will see a difference in the work.

    That annoying 66.7% cutoff is a driver issue, and it irks me to no end. I went with what people call a "low to mid GPU for Adobe" and can go down to 16% zoom with the Quadro K4200. Your K420 should be able to do the same at 10 bit, but you might get a bunch of technobabble thrown at you, which, upon translation, becomes "upgrade your GPU!"

    Good luck,

    Dean

    davescm
    Community Expert
    August 13, 2017

    Hi Dean

    I doubt that any driver would overcome the 66.7% zoom issue in Photoshop. It is caused by Adobe using 8-bit previews for blending at zoom levels below that.

    Probably a very sensible decision made years ago to speed up screen redraw, but it does cause all kinds of issues even when viewed on 8-bit monitors, as the blending between layers is changed.

    This came out of a discussion with some very knowledgeable folk who used to post here. There is probably a case for doing it differently now, and at least one feature request was submitted (wish I could find it), but I dread to think what kind of rewrite it would involve, as blending previews really is at the core of Photoshop.

    Dave

    craigjohn
    Participating Frequently
    May 26, 2017

    According to this article, Nvidia GeForce cards now support 10-bit color.

    Nvidia Confirms GTX 1070 Specs -1920 CUDA Cores & 1.6Ghz Boost Clock At 150W

    May 17, 2016

    The cards also finally, a first for any GeForce products, support 10-bit per color channel. Something that the Radeon counterparts had for years. So if you have a color rich 10-bit per channel monitor you’ll finally be able to enjoy it on GeForce. Blacks will be absolutely black, reds greens and blues incredibly more accurate. This includes support for upcoming High Dynamic Range monitors as well.


    So I think you'd be safe with the GeForce 1070 and later versions.

    D Fosse
    Community Expert
    May 26, 2017
    Something that the Radeon counterparts had for years. So if you have a color rich 10-bit per channel monitor you’ll finally be able to enjoy it on GeForce. Blacks will be absolutely black, reds greens and blues incredibly more accurate. This includes support for upcoming High Dynamic Range monitors as well.

    Er...this is obviously written by someone who has no clue whatsoever what 10-bit support is in the first place.

    "Blacks will be absolutely black, reds greens and blues incredibly more accurate." Huh? Say again? That has absolutely nothing to do with bit depth - none, zero. All you ever get with a 10-bit display pipeline is smooth, stepless gradients. That's all there is to it.
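    [Editor's note: to put numbers on the "smooth, stepless gradients" point, here is a small NumPy sketch, an illustration of my own rather than anything posted in this thread. It quantizes one gray ramp spanning a hypothetical 2560-pixel-wide screen at 8 and at 10 bits per channel and counts the distinct levels available:]

    ```python
    import numpy as np

    # A linear gray ramp across a 2560-pixel-wide screen.
    ramp = np.linspace(0.0, 1.0, 2560)

    # Quantize to 8-bit (256 levels) and 10-bit (1024 levels) per channel.
    levels_8 = np.round(ramp * 255) / 255
    levels_10 = np.round(ramp * 1023) / 1023

    # Distinct values actually displayable in each case.
    print(len(np.unique(levels_8)))   # 256 -> each flat step is ~10 px wide: visible banding
    print(len(np.unique(levels_10)))  # 1024 -> steps ~2.5 px wide: effectively smooth
    ```

    [Nothing about black level or primary accuracy changes; only the step count does.]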

    And no - Radeons have not had 10 bit support for years. They don't support it now, and never have. Only the FirePro line ever supported 10 bit depth.

    And what an HDR monitor is, is anybody's guess. It would certainly be of no use in proofing for print, where today's monitors already have far too much dynamic range, so that you have to dial it down considerably to be of any practical use.

    In short - this isn't very credible.

    D Fosse
    Community Expert
    May 26, 2017

    Going by the specs released by Nvidia, there's no mention of 10-bit support there, anywhere. They probably confused it with something else with the number "10" in it.

    I got a bit curious about what an HDR monitor would be. The specs promised include a brightness level of 1000 cd/m²! Have your goggles ready, gentlemen. That's suntan-level radiation. I wouldn't want to be within a fifty-foot radius of such a monster....

    And a matching contrast ratio of 10 000 : 1. Yes, that's ten thousand to one. Since a premium-quality inkjet print has a contrast ratio of less than 300 : 1, this is bound to improve your work no end, I'm sure.
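    [Editor's note: for scale, those contrast ratios can be translated into photographic stops, i.e. doublings of luminance. A quick sketch of my own, not part of the original post:]

    ```python
    import math

    # Contrast ratio -> dynamic range in stops (log base 2 of the ratio).
    hdr_monitor_stops = math.log2(10_000)  # the promised 10 000:1 HDR display
    inkjet_print_stops = math.log2(300)    # a ~300:1 premium inkjet print

    print(round(hdr_monitor_stops, 1))   # -> 13.3 stops
    print(round(inkjet_print_stops, 1))  # -> 8.2 stops
    ```

    [Roughly five stops more range on screen than any print can reproduce, which is the mismatch D Fosse is pointing at.]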

    D Fosse
    Community Expert
    October 22, 2016

    It has to be a Quadro. I just got a Quadro K420, which btw isn't particularly expensive. The first day, 10-bit display worked right out of the box; I didn't have to do anything but flip the 30-bit switch in PS preferences.

    The next day it would only work at 66.67% zoom and up.

    The third day it wouldn't work at all, and hasn't since.

    Not that it matters much. With a good monitor you can't tell the difference, unless you pull up very special test gradients. With an actual photograph there's always just enough noise so that it's impossible to discern any banding, even in the smoothest gradients.
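    [Editor's note: that noise-masking effect is easy to demonstrate numerically. A small NumPy sketch of my own, not from the thread, showing that a little noise added before 8-bit quantization breaks up the flat runs that read as banding:]

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    ramp = np.linspace(0.0, 1.0, 2560)  # one smooth gray ramp across the screen

    # Plain 8-bit quantization: ~10-pixel-wide flat steps -> visible banding.
    plain = np.round(ramp * 255)

    # The same ramp with a little noise added before quantizing (what film grain
    # or sensor noise does for free): the flat steps break up into a fine mix of
    # neighbouring levels, and the bands disappear.
    dithered = np.clip(np.round(ramp * 255 + rng.uniform(-0.5, 0.5, ramp.size)), 0, 255)

    def transitions(a):
        """Number of pixel-to-pixel value changes along the ramp."""
        return int(np.count_nonzero(np.diff(a)))

    print(transitions(plain))     # 255 -> long flat runs between steps
    print(transitions(dithered))  # far more changes -> no flat run wide enough to see
    ```

    [This is the same mechanism D Fosse describes: photographic noise acts as free dithering.]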

    Chetroni
    Author
    Participant
    October 22, 2016

    And besides the 8/10-bit difference between GeForce and Quadro, are there other differences in Photoshop between those two?

    Which one will help the CPU render zoom, blur, etc.?

    Any idea?

    Trevor.Dennis
    Community Expert
    October 22, 2016

    Chetroni wrote:

    And besides the 8/10-bit difference between GeForce and Quadro, are there other differences in Photoshop between those two?

    Which one will help the CPU render zoom, blur, etc.?

    Any idea?

    Puget Systems is my go-to site for that sort of information. They don't have any recent articles pertaining directly to Photoshop GPU performance, but there are some interesting reads there.

    This is the most recent article I found that is fully relevant (September 2012):

    https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CS6-GPU-Acceleration-161/

    Photoshop memory optimization

    https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CS6-Memory-Optimization-182/

    Premiere Pro GPU acceleration

    https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro-CC-Professional-GPU-Acceleration-502/

    GTX1080 and GTX1070 with Prem Pro compared to earlier generation GPUs

    https://www.pugetsystems.com/labs/articles/GTX-1070-and-GTX-1080-Premiere-Pro-Performance-810/

    And a pair of articles looking at the Quadro and GTX1080 with 3DsMax

    https://www.pugetsystems.com/labs/articles/AutoDesk-3ds-Max-2017-Quadro-GPU-Performance-822/

    https://www.pugetsystems.com/labs/articles/AutoDesk-3ds-Max-2017-GeForce-GPU-Performance-816/

    Happy reading. :-)