richardj21724418
Known Participant
January 10, 2017
Question

30-bit GPU for Photoshop CC

  • 2 replies
  • 8062 views

I realise similar questions have been asked in the past but I'm asking again in case the situation has changed.

I'm specifying a new Windows 10 PC, primarily for photo editing with PS CC, and am considering the GTX 1050 Ti GPU. My main monitor supports 10 bits per component (30-bit colour) via DisplayPort, and I'd like to be able to use this colour depth from PS. Is this possible with the GTX 1050 Ti? I note an NVIDIA Support answer (ID 3011, from 2011) which states that Photoshop renders through OpenGL, and that OpenGL 10-bit-per-colour buffers require a Quadro workstation GPU. Is this still the case, or do GeForce GTX drivers now support OpenGL 10-bit buffers?

Similarly, if I were to choose AMD, would I need a FirePro workstation GPU, or could I now get away with a Radeon card? (I note the AMD Catalyst Control Center has provision to set 10-bit HDMI output even on my ancient HD 5570, but that doesn't mean it's supported in OpenGL.)

If the answer is still that I'll need a 'workstation'-grade GPU, would the Quadro K1200 be a sensible choice? I'm planning on running two screens, the main one at up to 4k resolution and the other at 1920x1200.
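For a sense of scale, here's a back-of-the-envelope sketch in Python (the 3840 px width is my assumption for the 4k main screen mentioned above): when a full black-to-white gradient spans the screen, each flat tonal band at 8bpc is wide enough to be visible, while at 10bpc the bands shrink to a quarter of that width.

```python
def band_width_px(screen_px: int, bits_per_channel: int) -> float:
    """Width in pixels of each flat tonal band when a full-range
    black-to-white gradient spans screen_px pixels."""
    steps = 2 ** bits_per_channel  # 256 levels at 8bpc, 1024 at 10bpc
    return screen_px / steps

print(band_width_px(3840, 8))   # 15.0 px wide bands at 8bpc
print(band_width_px(3840, 10))  # 3.75 px wide bands at 10bpc
```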

    This topic has been closed for replies.


    D Fosse
    Community Expert
    January 13, 2017

    You seem to have researched this thoroughly, so I don't have much to add.

    Yes, the practical divide here is between hardware calibration and video card calibration. In the first case all adjustments are performed in high bit depth internally in the monitor (16 bits for the Eizo), and so output to the panel is perfectly linear, with 256 evenly spaced steps.

    So it's a bit ironic that the monitors that could really benefit from 30-bit color generally don't support it. Those that do don't need it.

    richardj21724418
    Known Participant
    January 18, 2017

    In the end I managed to pick up a Quadro K1200 for much the same price as a GTX 1050 Ti (it was pulled out of a new HP workstation). It now gives me 1024 nice even steps in Photoshop CC, but only at zoom factors of 66.7% and above. With a reasonable monitor you can still see the individual steps quite easily, but they are obviously a lot less noticeable.

    Strangely, Photoshop only appears to render 30-bit colour in 'Standard Screen Mode' and 'Full Screen Mode'; it reverts to 8 bits per component in 'Full Screen With Menu Bar' mode. Another weird quirk.

    My feeling is that 30-bit colour may be helpful occasionally, but it's not really worth spending a lot on. Better to put the money towards a monitor with decent internal calibration, so you don't suffer the banding introduced by the GPU LUTs.

    D Fosse
    Community Expert
    January 10, 2017

    richardj21724418 wrote:

    Photoshop renders through OpenGL and OpenGL 10-bit per colour buffers require a Quadro workstation GPU. Is this still the case

    Yes, that is still the case. For AMD, a FirePro model is required.

    That said, it still may not work. I have a Quadro driving an Eizo through DisplayPort, and 10-bit color worked for the first 24 hours after installing the card, before it just went poof and permanently reverted to 8-bit mode. Haven't updated the driver in a while, though.

    Not that I worry about it. With a high quality monitor you really can't tell the difference in any real-world situation. You need special test gradients to see it.
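    Such a test gradient is easy to generate yourself. A minimal sketch in Python (the filename and the 10% tonal span are my own choices): it writes a shallow 16-bit greyscale ramp as a binary PGM, which quantises to roughly 26 visible bands on an 8bpc pipeline but roughly 102 on a true 10bpc one.

```python
import struct

def write_ramp_pgm(path, width=1024, height=64, lo=0.45, hi=0.55):
    """Write a shallow 16-bit greyscale ramp as a binary PGM (maxval 65535).

    Spanning only 10% of the tonal range makes quantisation obvious:
    ~26 bands survive an 8bpc pipeline vs ~102 on a 10bpc one.
    """
    maxval = 65535
    row = [round((lo + (hi - lo) * x / (width - 1)) * maxval)
           for x in range(width)]
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n%d\n" % (width, height, maxval))
        # 16-bit PGM samples are big-endian per the Netpbm spec
        f.write(struct.pack(">%dH" % width, *row) * height)

write_ramp_pgm("ramp16.pgm")
```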

    richardj21724418
    Known Participant
    January 13, 2017

    D Fosse wrote:

    Yes, that is still the case. For AMD, a FirePro model is required.

    That said, it still may not work. I have a Quadro driving an Eizo through DisplayPort, and 10-bit color worked for the first 24 hours after installing the card, before it just went poof and permanently reverted to 8-bit mode. Haven't updated the driver in a while, though.

    Not that I worry about it. With a high quality monitor you really can't tell the difference in any real-world situation. You need special test gradients to see it.

    Thank you for your reply. I've also heard back from NVIDIA, who say that if I want 10bpc OpenGL support I'll need a Quadro card.

    I think you're right about not being able to notice the difference in most real-world situations. However, on my existing 8bpc setup I can clearly see the steps on grads, and banding is perhaps just visible in skies and skin tones. It may be a little more visible on my current setup, since it's wide gamut. I know 10bpc is not a requirement for wide-gamut displays, but obviously with wide gamut each step spans a somewhat greater tonal distance.

    As you might expect, grads created in Photoshop at 16bpc still look good when I switch to 8bpc (provided I've enabled dithering in the 'Color Settings' conversion options). Also, grads created at 8bpc look less stepped than those created at 16bpc, provided I've enabled dither in the grad options. It shows the effectiveness of dithering, and why processing images at 16bpc is important even if the final output is 8bpc.
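    That dithering observation is easy to reproduce numerically. A minimal sketch in Python, using simple random dither (Photoshop's actual dither algorithm isn't documented, so this is only a stand-in): a shallow 16-bit ramp quantised straight to 8 bits collapses into ten flat bands with hard edges, while the dithered version interleaves adjacent levels so no clean edge remains.

```python
import random

random.seed(0)  # reproducible dither pattern

def quantise_8bit(value16, dither=False):
    """Quantise a 16-bit value (0-65535) to 8 bits, optionally adding
    +/- half a step of random noise before rounding."""
    x = value16 / 65535 * 255
    if dither:
        x += random.uniform(-0.5, 0.5)
    return min(255, max(0, round(x)))

# A shallow 16-bit ramp that spans only ~10 of the 256 8-bit levels.
ramp = [int(30000 + 2500 * i / 999) for i in range(1000)]

plain = [quantise_8bit(v) for v in ramp]
dithered = [quantise_8bit(v, dither=True) for v in ramp]

# Plain quantisation: a handful of flat bands with few transitions.
# Dithered: the same levels, but mixed, so transitions are everywhere
# and no band edge survives.
print(len(set(plain)), sum(a != b for a, b in zip(plain, plain[1:])))
print(len(set(dithered)), sum(a != b for a, b in zip(dithered, dithered[1:])))
```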

    I think in the real world other factors are more important, such as how the monitor is calibrated. If the monitor is hardware calibrated it seems to me you don't need the GPU LUT and, unless you want to use Photoshop's 30-bit display, the GPU only needs to support 8bpc output. In other words, any modern GPU will do, provided it supports the required resolution and a modest frame rate.

    If the monitor calibration is handled by the PC, then the GPU ought to have a LUT output depth of at least 10bpc. I doubt it matters much visually whether this is conveyed to the monitor over a 10bpc link or via 8bpc with dithering; after all, it's much the same as a 10bpc display employing FRC on an 8bpc panel. I currently have ancient AMD HD 5570 cards and suspect their LUT output is only 8bpc, because the banding and colouration I get in monochrome images after calibration is awful. Whatever I end up with can't be worse than this!
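    The LUT-depth point can be put in numbers. A rough sketch in Python (the gamma-1.1 curve is my stand-in for a real calibration correction, and the model is deliberately simplified so that the LUT's output depth equals the link depth): pushing all 256 input levels through the curve at 8 bits merges some of them, which is exactly where the banding comes from, while a 10-bit LUT keeps all 256 distinct.

```python
def distinct_levels_after_lut(lut_bits, gamma=1.1):
    """Count how many distinct output codes survive when all 256
    8-bit input levels pass through a gamma-style calibration curve
    implemented as a LUT with the given output depth."""
    lut_max = 2 ** lut_bits - 1
    return len({round((v / 255) ** gamma * lut_max) for v in range(256)})

print(distinct_levels_after_lut(8))   # fewer than 256: levels merge -> banding
print(distinct_levels_after_lut(10))  # all 256 survive
```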

    Another reason for not getting too worked up about Photoshop's 30-bit display is the way layer processing reverts to 8bpc when images are viewed below 66.7% zoom. I've not looked into why Photoshop does this, but I suspect it may date back to when processors weren't nearly so powerful. It would be nice to have this as a user-defined threshold. Perhaps an advantage of 4k and higher-resolution screens is that we don't have to view below 66.7% so often!