
NVIDIA graphics card for 10-bit color in Photoshop

New Here, Oct 22, 2016

Hello,

Please tell me what GPU/graphics card I need to use 10-bit color in Photoshop.

I own a 10-bit photography display and I want to use it to its full potential.

I don't do other professional work (CAD, 3D, etc.), only photography in Photoshop.

Is there any GeForce card that can help me, or only Quadro cards?

I would appreciate a quick reply.

Best regards,

Daniel.

Community Expert, Oct 22, 2016

It has to be a Quadro. I just got a Quadro K420, which, by the way, isn't particularly expensive. On the first day, 10-bit display worked right out of the box; I didn't have to do anything but flip the 30-bit switch in Photoshop's preferences.

The next day it would only work at 66.67% zoom and up.

The third day it wouldn't work at all, and hasn't since.

Not that it matters much. With a good monitor you can't tell the difference, unless you pull up very special test gradients. With an actual photograph there's always just enough noise so that it's impossible to discern any banding, even in the smoothest gradients.
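To see why noise hides banding, here is a minimal sketch (assuming NumPy is available; the figures are illustrative, not measurements of Photoshop itself): quantizing a shallow, smooth ramp to 8 bits produces long flat bands, while adding roughly half a level of noise before quantization breaks them up into grain.

```python
import numpy as np

width = 4096
ramp = np.linspace(0.2, 0.3, width)   # a shallow, smooth gradient - the worst case for banding

# Plain 8-bit quantization: long runs of identical values show up as visible bands
q_plain = np.round(ramp * 255).astype(np.uint8)

# The same ramp with ~half a level of noise, similar to the grain in a real photograph
rng = np.random.default_rng(0)
q_noisy = np.round((ramp + rng.normal(0, 0.5 / 255, width)) * 255).astype(np.uint8)

def longest_flat_run(values):
    """Length of the longest run of identical output values (a proxy for band width)."""
    longest = current = 1
    for a, b in zip(values[:-1], values[1:]):
        current = current + 1 if a == b else 1
        longest = max(longest, current)
    return longest

print("longest band, plain 8-bit quantization:", longest_flat_run(q_plain), "px")
print("longest band, 8-bit with photographic noise:", longest_flat_run(q_noisy), "px")
```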

New Here, Oct 22, 2016

And besides the 8-bit / 10-bit difference between GeForce and Quadro, are there other differences between those two in Photoshop?

Which one will better help the CPU render zoom, blur, etc.?

Any idea?

Community Expert, Oct 22, 2016

Chetroni wrote:

And besides the 8-bit / 10-bit difference between GeForce and Quadro, are there other differences between those two in Photoshop?

Which one will better help the CPU render zoom, blur, etc.?

Any idea?

Puget Systems is my go-to site for that sort of information. They don't have any recent articles pertaining directly to Photoshop GPU performance, but there are some interesting reads there.

This is the most recent article I found that is fully relevant (September 2012):

https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CS6-GPU-Acceleration-161/

Photoshop memory optimization

https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CS6-Memory-Optimization-182/

Premiere Pro GPU acceleration

https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro-CC-Professional-GPU-Acceleration-502/

GTX 1080 and GTX 1070 with Premiere Pro compared to earlier-generation GPUs

https://www.pugetsystems.com/labs/articles/GTX-1070-and-GTX-1080-Premiere-Pro-Performance-810/

And a pair of articles looking at the Quadro and GTX 1080 with 3ds Max

https://www.pugetsystems.com/labs/articles/AutoDesk-3ds-Max-2017-Quadro-GPU-Performance-822/

https://www.pugetsystems.com/labs/articles/AutoDesk-3ds-Max-2017-GeForce-GPU-Performance-816/

Happy reading. 🙂

New Here, May 25, 2017

According to this article, NVIDIA GeForce cards now support 10-bit color.

Nvidia Confirms GTX 1070 Specs -1920 CUDA Cores & 1.6Ghz Boost Clock At 150W

May 17, 2016

The cards also finally, a first for any GeForce products, support 10-bit per color channel. Something that the Radeon counterparts had for years. So if you have a color rich 10-bit per channel monitor you’ll finally be able to enjoy it on GeForce. Blacks will be absolutely black, reds greens and blues incredibly more accurate. This includes support for upcoming High Dynamic Range monitors as well.


So I think you'd be safe with the GeForce 1070 and later versions.

Community Expert, May 26, 2017

Something that the Radeon counterparts had for years. So if you have a color rich 10-bit per channel monitor you’ll finally be able to enjoy it on GeForce. Blacks will be absolutely black, reds greens and blues incredibly more accurate. This includes support for upcoming High Dynamic Range monitors as well.

Er...this is obviously written by someone who has no clue whatsoever what 10-bit support is in the first place.

"Blacks will be absolutely black, reds greens and blues incredibly more accurate" Huh? Say again? That has absolutely nothing to do with bit depth - none, zero. All you ever get with a 10 bit display pipeline, is smooth, stepless gradients. That's all there is to it.

And no - Radeons have not had 10 bit support for years. They don't support it now, and never have. Only the FirePro line ever supported 10 bit depth.

And what an HDR monitor is, is anybody's guess. It would certainly be of no use in proofing for print, where today's monitors already have far too much dynamic range, so that you have to dial it down considerably to be of any practical use.

In short - this isn't very credible.

Community Expert, May 26, 2017

Going by the specs released by NVIDIA, there's no mention of 10-bit support anywhere. They probably confused it with something else with the number "10" in it.

I got a bit curious about what an HDR monitor would be. The specs promised include a brightness level of 1000 cd/m² !!! Have your goggles ready, gentlemen. That's suntan-level radiation. I wouldn't want to be within a fifty-foot radius of such a monster....

And a matching contrast ratio of 10 000 : 1. Yes, that's ten thousand to one. Since a premium-quality inkjet print has a contrast ratio of less than 300 : 1, this is bound to improve your work no end, I'm sure.
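Put in photographic stops (a rough, illustrative calculation using the figures quoted above), those two contrast ratios are further apart than the raw numbers suggest:

```python
from math import log2

print(f"10000:1 display contrast = {log2(10000):.1f} stops")   # ~13.3 stops
print(f"  300:1 inkjet print     = {log2(300):.1f} stops")     # ~8.2 stops
```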

New Here, May 26, 2017

....that's disappointing. LOL

A friend of mine uses an ASUS GeForce 1070 for his photography work - you wouldn't know it has such color limitations.

Community Expert, May 26, 2017

This is all aimed squarely at the gaming crowd. Heck, they might even enjoy an HDR monitor.

For photography and general imaging work, it's like charging your cell phone with the Large Hadron Collider at CERN.

As for 10 bits per channel display - it's nice, as in "look at these smooth gradients, that's way cool", but it's not essential in any way. It doesn't improve your work. In almost all possible real-life scenarios you can't tell the difference. If you can, it's a sign you should stop looking at test ramps and try to get some real work done.

My own Quadro/Eizo combo delivers 10 bit display very erratically. Sometimes it works, sometimes it doesn't. But I have to look for it, and most of the time I don't even think about it. There are more important things to be concerned about, like getting a good screen to print match. And that's totally irrespective of bit depth.

New Here, May 31, 2017

D Fosse,

Thanks for the info! I was about to "upgrade" from an 8-bit GTX 750 Ti (one fan) to a Quadro K620 or a GTX 1060 to get 10-bit, since I just got a 100% sRGB 10-bit monitor. But after reading the comments here and seeing your knowledgeable background, I'll probably just stick with the 750 Ti.

What monitor do you use for photography? I had the LG IPS 32" Amazon.com: LG 32MA68HY-P 32-Inch IPS Monitor with Display Port and HDMI Inputs: Computers & Accesso...

It was nice, but I wanted higher resolution with better blacks/contrast, so I went to this Philips 32 inch Amazon.com: Philips BDM3270QP2 32 inch class LED-Lit monitor, 2560x1440 res, 4ms, 50M:1DCR, VGA, DVI...

It's an AMVA screen. Almost no light bleed, but does have some glare so I end up having to turn lights down which isn't a bad idea anyway.

But now I'm considering this 4K AOC 32 inch Amazon.com: AOC U3277PWQU 32-Inch Class VA LED Monitor, 4K (3840x2160), 4ms,VGA/DVI/HDMI-MHL/DP/USB ...

It dropped in price just this week. But I'm not sure my 750 Ti is good enough, and the resolution might be too much for LR even with the text size bumped up in LR. But I'm really trying to find a good monitor to match prints. I've tried half a dozen print processors using their ICC profiles, sRGB, etc., but none are that close. Maybe 80% close. I did calibrate using a Spyder 5 (X-Rite wouldn't work) and DisplayCAL.

Thanks!

Michael

New Here, Jul 26, 2017

Chetroni,

The only person who came close to answering your question was D. Fosse, except I disagree with one of his comments regarding that link to the GTX 1070:

"Er...this is obviously written by someone who has no clue whatsoever what 10-bit support is in the first place."

That's not enough to describe what was said there, D. Fosse. I would reword your comment and make it shorter, something like:

"Er...that article was written by an idiot."

Also, Michael, you selected a nice monitor, but I don't think you listened to D. Fosse that closely. I use three workstations: one for Adobe apps, the second in my office, and the third in my home office. Except for the Adobe WS (full specs in my profile), the other two are relatively new builds, moving over to LGA 2011-v3 from LGA 2011 (I had to make that move for Adobe almost two years ago, but at least I was able to use the same motherboard).

My Adobe WS uses a Quadro K4200, which is a true 10-bit GPU; it delivers 4K with 1.07 billion colors and no dithering on a 10-bit monitor. Below are the specs of the GPU in my other two WSs, which I use mostly for ERP, managing parts of our VMware business, and maybe once a week firing up a game (my home WS has two of the below in SLI).

ASUS STRIX GeForce GTX 1080 Ti OC:

CUDA Cores: 3584

Gaming Mode

Base Clock: 1569 MHz

Boost Clock: 1683 MHz

OC Mode

Base Clock: 1594 MHz

Boost Clock: 1708 MHz

Memory Speed: 11 Gbps

OC Mode Speed: 11.1 Gbps

Configuration: 11 GB

Interface: GDDR5X

Interface Width: 352-bit

Bandwidth: 484 GB/s

Microsoft DirectX: 12

OpenGL: 4.5

Vulkan: Yes

If you think you're going to get 10-bit by going into the NVIDIA Control Panel on your 750 Ti and changing 8-bit to 30-bit, and that this will deliver what I think you're looking for, you may want to seriously consider a Quadro. Otherwise, you're getting your 10-bit through Windows APIs - hello dithering, and worse.

Here is the rub:

Applications like Adobe Premiere Pro and Photoshop use OpenGL 10-bit-per-color buffers (true 30-bit), and that mandates an NVIDIA Quadro GPU with a DisplayPort connection (I left out AMD's FirePro, since I don't like AMD and besides, we're an Intel reseller 😉).
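For readers curious what "OpenGL 10-bit per color buffers" looks like from an application's side, here is a minimal, hedged sketch (not Adobe's actual code) using the pyGLFW and PyOpenGL packages: the program asks for 10 bits per channel and then queries what the driver actually granted. On consumer GeForce/Radeon drivers of this era the request typically falls back to 8 bits; on Quadro/FirePro with 30-bit enabled it can be honored.

```python
import glfw
from OpenGL import GL

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

# Ask for a 10-bit-per-channel ("30-bit") framebuffer
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "30-bit test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Could not create a window with the requested bit depths")

glfw.make_context_current(window)

# Query what the driver actually gave us (legacy query; works in a compatibility context)
print("red bits granted:", GL.glGetIntegerv(GL.GL_RED_BITS))

glfw.terminate()
```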

The GTX line of cards is great for people who play games all day, and I have found them to be a cheap replacement for more expensive cards working with mission-critical business applications, assuming you build a WS with a CPU that can also easily overclock and remain stable, and the storage controller array uses the new mini (SFF) SAS SSDs, which run at 12 Gb/s (especially when using a WS to push out VDIs to clients, analytics, and financial modeling).

But when you wish to create high-quality content, that is not the sweet spot for the GTX line. Frankly, some of the Quadros are cheaper; you simply need to know exactly what you're going to produce and work with. Good luck.

New Here, Aug 17, 2017

Oof, "isn't very credible" - careful not to fall carrying that sword.

Radeons do support 10-bit, and have for a while. It turns out you're the one who's not sure what they're talking about. Sorry, man.

So will AMD enable 10 bit color output for Radeon cards, since HDR requires it? : Amd

^Here's a quick link to a thread where it's mentioned, to get you started - you certainly make all the sounds of a very, very smart person, so I'm sure you can Google up the rest of the info yourself.


You'll probably first cry "but why FirePros then??"
It's complicated. But yes, they both support 10-bit. This has been checked. Just not by you.

Community Expert, Aug 17, 2017

You can save the sarcasm. There is no OpenGL 10-bit support for Radeon, only FirePro.

I did learn recently (see above) that there is DirectX 10-bit support in NVIDIA GeForce, and I assume the same in AMD Radeon. But DirectX is not used in any major graphics/CAD application. These applications use OpenGL.

So for Photoshop, it has to be a FirePro. Radeon won't do it.

New Here, Nov 17, 2017

Actually, Radeon cards have supported 10-bit for a while now, and not just in DirectX. I'm running an R9 390 in my current rig and feeding 10-bit signals to both my projector and my PV270. I can verify this by opening the AMD Crimson control center and confirming that '10-bit' is selected from the options of 8-bit, 10-bit, and 12-bit.

You can verify my graphics card and the 10-bit color depth selection in the attached screenshot (Capture666.PNG).

NOW, the issue here is that you need a FirePro or Quadro card for programs like Photoshop or Premiere to display the 10 bits - because these programs use 10-bit OpenGL buffering - but that is beside the point: standard Radeon cards DO support 10-bit color (and even 12-bit) natively. GeForce cards do not (well, they do in DirectX, but that is a whole other ball game).

Community Expert, Nov 17, 2017

I think we just need to be very clear about this, and not mislead anyone. If you want 10 bit display in Photoshop, you need a FirePro or a Quadro. That's the long and short of it.

The finer technical points aren't really important - because this isn't about technical limitations, these are policy decisions and market segmentation. All these cards could probably do it if AMD and NVidia wanted them to, but they don't.

New Here, Nov 17, 2017

Right, and I agree - except that your comments make it appear as if non-FirePro and non-Quadro cards do not do 10-bit at all, which they do. That is what I want to clear up. Whether 10-bit can be used in a given program depends on that program and its 10-bit implementation. Adobe has chosen to use 10-bit OpenGL buffering, which is only a feature of the pro-line drivers.

But yes, the Radeon line of cards can do standard 10-bit in OpenGL/OpenCL. But as you say, that alone is not enough for Adobe products.

New Here, Nov 17, 2017

There is also another way to get native 10-bit out of Photoshop and other Adobe applications - the Blackmagic Mini Monitor card. Coming in at less than $130 new, this card will work with Photoshop to deliver 10-bit, at least according to Blackmagic Design's website. It's of course not a true GPU/graphics card, and I'm unsure how it works with Photoshop since I only use my BM card with Premiere and AE, but it should work.

Community Expert, Aug 12, 2017

Just a small update here. I built a new Win 10 Pro machine and migrated the Quadro K420 from the old system. This card is low-end as Quadros go and not particularly expensive, about the same as a GTX 1050. It only has one DP and one DVI port, though.

As far as I can tell from about a week's use, 30-bit display now works reliably and consistently in Photoshop. Using the gradient tool I can immediately see the difference between 8 and 16 bit files on the Eizo. You still need to be at 66.67% zoom or above, but other than that all I needed to do was tick the 30-bit box in PS preferences.

On the old Win 7 system it was very erratic. It sometimes worked, but mostly not. I did download a fresh driver this time, 385.08, and installed only the base driver leaving everything else unchecked (in case that matters).
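If you want a ready-made file to test this with (beyond drawing gradients by hand), here is a small, hedged sketch - it assumes NumPy and the tifffile package, and the dimensions and values are arbitrary - that writes a shallow 16-bit grayscale ramp. On a working 30-bit pipeline (at 66.67% zoom or above) it should look smooth; on an 8-bit pipeline the banding is obvious.

```python
import numpy as np
import tifffile

width, height = 2048, 512
row = np.linspace(0.45, 0.55, width)                        # shallow ramp around mid-grey
ramp16 = np.tile(np.round(row * 65535).astype(np.uint16), (height, 1))
tifffile.imwrite("ramp_16bit.tif", ramp16)                  # open this file in Photoshop
```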

New Here, Aug 12, 2017

Excellent point, D Fosse. You need not break the bank for a 30-bit GPU or monitor these days. You're also correct about Win 7 being a constant challenge with NVIDIA drivers.

"Using the gradient tool I can immediately see the difference between 8 and 16 bit files on the Eizo."

Amazing how obvious it becomes, and you will see a difference in the work.

That annoying 66.7% limit is a driver issue, and it really irks me to no end. I went with what people call the "low to mid GPU for Adobe" and can go down to 16% zoom with the Quadro K4200. Your K420 should be able to do the same for zoom at 10-bit, but you might get a bunch of technobabble thrown at you, which, upon translation, becomes "upgrade your GPU!"

Good luck,

Dean

Community Expert, Aug 13, 2017

Hi Dean

I doubt that any driver would overcome the 66.7% zoom issue in Photoshop. It is caused by Adobe using 8-bit previews to blend at zoom levels below that.

It was probably a very sensible decision made years ago to speed up screen redraw, but it does cause all kinds of issues even when viewed on 8-bit monitors, as the blending between layers is changed.
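To illustrate the kind of difference 8-bit previews can make, here is a hedged, self-contained sketch (assuming NumPy; this is not Photoshop's actual compositing code): quantizing each layer to 8 bits before blending does not give the same result as blending at full precision and quantizing once, and those stray one-level errors are what shift band edges and subtly change blends at low zoom.

```python
import numpy as np

# Two smooth layers to be composited with the Multiply blend mode
bottom = np.linspace(0.0, 1.0, 4096)
top = np.linspace(0.2, 0.9, 4096)

# Full-precision blend, quantized once at the end
full = np.round(bottom * top * 255)

# "Preview" path: quantize each layer to 8 bits first, then blend
b8 = np.round(bottom * 255) / 255
t8 = np.round(top * 255) / 255
preview = np.round(b8 * t8 * 255)

diff = np.abs(full - preview)
print("pixels that differ:", int(np.count_nonzero(diff)), "of", full.size,
      "| max difference:", int(diff.max()), "level(s)")
```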

This came out of a discussion with some very knowledgeable folk who used to post here. There is probably a case for doing it differently now, and at least one feature request was submitted (I wish I could find it), but I dread to think what kind of rewrite it would involve, as blending previews really is at the core of Photoshop.

Dave

Community Expert, Aug 13, 2017

Dave, is that related to the same blend mode (Mac and OpenGL) issues that make it impracticable to improve work path visibility? Scroll down to Chris Cox's comment on the Feedback site link below.

path visibility

Photoshop: Please let us change the Pen tool path color | Photoshop Family Customer Community

Community Expert, Aug 13, 2017

I'm not sure whether those are directly related, Trevor. My most recent discussion on this was with Noel - who, as you know, develops plug-in software - and he explained why we see some of the effects that we do in previews.

Dave

Community Beginner, Jul 29, 2019

NVIDIA Drivers: NVIDIA Studio Driver

Finally: Adds support for 30-bit color on GeForce and TITAN GPUs
