Known Participant
September 11, 2016
Question

Does the GTX 1060 support 10-bit output in Photoshop?

  • 2 replies
  • 9074 views

Hi!

I've finished building my new PC with an i7 6700K and a 30-bit monitor. The only thing left to buy is the video card.

My question is: do the new GTX 1060 video cards support 10-bit output in Photoshop?

I know that Photoshop doesn't make intensive use of the GPU, so I don't want to spend a lot of money on a Quadro-series card.

On the NVIDIA site, they claim that all new GTX 10x0 cards support 10-bit mode.

Has anybody tried it?

This topic has been closed for replies.

2 replies

Participant
September 24, 2016

Curious as well... all the new 10xx cards supposedly support 10-bit. I have a prosumer monitor that supports dithered 10-bit (ASUS IPS monitor PB279Q)...

RoninEdits
Inspiring
September 11, 2016

I think the GTX 1000 series still limits 10-bit output to DirectX, so it may not work for 10-bit Photoshop or other Adobe software using OpenCL. Maybe someone who has the hardware to confirm/test will also comment.

Bill Gehrke
Inspiring
September 11, 2016

That question should probably be asked in the Photoshop forum. How would one test to see if it really works?

Known Participant
September 12, 2016

The Photoshop forum would probably have more folks who could answer. How to test: with the hardware (GPU + true 10-bit monitor), one way would be a basic grayscale gradient. 8-bit would show visible banding; 10-bit would show none.
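Not from the thread, but a minimal sketch of how such a test gradient could be generated (assuming Python with NumPy; the function name and dimensions are illustrative). A wide ramp stored in 16-bit values has far more steps than an 8-bit pipeline can show, which is what makes the banding visible:

```python
import numpy as np

def make_gradient(width=4096, height=256):
    # Horizontal grayscale ramp stored as 16-bit values.
    # A 4096-pixel-wide ramp has 4096 distinct levels; an 8-bit
    # pipeline can only show 256 of them (visible banding), while
    # a 10-bit pipeline can show 1024 (much smoother).
    ramp = np.linspace(0, 65535, width).astype(np.uint16)
    return np.tile(ramp, (height, 1))

gradient = make_gradient()
print(len(np.unique(gradient >> 8)))  # levels surviving an 8-bit pipeline: 256
print(len(np.unique(gradient >> 6)))  # levels surviving a 10-bit pipeline: 1024
```

Saving this as a 16-bit TIFF or PNG and opening it in Photoshop with 30-bit display enabled (or not) makes the difference easy to see by eye.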


I will try there. I have two choices: buy the new GTX 1060 for 200 euros, or the old Quadro K620 for 100 euros (second hand).

I won't use this workstation for gaming; I built it just to use Photoshop and Capture One.

At the moment I can't spend more.

I was wondering because the "slow" MacBook 12" 2016 edition that I own supports 10-bit mode in Photoshop with its Intel 515.

Of course its display isn't a true 10-bit monitor, but the difference compared to a MacBook Pro 15" (with an NVIDIA GT 650M) is really big and unexpected!