
gpu

New Here, May 07, 2023


I have no idea what GPU to buy. I'm an amateur Photoshop user and am using my integrated graphics for now. I'm looking for something not too expensive.

Topics: Windows

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Community Expert, May 07, 2023


How do you use Photoshop? Do you shoot Raw files and therefore use Camera Raw?

 

The reason I ask is that, not so long ago, I would have advised that almost any reasonably recent GPU would suffice. The minimum system requirements state DirectX 12 support and 1.5 GB of VRAM, and recommend 4 GB of VRAM.

https://helpx.adobe.com/uk/photoshop/system-requirements.html
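Those VRAM figures can be turned into a quick check. A minimal Python sketch, with the 1.5 GB and 4 GB thresholds taken from the requirements page linked above (compare them against the VRAM your card actually has):

```python
# VRAM thresholds from Adobe's Photoshop system requirements page
MIN_VRAM_GB = 1.5   # stated minimum
REC_VRAM_GB = 4.0   # stated recommendation

def vram_status(vram_gb: float) -> str:
    """Classify a card's VRAM against the published thresholds."""
    if vram_gb < MIN_VRAM_GB:
        return "below minimum"
    if vram_gb < REC_VRAM_GB:
        return "meets minimum, below recommended"
    return "meets recommended"

print(vram_status(6.0))  # a 6 GB card such as an RTX 2060 -> meets recommended
```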

 

However, that does not tell the full story. More and more functionality is being moved onto the GPU, which is designed for the parallel processing required by large images, and we are seeing issues with older GPUs raised in these forums. In addition, the speed of the GPU can make a massive difference to some functions. The recently added Noise Reduction in Camera Raw makes good use of GPU functionality, and processing times can range from over an hour on older GPUs to just a few seconds on very recent and powerful ones. I recently carried out a test on a 60 MP image that took 13 seconds to process on an RTX 3090 and much longer on lesser GPUs.

 

Don't get me wrong, I am not advising folk to rush out and buy an RTX 4090 for Photoshop. What I am saying is that how much return you get from a GPU depends on how you use Photoshop.

 

Dave

 

Community Expert, May 08, 2023


I think this is the time to think future-proof. By all appearances, we are now at the threshold of one of those major technology shifts, and the focus of that shift is the GPU.

 

The new Denoise feature in ACR/Lightroom is probably just the introduction; more will come. The thing about Denoise is that it runs exclusively on the GPU, and it relies heavily on new technologies that have been on the market for only the last two or three years.

 

What that translates to is that it doesn't perform well on older GPUs. "Older" here means older than the RTX 30-series and RTX 40-series that are mainstream now; I've seen indications that even the RTX 20-series may fall a bit short. The difference is on the order of 20 seconds versus 2 minutes, or even 20 minutes in some cases.

 

What I can confirm personally is that the RTX 3060 performs very well at a very moderate price, and it continues to perform even better as you move up to the -70, -80, and -90. So you have a range from reasonably priced to eye-wateringly expensive, but any of these will be good.

 

I would not recommend anything from the older GTX series.

Community Expert, May 07, 2023


What does "not too expensive" mean? Pricing also depends on your part of the world and on whether you are willing to buy a used GPU. I believe you can buy a new RTX 2060 for $230-250, and it does a good job for now. To be more secure and to increase lifespan, you can buy a used RTX 3050 for the same money or less. Keep in mind that it is easy to add or remove a GPU on a desktop, while on a laptop I am not sure. You also have to watch some system specs, because it is a waste of money to put a PCIe 4 card in a system that is only PCIe 3 capable if you do not plan to upgrade: the RTX 2060 is PCIe 3, while the RTX 3050 is PCIe 4. Other than that, you may need to check whether your power supply can run the system with an additional GPU, and whether it has the necessary cable to feed the card with extra power beyond the PCIe slot. The slot alone is enough for weak cards like the 1050 or even the 1650, but not above that.

Community Expert, May 08, 2023


I was just in the same situation. I was looking at the RTX 40 series, but it was a bit too pricey. I settled for an RTX 3080 Ti, which is still not cheap, but I've been very happy. On my old computer, I could not do the AI replace, and ACR Denoise's estimated time was 44 minutes; now it's 30 seconds. Just a word of warning about integrated GPUs: the driver updates come from the computer manufacturer and not the video card maker, at least that used to be the case. And some machines have switchable GPUs. I didn't realize my new computer had that; I had to make sure it used the dedicated card, as the integrated one took 2 minutes for ACR Denoise, rather than the 30 seconds for the Nvidia card.

Community Expert, May 08, 2023


I learned the basics about GPUs and other computer components a year ago, just to say that in advance. I cannot estimate, measure, or compare GPUs for Photoshop, and I have no idea about development, but I think we are going too far. The RTX 3080 is a heavy weapon for gamers; it can eat any modern game. I do not believe Photoshop will require that kind of processing power in the near future, especially not for Denoise, which is not that complex a task, in my mind at least. For now, everything works just fine when processing 18 MP raw files from my Canon camera in ACR on an RTX 2060; it estimates 10 seconds to process a single file. There are 48 MP cameras, I know, but that should not make that huge a difference. If someone has a Ryzen 5600G or an Intel 12600K, I am interested to learn whether Denoise works on them, because I am a bit lazy to pull out my GPU and test, since the Nvidia card is disabling my 5600G integrated graphics and I cannot use it (I asked questions on Microsoft forums about the conflict and how to deal with two GPUs). I really think that mentioning the RTX 40 series and numbers like 80 and 90 is overkill and sounds unrealistic even 3-4 years into the future. Maybe it is realistic for 8K video processing and 3D tools, but for Photoshop?

Community Expert, May 08, 2023


Well, what I said was RTX 3060 and up.

 

I have an RTX 3060 and it runs denoise very well. That's not an expensive card at all. But if you want lightning fast, and have the money to spend, an RTX 3090 will do that.

 

But nobody is saying you need these high-end, expensive cards. What I have been saying is that you need a fairly new card, because these technologies are new on the market.

 

Denoise is in fact something entirely new. It's the first feature in the PS/ACR/Lr family that runs entirely in the GPU. It doesn't touch the CPU at all. And there's little reason to think it will stop there.

 

Whether denoise takes 10 or 20 seconds is of little practical consequence. But when you're looking at two minutes, ten minutes, forty minutes - then it's beginning to be significant. And the old cards, GTX and similar, are all basically there.

 

Community Expert, May 08, 2023


Yeah, my old card was a GTX, and it served me well for about 12 years, so I'm hoping I can stretch this new computer out for a long time and make the extra expense pay for itself.

Community Expert, May 08, 2023


The new things, from my perspective, are DDR5, PCIe 5 (it followed PCIe 4 very quickly), and doubled speed for GPUs, so you can say there is a major shift in technology. It will pay off if you buy high end now, but with each new release prices drop heavily: what costs 1000 now will probably cost 500 in two years, and the newest technology needs time to be exploited fully by software and other things like games. For the average Photoshop user on a budget (or not), I believe it is wise to wait to upgrade (if they can), which really means a system upgrade, because if you are about to purchase a heavy weapon like a 4080 or 4090, then you should go with nothing less than DDR5 and a PCIe 5 capable motherboard and components, which are all new. We have mostly mentioned GPUs here because they are in demand and, in most cases, the most expensive component, and for that reason the starting point for many buyers. Will they pay back immediately? For the average user, I guess not; it is simply spending without much gain when it comes to the most expensive component. You will be glad you purchased a 4090/4080 over a 3060 five years from now, I am sure, but the same applies if you purchase that system two or three years from now; that is my point. For anyone looking to upgrade immediately, your advice on GPUs seems very healthy, with attention to the PCIe and DDR system standards. The PCIe standard is backward compatible, while DDR is not, for anyone who is patient enough to read and crazy enough not to check what is written.

Community Expert, May 08, 2023


I just tested the same 60.2 MP raw file on three PCs here with different GPUs.

 

1. A desktop with an RTX 2080 Ti at 1.6 GHz: Denoise took 25 seconds.

2. A laptop with an RTX 3070 at 1.2 GHz: Denoise took 35 seconds.

3. A desktop with an RTX 3090 at 1.8 GHz: Denoise took 13 seconds.

 

All three are capable. The laptop surprised me compared to the older desktop GPU, but perhaps it should not have: the laptop GPU is clocked slower and, to avoid heat issues, will not boost the way desktops do.

 

For single images, a few seconds here or there for Denoise is irrelevant. If you were processing 50-100 images at a time, then the time might be significant. Hence the point in my earlier post: the return on a GPU depends on how you use Photoshop.
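The batch arithmetic behind that point is simple. A sketch in Python, using the per-image Denoise timings measured above for the 60.2 MP file:

```python
def batch_minutes(seconds_per_image: float, image_count: int) -> float:
    """Total Denoise time, in minutes, for a batch of raw files."""
    return seconds_per_image * image_count / 60.0

# Per-image timings for the same 60.2 MP file, from the tests above
for gpu, secs in [("RTX 3090", 13), ("RTX 2080 Ti", 25), ("RTX 3070 laptop", 35)]:
    print(f"{gpu}: {batch_minutes(secs, 100):.0f} min for 100 images")
```

A 12-second gap per image becomes more than half an hour over a 100-image batch, which is where the "how you use Photoshop" point bites.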

 

Dave

Community Expert, May 08, 2023


...and a 60 MP file here, on a base RTX 3060, takes 33 seconds.

 

But here's the point I'm trying to get across:

 

On my other machine, with a Quadro P2200, that same file takes 2 minutes and 15 seconds. That GPU was released in 2019, the same generation as (and comparable to) something like a GTX 1060 or 1070.

 

That's the difference between new and old GPUs. That's what I'm getting at.

 

Now, 60 MP is a pretty big file, so for most people those timings will be considerably lower; the time is consistently proportional to MP count.
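If the timing really is proportional to megapixel count, it can be extrapolated from a single measurement. A sketch using the 33 s / 60 MP figure for the RTX 3060 above; note that the linearity is an assumption drawn from the observed pattern, not a guarantee:

```python
def estimate_denoise_seconds(megapixels: float,
                             ref_seconds: float = 33.0,
                             ref_megapixels: float = 60.0) -> float:
    """Linear extrapolation of Denoise time from one measured data point."""
    return ref_seconds * megapixels / ref_megapixels

print(f"24 MP: ~{estimate_denoise_seconds(24):.0f} s")   # ~13 s
print(f"45 MP: ~{estimate_denoise_seconds(45):.0f} s")   # ~25 s
```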

 

It's also very consistent with a large number of timings reported in the Lightroom Classic forum, where this has been discussed much more extensively than here in the PS or ACR forums. The pattern is very clear: new GPUs do it in 10-30 seconds; old GPUs do it in two to ten minutes.

 

 

Community Expert, May 08, 2023


OK, that's something! From what I can see in specifications, average camera output is about 24 MP. Here is an example, the Canon EOS R3, which Canon considers a professional tool: https://www.canon.rs/cameras/eos-r3/specifications/

 

I think we can do some real-world tests of the features that run entirely on the GPU. Someone can provide a 24 MP raw file, and then we can test on my RTX 2060, @D Fosse on an RTX 3060, @davescm on several different cards, and @Chuck Uebele on an RTX 3090, to see what the real-world difference or impact is.

 

I am capturing 200-300 images per week. Most of them I do not process; in fact, many of my images from the last three years (around 100,000) were deleted for various reasons after evaluation, or stored without processing. That's me, of course, but I am a user too. So my process is to evaluate and then process only the highly rated images. If there is a 5-10 or even 20 second difference, and we are allowed to continue working while Denoise does its job in the background, then I do not think that's something most users need to worry about. I can imagine that only a professional service shooting celebrations would need to process hundreds or even thousands of images per day.

 

When it comes to laptop versus desktop, I am not surprised at all; that's why I switched and bought a desktop. A statement I read before deciding said that laptops use previous-generation processors and GPUs because of heat problems. That may not be true for the latest releases, although the newest processors and GPUs consume a lot of power and probably produce a lot of heat, which cannot be dissipated in a small laptop chassis. I haven't heard about any major shift in laptop cooling.

 

When it comes to Apple products, I read on several pages that the M1 GPU compares to the RTX 2060, and that's why I decided to buy an RTX 2060. Here is another comparison which does not mention the RTX 2060 at all (instead it mentions the 1660 Ti, which is similar in compute power), but it does provide some data: https://appleinsider.com/articles/21/10/19/m1-pro-and-m1-max-gpu-performance-versus-nvidia-and-amd

"At the high end, the M1 Max's 32-core GPU is at a par with the AMD Radeon RX Vega 56, a GPU that Apple used in the iMac Pro. Its Nvidia equivalent would be something like the GeForce RTX 2060."

When comparing the RTX 2060 and the M1 using the PugetBench testing tool, there is a significantly better score on the M1 side, but that is something else, because PugetBench measures everything: processor, RAM, SSD, and GPU.

Community Expert, May 10, 2023


It would be interesting to do a group test.

New Here, May 19, 2023


For an amateur Photoshop user, a dedicated GPU can drastically enhance the experience. As a budget-friendly option, I recommend the Nvidia GeForce GTX 1650: it provides excellent performance and speed for its price, and is a great upgrade from integrated graphics.

 

Community Expert, May 19, 2023



That's correct, but in that case you should go for the current-generation RTX 3050 (or higher if budget allows).

 

Many new features in PS/ACR/Lr rely on technologies such as Tensor cores, which are available only in newer-generation GPUs. They will still work on older GPUs from the GTX series, but much more slowly; Denoise, for instance, will be five or six times slower on an older GTX.
