
GPU Benchmark Denoise AI

New Here, May 09, 2023

Hello, do you have any GPU speed benchmarks for denoising? Does Lightroom also support AMD graphics cards for this computation, or only Nvidia? I'm particularly interested in the Radeon 7900 XT; I'm not entirely satisfied with the RTX 4070.
Best regards, Mark

TOPICS: Windows
Community Beginner, Jan 26, 2025

Good morning.
So would a Radeon 7900 XTX be a good upgrade from my GTX 1080 Ti, given that I'm on an AMD processor (Threadripper 1950X) and my camera is 60 MP (A7R V)?

Participant, Jan 26, 2025

If you are fine with AI performance below what an NVIDIA RTX 4070 Super provides, then go for an AMD 7900 XTX. In the price range of a 7900 XTX, though, I would go for an RTX 4070 Ti Super or the upcoming RTX 5070 Ti.

Adobe AI denoise example:

RTX 4070 Ti Super: 31 seconds

7900XTX: 35 seconds

See the post from reedsmith and his later post #399.

 

Community Beginner, Jan 31, 2025

So basically, more CUDA cores means a faster denoise process. I’m hoping we’ll get more data on the 50xx series soon, so we can see if additional AI enhancements, along with the extra CUDA cores, will improve performance. I’ve already planned my PC build, but I can’t decide on the GPU just yet. AI denoise is a crucial part of my workflow, as I often denoise a large number of images before I start editing, so time is definitely of the essence.

Community Expert, Jan 31, 2025

@elisir 

Not CUDA cores. Tensor cores.

 

CUDA is actually an API (application programming interface), or rather an architecture, that lets a large number of cores run in parallel. The number of cores in this parallel architecture can run into the hundreds or thousands. CUDA cores are general purpose.

But that's not what Denoise and other AI-based tasks use (or only partially). They use Tensor cores, which are purpose-built for AI.

 

The RTX designation indicates Tensor cores. Earlier GTX models didn't have them, which is why they're orders of magnitude slower with Denoise and other AI features.
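To illustrate the distinction, here is a minimal Python sketch (assuming PyTorch with CUDA support is installed; it is purely illustrative and has nothing to do with how Lightroom itself selects a GPU). It reports whether an NVIDIA card's compute capability is 7.0 or higher, the generations starting with Volta/Turing RTX that introduced Tensor cores:

# Minimal sketch: check whether the NVIDIA GPU generation includes Tensor cores.
# Assumes PyTorch with CUDA support is installed; AMD cards will not show up here.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    has_tensor_cores = major >= 7  # Volta 7.0, Turing 7.5 (RTX 20xx), Ampere 8.x, Ada 8.9
    print(f"{name}: compute capability {major}.{minor}, Tensor cores: {has_tensor_cores}")
else:
    print("No CUDA-capable NVIDIA GPU detected.")

A GTX 1080 Ti, for example, reports compute capability 6.1 (Pascal) and therefore has no Tensor cores.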

Community Beginner, Jan 31, 2025

Thanks for the clarification! So it’s the Tensor cores I should focus on when choosing a GPU for AI denoise, not the CUDA cores. 

New Here, Oct 02, 2024

The RX 7900 XT is a beast.

24 MP (35.74 MB) = AI Denoise 4 s

50.3 MP (64.02 MB) = AI Denoise 10 s

Participant, Oct 02, 2024

Are you perhaps running an AMD-based computer?
Also, the files you tested are not the same ones used in the list above - that might give or take a few seconds 🙂


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Community Expert, Oct 02, 2024

That's about on par with an RTX 4070 Ti (extrapolating from the given sizes to 60 MP, which was the original test standard).
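For reference, that extrapolation is just a linear scaling by megapixels - a simplification, not anything Adobe specifies. A quick Python sketch using the RX 7900 XT numbers posted above:

# Assume AI Denoise time scales roughly linearly with megapixels (an approximation).
measured_mp, measured_seconds = 50.3, 10.0  # RX 7900 XT result posted above
target_mp = 60.0                            # the original 60 MP test standard

estimated_seconds = measured_seconds * (target_mp / measured_mp)
print(f"Estimated time at {target_mp:.0f} MP: ~{estimated_seconds:.1f} s")  # ~11.9 s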

Participant, Feb 03, 2025

Here are some new results on actual denoising with different graphics cards.

All of these values were measured with Lightroom Classic, using 50 strength on AI Denoise. All tested images are the same, and the clock was stopped when the progress bar disappears from Lightroom Classic's UI (so not at the point when the image is ready, but when the progress bar disappears).

Is this scientific? Hell no - but it gives quite a good estimate of where we are with current cards 🙂

No 50-series NVIDIA cards have been tested yet in this particular test.



--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Participant, Feb 03, 2025
@Karo Holmberg Logitech wrote: "No 50-series NVIDIA cards tested yet."

One RTX 5080 has already been tested in the forum I have linked several times; the result:

19 seconds

For comparison:

1) RTX 4090: 16 seconds

2) RTX 4070 Super: 26 seconds

I have not mentioned an RTX 4080 result because there are no current test results available. (Older results are around 26 seconds, which is outdated.)
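For a quick sense of scale, here is a small Python snippet comparing the times quoted above (relative speed only; the absolute numbers depend on the test image):

# Relative denoise speed from the seconds quoted above (lower time = faster).
times = {"RTX 5080": 19, "RTX 4090": 16, "RTX 4070 Super": 26}
baseline = times["RTX 5080"]

for card, t in times.items():
    print(f"{card}: {t} s ({baseline / t:.2f}x the RTX 5080's speed)")
# The RTX 4090 comes out ~19% faster than the 5080; the 5080 ~37% faster than the 4070 Super.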

Participant, Feb 03, 2025

You missed the point.

 

The results I have posted are from a different set of tests - one that I have gathered "elsewhere".

I have four different files (24 MP, 45 MP, 61 MP and 102 MP) which people have tested with 50% AI Denoise strength. This gives more usable results for comparing how fast the cards are in real-life applications, not just benchmarking "which is faster" - because that we already know.

The point of my test is to give people some insight into which card they might want to buy next, since they can see how much time it takes when they are doing actual work.

That is what these results are all about - and I have not yet received any results for any 50-series card with my files and the parameters I have set.

 


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Participant, Feb 03, 2025

I missed nothing:

Many people are interested in the performance of the new NVIDIA GPUs - in our case the Denoise AI performance - and it is not a game changer, as my data show. I'm very confident that the benchmark you are using here will not show a different result.

P.S. The AI Denoise benchmark I used there is also available for download, including a "how to".

Participant, Feb 03, 2025

I will post results for a 50-series card as soon as someone runs my files through.

This interests many users - that is also the reason I have four different megapixel sizes, so people can actually understand how long it takes to process one image.


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Participant, Aug 13, 2025
@Karo Holmberg Logitech wrote: "I will post results for a 50-series card as soon as someone runs my files through. ..."


 

Did you post any 50-series card results?

Participant, Sep 01, 2025

Unfortunately no. There were not enough testers for the 50x0 models.

I got only one 5070 result, and it was pretty much on par with the 4070 Super. If I had received more results across all the models, some conclusions could have been drawn.


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Community Expert, Feb 03, 2025

@.Michael. 

Nobody ever expected the 50-series cards to be a "game changer". That's completely unrealistic. What we can expect is a few seconds gain, which is what we got from the 30-series to the 40-series.

 

I doubt that anyone really cares whether it's 18 seconds or 20 seconds. Either way, you sit around and wait for a little while. The faster the better, of course, but these numbers are mainly there so you know what you can basically expect.

Community Beginner, Feb 04, 2025

Where can I find your 4 files, please?

Community Beginner, Feb 03, 2025

I tried registering for the forum you linked so I could download the test results file, but after admin approval, I was banned. Is it limited to users in Germany?

Participant, Aug 13, 2025
@elisir wrote:

I tried registering for the forum .... Is it limited to users in Germany?


Not as far as I know.

Community Beginner, Feb 03, 2025

Great info! Small addition: the 2080 Ti has 11 GB. That's true for all of those cards that I have seen. Photoshop seems to suck up every bit of VRAM it can these days.

Community Beginner, Feb 18, 2025

I'll put my $.02 in. On my system (Ryzen 5950X, 32 GB RAM, RX 6700) it takes 28 seconds per image from my Canon R5 II, which are huge 60 MB files. My son's i9 with a 4070 Super does it in 11.5 seconds. I'll be going with an RX 7900 XTX; if I can get down to 15 seconds I'll be happy. Doing 2k pictures sucks. Manual denoise is great when I'm below 12k ISO, but for indoor rodeos with bad, bad, bad lighting I need AI denoise. Tired of waiting 14 hours.
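For what it's worth, the wait-time math works out roughly like this (a simple sketch that assumes the images are denoised one after another with no other overhead):

# Rough batch estimate from the per-image times mentioned above.
images = 2000
for label, seconds_per_image in [("RX 6700", 28.0), ("4070 Super", 11.5)]:
    hours = images * seconds_per_image / 3600
    print(f"{label}: ~{hours:.1f} hours for {images} images")
# RX 6700: ~15.6 hours; 4070 Super: ~6.4 hours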

Participant, Feb 19, 2025

There is something else slowing down your system. Probably your memory is much slower than on your son's computer.


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Community Beginner, Apr 29, 2025

Just curious if anyone has any experience or thoughts on the 5070 (12 GB) vs the 5070 Ti (16 GB)? It's hard to justify it being 40% more expensive here in my local market, but I don't want to shoot myself in the foot by not having enough VRAM. I'm particularly interested in the general LR speed improvements across all AI features, such as subject detection etc.

Community Beginner, Apr 30, 2025

The main thing I notice running the 5070 Ti is Denoise. My 6700 took about 30 seconds for one image of Denoise; the 5070 Ti takes 11.5 seconds. The CPU is a 5950X with 64 GB RAM. The CPU handles detection etc., so I'm not really sure I noticed anything there. Export is also a lot faster.

Community Beginner, Aug 04, 2025

Coming from an RTX 4070 Ti OC 12 GB VRAM to an RTX 5090 OC 32 GB VRAM, I would suggest not purchasing anything with less than 16 GB going forward. Three years ago I kept the RTX 4070 Ti OC 12 GB over the RTX 4080 OC 16 GB, which cost a few hundred dollars more, to save some money, even though the 16 GB card ran smoother than the 12 GB GPU. At the time LrC 12.3 worked fine, but now on LrC 14.4 I started seeing more and more delays in the same workflow. I switched to an RTX 5090 OC (thanks to Amazon Prime "Used - Like New"), and everything is way smoother now.
