GPU Benchmark Denoise AI
Question

New Participant
May 9, 2023
  • 12 replies
  • 42213 views

Hello, are there any GPU speed benchmarks for denoising? Does Lightroom support AMD graphics cards for the computation as well, or only Nvidia? I'm particularly interested in the Radeon 7900 XT. I'm not entirely satisfied with the RTX 4070.
Best regards, Mark

12 replies

Inspiring
April 30, 2025

Just curious if anyone has experience or thoughts on the 5070 (12 GB) vs the 5070 Ti (16 GB)? It's hard to justify the roughly 40% higher price in my local market, but I don't want to shoot myself in the foot by not having enough VRAM. I'm particularly interested in general LR speed improvements across all the AI features, such as subject detection.

Participating Frequently
April 30, 2025

The main thing I notice running the 5070 Ti is Denoise. My 6700 took about 30 seconds for one Denoise image; the 5070 Ti takes 11.5 seconds. The CPU is a 5950X with 64 GB RAM. The CPU handles detection and the like, so I'm not sure I noticed any difference there. Export is also a lot faster.

Participating Frequently
August 4, 2025

Coming from an RTX 4070 Ti OC 12 GB to an RTX 5090 OC 32 GB, I would suggest not purchasing anything with less than 16 GB of VRAM from now on. Three years ago I kept the RTX 4070 Ti OC 12 GB instead of the RTX 4080 OC 16 GB, which cost a few hundred dollars more, to save some money, even though the 16 GB card ran smoother. At the time, LrC 12.3 worked fine, but now on LrC 14.4 the same workflow started to show more and more delays. After switching to the RTX 5090 OC (thanks to an Amazon Prime "Used - Like New" deal), everything is way smoother.

Karo Holmberg Logitech
Inspiring
February 3, 2025

Here are some new results on actual denoising with different graphics cards.

All of these values were measured with Lightroom Classic, using AI Denoise at strength 50. All tested images are the same, and the clock was stopped when the progress bar disappears from Lightroom Classic's UI (so not at the point when the image is ready, but when the progress bar disappears).

Is this scientific? Hell no - but it gives quite a good estimate of where we are with current cards 🙂

No 50-series NVIDIA cards have been tested yet in this particular test.


--// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Inspiring
February 3, 2025
Quote:

No 50-series NVIDIA cards tested yet.

By @Karo Holmberg Logitech

One RTX 5080 has already been tested in the forum thread I have linked several times; result:

19 seconds

For comparison:

1) RTX 4090: 16 seconds

2) RTX 4070 Super: 26 seconds

I have not mentioned an RTX 4080 result because no current test results are available. (Older results of around 26 seconds are outdated.)

Karo Holmberg Logitech
Inspiring
February 3, 2025

You missed the point.

 

The results I have posted are from a different set of tests - one that I have gathered "elsewhere".

I have four different files (24 mpx, 45 mpx, 61 mpx and 102 mpx) which people have tested with 50% AI Denoise strength. This gives more usable results for comparing how fast the cards are in real-life applications - not just benchmarking "which is faster", because that we already know.

The point of my test is to give people some insight into which card they might want to buy next, as they can see how much time it takes when doing actual work.

That is what these results are all about - and I have not gotten any results for any 50-series card with my files and the parameters I have set.

 

--// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
New Participant
October 2, 2024

The RX 7900 XT is a beast.

24 mpx (35.74 MB) = AI Denoise 4 sec

50.3 mpx (64.02 MB) = AI Denoise 10 sec

Karo Holmberg Logitech
Inspiring
October 2, 2024

You probably have an AMD-based computer?
Also, the files you tested are not the same ones used in the list above - that might give or take some seconds 🙂

--// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Karo Holmberg Logitech
Inspiring
September 5, 2024

Although this is a 1.5-year-old topic, I need to drop some information here which might give some insight to users upgrading, or thinking of upgrading, their graphics card.

I came up with a small, not-super-accurate test using two files (Canon 45 mpx ISO 6400, Nikon 24 mpx ISO 12800). Both are processed with AI Denoise (value 50). The timer starts when "OK" is clicked and stops when Lightroom Classic's progress bar disappears.

I listed each participant's main data (processor, memory amount and graphics card), and from the results it became more or less obvious that the amount of memory has virtually no effect on AI Denoise processing. What I do not know: how many SATA devices these users have - if there are too many (or they are in the wrong ports), that will decrease the speed. I tested this on my older computer, which had 6 SATA devices (2 min 7 s process time); after removing most of them, down to 2 SATA devices, the time dropped to 37 s. So it's better to have just a couple of huge drives than multiple small ones.

Anyway, a screenshot of the results is at the end of the message.

In short: on an AMD computer the 7900 XTX is pretty much on par with the 4090, just a bit slower. On an Intel computer it seems to lose some of its edge and lands between the 4070 and 4080. I only had one computer of each, so this is not verified in any way. Also, the Intel computer (with the 7900 XTX) contains many SATA devices, which might slow it down (the motherboard is a Z790).
[screenshot of results]

--// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
D Fosse
Braniac
September 5, 2024

@Karo Holmberg Logitech 

Disk only matters insofar as writing the finished DNG to disk is included in the total processing time.

 

So naturally, with a spinning hard drive it will take slightly more time than with a fast NVMe.

 

This is a 60 MP raw file saved to a spinning 18 TB drive. Green is initiate Denoise, red is complete: [screenshot]

Karo Holmberg Logitech
Inspiring
September 6, 2024

This is not completely the case.

If you occupy all the SATA and M.2 ports, your motherboard's communication will slow down significantly. Here are the times I took from my old computer, which had 2x M.2 and 6x SATA:
- 45 mpx: 2 min 7 sec
- 24 mpx: 2 min 7 sec

Yes, the times were identical for both files because the bus couldn't handle the data anymore (probably reserving bandwidth or communicating with all SATA devices - I don't know exactly, as this is outside my expertise).

After removing all but the necessary SATA devices, the same test (1x M.2, 2x SATA):
- 45 mpx: 37 sec
- 24 mpx: 24 sec


--// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
New Participant
December 6, 2023

Hey, 

Yes, LrC supports AMD GPUs as well. I recently got my 7900 XTX. I can't compare the performance to a 4070 12 GB, but I previously had a 3070 8 GB.

 

101 DNG files to JPEG at 90% quality (EOS R5, 45 MP, 8192x5464), times in MM:SS

GPU export: i9-13900K, RTX 3070, 128 GB RAM @ 3600 MHz - time: 2:42.00
CPU export: i9-13900K, RTX 3070, 128 GB RAM @ 3600 MHz - time: 4:52.07
GPU export: i9-13900K, RX 7900 XTX, 128 GB RAM @ 2400 MHz - time: 1:40.87
GPU export: i9-13900K, RX 7900 XTX, 128 GB RAM @ 3600 MHz - time: 1:15.50
GPU export: i9-13900K, RX 7900 XTX, 128 GB RAM @ 3600 MHz, ReBAR - time: 1:16.00
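
To put those numbers in perspective, here is a minimal sketch (Python) that converts the times above to seconds and computes the speedups they imply; the labels are my own shorthand, not the original poster's:

# Export times from the list above, converted to seconds.
times = {
    "CPU export (RTX 3070 system)":   292.07,  # 4:52.07
    "GPU export, RTX 3070":           162.00,  # 2:42.00
    "GPU export, RX 7900 XTX @ 2400": 100.87,  # 1:40.87
    "GPU export, RX 7900 XTX @ 3600":  75.50,  # 1:15.50
}
baseline = times["CPU export (RTX 3070 system)"]
for label, seconds in times.items():
    print(f"{label}: {baseline / seconds:.2f}x vs CPU export")

# The 7900 XTX at 3600 MHz finishes the 101-file batch almost 4x faster
# than the CPU path, and the 2400 -> 3600 MHz RAM bump alone cuts the
# export time by roughly 25%.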

 

AI Denoise on 10x 45 MP EOS R5 DNG files took 1:50.77 (1 min 50 sec).

VRAM use was around 15-16 GB and power draw was around 400-430 W.

Inspiring
March 3, 2024

I found an interesting test summary:

dslr-forum.de "KI entrauschen" (AI denoising)

I think it will clarify almost all "How is the performance with ..." questions 🙂.

Inspiring
September 8, 2023
Thought I'd add my recent experience. I'm 81, on a budget, just built a PC with i5 12600, ASUS B760 motherboard, and a used RTX 3060 (non-ti). In my first test, Denoise AI took 10-12 seconds to process a Canon R6 20-mpx file. Very relieved, as I often batch process 200-300 pics in LR that are shot indoors with ISOs ranging from 1250 to 12000. Denoise AI promises to be immensely helpful - see a sample before and after pic here, from our school's science fair last spring. The photo was shot with the R6 at ISO 20000, and the "after" pic looks to my ancient eyes like it could have been shot at ISO 100. https://www.lwsphotos.com/Misc-Pix/Lightroom-noise-red-AI/
D Fosse
Braniac
September 8, 2023

It is indeed extremely effective - but I always advise people to not run everything through Denoise. There simply is no point in that. Denoise is for the extreme cases where you really need to push it.

 

My own experience is that the breakpoint is around ISO 6400 or thereabouts. That's when noise is starting to become a problem looking for a solution - instead of a solution looking for a problem. Below that, which means 98% of normal circumstances, standard noise reduction is more than good enough to cope with it.

 

Even so, it has never seemed like a worthwhile goal to me to remove all noise. With Denoise you can do that, but the result will easily look plasticky and unreal. With a high-resolution sensor, and a good lens, you're also getting to that point where noise and detail merge, sort of. There is noise, but that noise also contains valid detail that you don't want to lose.

 

And for some situations, that extra grittiness actually adds to the image. I've been in this business long enough to remember Kodak Tri-X developed in Agfa Rodinal. That combo was a classic, loved by many photographers, precisely because it brought out and maximized grain, and made it extremely crisp and almost tangible.

Participating Frequently
August 21, 2023

Hello.

 

I have the RX 7900 XT 20 GB "black" edition (overclocked, with a higher power limit). I had to upgrade my power supply for this to run at full bore; 750 watts wasn't enough.

 

It chews through AI denoising with aplomb. HWMonitor reports that it uses about 350 watts, with a peak near 500 watts, while denoising (just the graphics card, not the whole system), but boy does it fly.

 

It processes my 6D raw files in about 5 seconds each.

 

The CPU sits idle while this is happening.

 

You'll want a case with lots of airflow. You can get as much performance as you can afford.

D Fosse
Braniac
August 21, 2023

To put that into context, the 6D Mark II is a 26 MP camera, while the testing above was done on 60 MP files. Timings are roughly proportional to MP count, so to compare, that's about a 2.3x factor. If it's a 6D Mark I, that's 20 MP and a 3x factor.

 

Which should put it roughly similar to the RTX 4080. But apparently a lot less expensive, at least for the base model, so there's that.
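
For anyone who wants to redo that scaling estimate for their own camera, here is a minimal sketch (Python; the function name and example numbers are mine, using the ~5 s per 26 MP figure reported above, under the rough rule of thumb that Denoise time scales linearly with megapixels):

# Assumption from the post above: AI Denoise time scales roughly
# linearly with megapixel count (real timings also depend on the GPU).
def scale_denoise_time(measured_seconds, measured_mp, target_mp):
    """Project a measured AI Denoise time to a different resolution."""
    return measured_seconds * (target_mp / measured_mp)

# ~5 s per 26 MP file (6D Mark II) projected to 60 MP test files:
print(round(scale_denoise_time(5.0, 26, 60), 1))  # -> 11.5 seconds

That 11.5 s estimate is indeed in the same ballpark as the 13.88 s measured for the RTX 4080 on 60 MP files elsewhere in this thread.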

 

 

Participating Frequently
August 21, 2023

That sounds about right. The RX 7900 XT is roughly equivalent in gaming to the RTX 4070 Ti, but it has a lot of execution units, relatively speaking, and the "black" edition scores about 7% higher than the stock one, so that all makes sense. These go for about $1150 CAD. I'm not sure about other markets.

Participating Frequently
July 15, 2023

I just got this done today:

 

Test Method:
Images: DPReview's A7R V 60 MP RAW files. Five cycles per test.
System: Intel i7-6700K 4.0 GHz, 32 GB DDR4 2400 MHz RAM, WD Black 1 TB M.2 SSD, Win10, 27" 1440p display, Antec P190 case (16 years old) with 550 W + 650 W (GPU use only) = 1200 W PSUs

GTX 1060 6GB GDDR5: 1-pic: 159 s | 10-pic: 1569 s | Idle: 108 W | Average: 234 W | Peak: 279 W
RTX 3060 OC 12GB GDDR6: 1-pic: 32.08 s | 10-pic: not tested | Power: not tested
RTX 3060 Ti 8GB GDDR6X: 1-pic: 26.90 s | 10-pic: not tested | Power: not tested
RTX 3070 OC 8GB GDDR6: 1-pic: 25.18 s | 10-pic: 221.73 s | Idle: 117 W | Average: 378 W | Peak: 585 W
RTX 4060 Ti 8GB GDDR6: 1-pic: 26.97 s | 10-pic: 247.62 s | Idle: 108 W | Average: 288 W | Peak: 369 W
RTX 4070 12GB GDDR6X: 1-pic: 20.08 s | 10-pic: 180.2 s | Power: not tested
RTX 4070 OC 12GB GDDR6X: 1-pic: 19.74 s | 10-pic: 175.61 s | Idle: 117 W | Average: 324 W | Peak: 414 W
RTX 4070 Ti OC 12GB GDDR6X: 1-pic: 17.29 s | 10-pic: 148.81 s | Idle: 117 W | Average: 369 W | Peak: 441 W
RTX 4080 OC 16GB GDDR6X: 1-pic: 13.88 s | 10-pic: 120 s | 422-pic (my own 60 MP images): 5330 s | Idle: 126 W | Average: 423 W | Peak: 576 W | Task Manager: CPU average 58%, memory 40%, GPU 7%, power usage high

Besides the Denoise process speeding up on the higher-end GPUs, the refresh speed of the 60 MP image improves as well. During masking-brush work at 100% zoom, navigating around the 60 MP image is almost instantaneous with an RTX 4070 and above, while the other cards constantly take a second or even a few seconds to refresh from the pixelated preview, which makes the entire editing experience much more fluid and pleasant. Even though some GPUs consume less wattage, they also take much longer to process, so the advantage is no longer there (see the quick energy estimate below), especially when I often process 50-200+ images at a time.
I hope the raw data will be helpful to someone who needs it.
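
To make that last point concrete, here is a minimal back-of-the-envelope sketch (Python) estimating energy per 10-image batch as average power times duration, using the figures measured above; it ignores idle draw and the rest of the system:

# Energy per 10-image Denoise batch, from the measurements above.
runs = {
    "GTX 1060":    (234, 1569.00),  # (average watts, seconds for 10 pics)
    "RTX 3070 OC": (378, 221.73),
    "RTX 4080 OC": (423, 120.00),
}
for card, (avg_watts, seconds) in runs.items():
    print(f"{card}: {avg_watts * seconds / 1000:.0f} kJ per batch")

# GTX 1060: ~367 kJ, RTX 3070 OC: ~84 kJ, RTX 4080 OC: ~51 kJ.
# The faster cards draw more power but finish so much sooner that they
# use far less total energy per batch.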

New Participant
September 4, 2023

To add one to your list:

1660 Ti: 1 pic: 1.52 s - I used the pic of chairs (66 MB) from DPReview's A7R V 60 MP RAW gallery, and my system is basically the same (6700K etc.) as yours.

Amazing work! Thank you! I was wondering what benefit I'd get from an upgrade. It was taking me a couple of minutes per pic, which gets old.

Participating Frequently
September 4, 2023

I'm sorry, maybe I'm misunderstanding. Are you saying a 1660 Ti is doing it in 1.5 seconds?

I'm confused, because that doesn't seem possible on its face. The 1660 Ti is a small, substantially weaker GPU that can't get anywhere near 1.5 seconds. I might believe 25 seconds or so.

 

Maybe I misunderstood your post, and if that's the case, I apologize.

New Participant
May 18, 2023

Interesting discussion. 
I was just puzzled when I saw the hardware differences in our setups. It takes 8 seconds on my rather new PC with an RTX 3070, i7-13700, 32 GB DDR5 RAM and a PCIe 4.0 SSD. On the older but usually still quite capable second PC, it says it would take 16 minutes (!) but repeatedly crashes while trying to compute... there I have a simple GT 1030 GPU with an AMD 3700 CPU, 32 GB DDR4 RAM and a PCIe 3.0 SSD. So it seems I will finally upgrade this PC...

Known Participant
May 11, 2023

With my GeForce RTX 2060 with 6 GB of VRAM, it took about 15 sec, using 100% of the VRAM, on my 24 MP Sony ARW files. Sometimes it could take a minute or two denoising some, probably very noisy, pictures.
I moved to a GeForce RTX 4070 with 12 GB of VRAM and am now down to about 6 sec on the same files, still using about 6 GB of VRAM during the process.
Adobe recommends at least 8 GB of VRAM, and I think I can see why.

Thor Egil Leirtrø, freelance concert photographer - thoregilphoto.com