
GPU Benchmark Denoise AI

New Here ,
May 09, 2023

Hello, do you have any GPU speed benchmarks for denoising? Does Lightroom also support AMD graphics cards for this computation, or only Nvidia? I'm particularly interested in the Radeon RX 7900 XT. I'm not entirely satisfied with the RTX 4070.
Best regards, Mark

LEGEND ,
May 09, 2023

I use a Radeon RX 580 8GB, and it's fine - not super-fast, but fast enough.

LEGEND ,
May 09, 2023

May I ask what about the RTX 4070 causes you to be "not entirely satisfied"?

New Here ,
May 09, 2023

AI Denoise:

24 megapixels... 6 seconds

36 megapixels... 9 seconds

If I develop maybe 50 AI-denoised DNG pictures with some additional masks and adaptive presets, Lightroom is a little slow when zooming in for further individual retouching, and it gets slower as I edit from picture to picture. After generating the DNGs I had also built Smart Previews and 1:1 previews before doing any editing.

I'm thinking about editing wedding series of 2000-3000 pictures, and that could be a problem.

I am using an i9-9900X, 128GB DDR4-2666, and a Samsung NVMe SSD.

Maybe the RTX 4070 has too little VRAM at 12GB?

Community Expert ,
May 09, 2023

That's about as good as it gets. There's no reason to think it will improve significantly with a different GPU.

 

This is an extremely compute-intensive operation. There's no way this will ever be "instant" with the hardware available today, if that's what you expect.

 

It's a bit tricky to compare directly because people have different cameras with different resolutions, and processing time is directly proportional to megapixel count: a 24 MP file will take half as long as a 48 MP file.

 

I use a 60 MP camera, and with my fairly modest RTX 3060 it finishes in 33 seconds. So, mostly out of curiosity, I sent a file to a friend with an RTX 3090, and that finished in 13 seconds. So definitely a difference - but the point is, it takes a little time. You have to accept that.
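If you want to ballpark your own times from someone else's report, a simple linear scaling by megapixels gets you close (the 6 s / 24 MP and 9 s / 36 MP numbers earlier in this thread both work out to 0.25 s per megapixel). Here's a minimal Python sketch of that estimate, assuming the linear scaling holds and using times reported in this thread:

```
# Back-of-the-envelope Denoise time estimate, assuming processing time
# scales roughly linearly with megapixel count (as discussed above).

def estimate_denoise_seconds(target_mp, ref_mp, ref_seconds):
    """Scale a measured Denoise time linearly by megapixel count."""
    return ref_seconds * (target_mp / ref_mp)

# RTX 3060, measured at 33 s for a 60 MP file (reported above):
print(estimate_denoise_seconds(24, ref_mp=60, ref_seconds=33))  # ~13 s for 24 MP
# RTX 4070, measured at 6 s for 24 MP earlier in the thread:
print(estimate_denoise_seconds(60, ref_mp=24, ref_seconds=6))   # ~15 s for 60 MP
```

Treat it as a rough guide only; VRAM limits and the DNG write (which doesn't use the GPU) can push real times well off the straight line.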

 

Aside from that, there really is no reason to run Denoise on all your files. On normal low ISO photos, noise is simply not a problem, and standard noise reduction is more than enough to deal with it.

 

Denoise is for those shots taken under particularly difficult conditions, where you may have to push ISO to 5000 or more. That's when Denoise starts to make sense, and in those circumstances it's extremely effective.

LEGEND ,
May 09, 2023

Way faster than most people are seeing, Markus - you're definitely at the top end of the performance scale there.

LEGEND ,
May 09, 2023

24.2MP, 12 seconds*

MacBook Pro 16" 2022, 32-core GPU, 16-core Neural Engine, 64GB RAM.

Due to some Apple bugs (we can go there), this should speed up even more once fixed.

* Part of the time is spent writing the DNG, which doesn't use the GPU and is affected by the drive speed.

Author “Color Management for Photographers" & "Photoshop CC Color Management/pluralsight"
Community Expert ,
May 09, 2023

Yeah, those are awesome times. In general, based on random posts seen here and elsewhere, it seems to work out like this:

  • Top end new graphics hardware with machine learning optimizations: Under 10 seconds 
  • Upper end recent graphics: Well under 1 minute 
  • Somewhat recent graphics hardware without support for machine learning: Several minutes 
  • Graphics hardware several years old, especially old integrated graphics with low VRAM: 30 to 45 minutes

 

Most people seem to be reporting times in the middle two ranges. Very few seem to report under 10 seconds.

 

Because this is the first release of AI Denoise, and Adobe has publicly stated potential areas of future improvement, all of these times could come down if the feature is further optimized.

New Here ,
Sep 10, 2023

Just for some perspective, the old computer I use takes 12 minutes per photo 😂

Participant ,
May 11, 2023

With my GeForce RTX 2060 (6GB VRAM) it took about 15 sec on my 24MP Sony ARW files, using 100% of the VRAM. Sometimes it could take a minute or two denoising some, probably very noisy, pictures.
I moved to a GeForce RTX 4070 (12GB VRAM) and it's now down to about 6 sec on the same files, still using about 6GB of VRAM during the process.
Adobe recommends at least 8GB, and I think I can see why.
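If you want to see how much VRAM Denoise actually grabs on your own card, here's a minimal sketch that polls nvidia-smi once a second while Denoise runs (NVIDIA cards only; the query fields are standard nvidia-smi options):

```
# Poll GPU memory and utilization once a second via nvidia-smi.
# Start this in a terminal, then kick off Denoise in Lightroom.
# Ctrl+C to stop.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())
    time.sleep(1)
```

On Windows, Task Manager's Performance tab ("Dedicated GPU memory") gives roughly the same picture without any scripting.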

Thor Egil Leirtrø
Freelance concert photographer - thoregilphoto.com
New Here ,
May 18, 2023

Interesting discussion.
I was just puzzled when I saw the hardware differences between our setups. It takes 8 seconds on my rather new PC with an RTX 3070, i7-13700, 32GB DDR5 RAM and a PCIe 4.0 SSD. On the older, but usually still quite capable, second PC it says it would take 16 minutes (!) but repeatedly crashes while trying to compute... there I have a simple GT 1030 GPU with an AMD 3700, 32GB DDR4 RAM and a PCIe 3.0 SSD. So it seems I will finally upgrade this PC...

Community Beginner ,
Jul 15, 2023

I just got this done today:

 

Test method:
Images: DP Review's A7RV 60MP RAW files. Five cycles per test.
System: Intel i7-6700K 4.0GHz, 32GB DDR4 2400MHz RAM, WD Black 1TB M.2 SSD, Win10, 27" 1440p display, Antec P190 case (16 years old) with 550W + 650W (GPU only) = 1200W PSUs.

GPU                          1-pic    10-pic      Idle   Average  Peak
GTX 1060 6GB GDDR5           159s     1569s       108W   234W     279W
RTX 3060 OC 12GB GDDR6       32.08s   not tested  not tested
RTX 3060 Ti 8GB GDDR6X       26.90s   not tested  not tested
RTX 3070 OC 8GB GDDR6        25.18s   221.73s     117W   378W     585W
RTX 4060 Ti 8GB GDDR6        26.97s   247.62s     108W   288W     369W
RTX 4070 12GB GDDR6X         20.08s   180.2s      not tested
RTX 4070 OC 12GB GDDR6X      19.74s   175.61s     117W   324W     414W
RTX 4070 Ti OC 12GB GDDR6X   17.29s   148.81s     117W   369W     441W
RTX 4080 OC 16GB GDDR6X      13.88s   120s        126W   423W     576W

RTX 4080 OC, 422-pic run (my own 60MP images): 5330s. Task Manager: CPU average 58%, memory 40%, GPU 7%, power usage high.
Besides the Denoise process speeding up on the higher-end GPUs, the refresh speed of the 60MP image improves as well. With the masking brush at 100% zoom, navigating around the 60MP image is almost instantaneous with an RTX 4070 or above, while the other cards constantly take a second or even a few seconds to refresh from the pixelated preview, so the faster cards make the entire editing experience much more fluid and pleasant. And even though some GPUs consume less wattage, they also take much longer to process, so that advantage disappears, especially when I often process 50-200+ images at a time.
I hope the raw data will be helpful to anyone who needs it.
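To make the speed-versus-wattage point concrete, here's a quick Python sketch that works out per-image time and approximate energy per image from the 10-picture runs and average-power figures in the table above (whole-system draw, idle ignored, so treat it as a rough comparison):

```
# Per-image time and approximate energy per image for the 10-picture
# runs above: energy (J) = average power (W) x time (s).
runs = {
    "RTX 3070 OC":    (221.73, 378),
    "RTX 4060 Ti":    (247.62, 288),
    "RTX 4070 OC":    (175.61, 324),
    "RTX 4070 Ti OC": (148.81, 369),
    "RTX 4080 OC":    (120.00, 423),
}

for gpu, (total_s, avg_w) in runs.items():
    per_image_s = total_s / 10
    kilojoules = avg_w * per_image_s / 1000
    print(f"{gpu:15} {per_image_s:5.1f} s/image, ~{kilojoules:.1f} kJ/image")
```

Run that and the RTX 4080 comes out around 5.1 kJ per image versus roughly 8.4 kJ for the RTX 3070: the faster card also uses less total energy per image, despite the higher peak wattage.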

Community Beginner ,
Sep 03, 2023

To add one to your list:

1660 Ti, 1 pic: 1.52s - I used the pic of chairs (66MB) from DP Review's A7RV 60MP RAW gallery, and my system is basically the same (6700K etc.) as yours.

Amazing work! Thank you! I was wondering what benefit I'd get from an upgrade. It was taking me a couple of minutes per pic, which gets old.

Community Beginner ,
Sep 04, 2023

I'm sorry, maybe I'm misunderstanding. Are you saying a 1660 Ti is doing it in 1.5 seconds?

I'm confused because that doesn't seem possible on its face. The 1660 Ti is a substantially weaker GPU that cannot get anywhere near 1.5 seconds. I might believe 25 seconds or so.

Maybe I misunderstood your post, and if that's the case, I apologize.

Community Beginner ,
Sep 04, 2023

I just re-read and realized you're talking about a 60-megapixel file. That GPU should take something like a minute to do that. Do you maybe mean 1.52 minutes rather than seconds?

Community Beginner ,
Sep 06, 2023

1 min 52 seconds! Sorry about that!

New Here ,
Oct 08, 2023

You can add another:

 

RTX 4090 (24GB GDDR6X) in an i9 with 128GB DDR5: 60-megapixel Sony picture, just over 6 seconds...

 

From Task Manager, this does hit the graphics card's processor, and the GeForce configuration does tell me I'm set to use the GPU for image processing in LrC.

 

I have not figured out how folks are getting accurate timings.

Community Expert ,
Oct 08, 2023
Quoting @OscarPhilips: "I have not figured out how folks are getting accurate timings."

 

Your phone probably has a stopwatch? Of course the faster it gets, the more coordinated your fingers have to be.

 

I just said in another thread that no GPU had managed to break the ten second barrier yet, but the 4090 apparently does.

 

I have a 3060 in one machine and a 4060 in another, and I notice the 40-series seems to be about 20-25% faster.

Community Beginner ,
Jul 10, 2025

Would you know what the stopwatch time is now when performing the same process in Lightroom Classic 14.4, from the moment the "Denoise" button is clicked until it fully stops?

New Here ,
Dec 10, 2023

A bit late to the party, but I also ran the DP Review chair picture on an RX 580 (2800X, 16GB RAM), where it took 2 minutes. On a 5700 XT (OC) with a 5800X and 16GB RAM, it took 1 minute. So it looks like there is a small advantage going Nvidia over AMD. Also, looking at other benchmarks, the returns relative to cost start to diminish when going above the RTX 4070/RX 7800 XT.

New Here ,
Mar 02, 2024

Excellent resource!!!

 

Now if you could share your RAW file so that we can test on our GPUs and post the results, I believe it would be to everyone's benefit.

 

Cheers!

New Here ,
Apr 21, 2024

Denoise AI, Lightroom Classic 13.2

Images: DP Review's A7RV 60MP RAW files.

System: AMD Ryzen 5600G, 32GB DDR4 3200MHz RAM, WD Black 1TB HDD 7200rpm, Win11, 24" 1080p display, Corsair 550W.

RX 580 8GB: 1-pic: 136.4s, 10-pic: 1364s

 

Community Beginner ,
Jan 20, 2025

Your test interests me very much.
Currently I have a GeForce 1080 Ti and also an A7R V.
So I should see a good gain on this function if I upgrade my graphics card (my processor is a Threadripper 1950X).

Community Expert ,
Jan 21, 2025

@motivated_enthusiastB836 

 

I use an a7r5, and an RTX 4060 finishes Denoise in 28 seconds. I'm guessing the 1080 will take several minutes?

Community Beginner ,
Jan 21, 2025

I just did a test: 68 seconds.

I tell myself that upgrading my graphics card could be useful (I don't game).

I wonder if the gain is as significant when exporting.

(I use Google Translate)

 
