
GPU Benchmark Denoise AI

New Here ,
May 09, 2023

Hello, do you have any GPU speed benchmarks for denoising? Does Lightroom also support AMD graphics cards for this computation, or only Nvidia? I'm particularly interested in the Radeon 7900 XT. I'm not entirely satisfied with the RTX 4070.
Best regards, Mark

TOPICS
Windows
Community Expert ,
Jan 21, 2025

OK, that's faster than I expected. Most people with older GPUs report many minutes.

 

Anyway, the RTX 4000 series is rock solid and works very well with Photoshop. The sweet spot in terms of price-to-performance is very clearly the 4060. The higher models (70/80/90) will be faster - a 4090 probably well under 10 seconds - but the price goes up fast and may be hard to justify.

Community Beginner ,
Jan 21, 2025

Maybe because I have a powerful CPU too, 16 cores.
Then I tell myself that the 1080 Ti has an advantage because it has 11 GB of VRAM.
I might wait to see the 5000 series, since they will be available soon.
I don't think I need the 70/80/90 if it's just to save 5 seconds...

 

thank you 🙂 

Community Expert ,
Jan 21, 2025
quote

Maybe because I have a powerful CPU too, 16 cores.

By @motivated_enthusiastB836

 

While more CPU cores and performance are useful in most parts of Lightroom Classic, the CPU doesn't affect Denoise very much. Denoise speed is far more dependent on the GPU than on any other component (RAM, CPU, storage...).

 

You can verify this by checking system resource usage while running Denoise. I’m guessing you’ll find that CPU usage is very low at that time.
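One rough way to log that, as a minimal sketch in Python using the psutil package (the 60-second window is an arbitrary choice, and this only samples the CPU - GPU load is easier to watch in Task Manager or Activity Monitor):

# Log overall CPU utilization once per second while Denoise runs.
# Requires: pip install psutil
import psutil

for _ in range(60):
    # cpu_percent(interval=1) blocks for one second and returns
    # the average CPU utilization over that second.
    print(f"CPU: {psutil.cpu_percent(interval=1):5.1f}%")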

Community Beginner ,
Aug 21, 2023

Hello.

 

I have the RX 7900 XT 20 GB "black" edition (overclocked, with a higher power limit). I had to upgrade my power supply for this to run at full bore; 750 watts wasn't enough.

 

It chews through AI denoising with aplomb. HWMonitor reports that it uses about 350 W, with peaks near 500 W, while denoising (just the graphics card, not the whole system) - but boy does it fly.

 

It processes my 6D raw files in about 5 seconds each.

 

The CPU sits idle while this is happening.

 

You'll want a case with lots of airflow. You can get as much performance as you can afford.

Community Expert ,
Aug 21, 2023

To put that into context, the 6D Mark II is a 26 MP camera, while the testing above was done on 60 MP files. Timings are roughly proportional to megapixel count, so to compare, that's about a 2.3x factor. If it's the original 6D, that's 20 MP and a 3x factor.
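Spelled out as a quick sketch, assuming Denoise time scales roughly linearly with megapixel count (the 5-second figure is the 6D time reported above; 60 MP is the earlier test baseline):

# Scale a measured Denoise time to a different sensor resolution,
# assuming time is roughly proportional to megapixels.
def scaled_time(measured_s, measured_mp, target_mp):
    return measured_s * target_mp / measured_mp

print(scaled_time(5, 26, 60))  # 6D Mark II (26 MP): ~11.5 s at 60 MP
print(scaled_time(5, 20, 60))  # original 6D (20 MP): ~15.0 s at 60 MP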

 

That should put it roughly on par with the RTX 4080 - but apparently a lot less expensive, at least for the base model, so there's that.

 

 

Community Beginner ,
Aug 21, 2023

That sounds about right. The RX 7900 XT is roughly equivalent in gaming to the RTX 4070 Ti, but it has a lot of execution units, relatively speaking, and the "black" edition scores about 7% higher than the stock one, so that all makes sense. These go for about $1,150 CAD. I'm not sure about other markets.

Community Beginner ,
Aug 21, 2023

I should have added that I upgraded from an RTX 2070, and the RX 7900 XT (black) appears to be about 3x faster at denoising.

Explorer ,
Sep 08, 2023
Thought I'd add my recent experience. I'm 81, on a budget, and just built a PC with an i5-12600, an ASUS B760 motherboard, and a used RTX 3060 (non-Ti). In my first test, Denoise AI took 10-12 seconds to process a 20 MP Canon R6 file. Very relieved, as I often batch process 200-300 pics in LR that are shot indoors at ISOs ranging from 1250 to 12000. Denoise AI promises to be immensely helpful - see a sample before-and-after pic here, from our school's science fair last spring. The photo was shot with the R6 at ISO 20000, and the "after" pic looks to my ancient eyes like it could have been shot at ISO 100. https://www.lwsphotos.com/Misc-Pix/Lightroom-noise-red-AI/
Community Expert ,
Sep 08, 2023

It is indeed extremely effective - but I always advise people not to run everything through Denoise. There is simply no point in that. Denoise is for the extreme cases where you really need to push it.

 

My own experience is that the breakpoint is around ISO 6400 or thereabouts. That's when noise is starting to become a problem looking for a solution - instead of a solution looking for a problem. Below that, which means 98% of normal circumstances, standard noise reduction is more than good enough to cope with it.

 

Even so, it has never seemed like a worthwhile goal to me to remove all noise. With Denoise you can do that, but the result will easily look plasticky and unreal. With a high-resolution sensor, and a good lens, you're also getting to that point where noise and detail merge, sort of. There is noise, but that noise also contains valid detail that you don't want to lose.

 

And for some situations, that extra grittiness actually adds to the image. I've been in this business long enough to remember Kodak Tri-X developed in Agfa Rodinal. That combo was a classic, loved by many photographers, precisely because it brought out and maximized grain, and made it extremely crisp and almost tangible.

Explorer ,
Sep 14, 2023

Update: I ran 223 20 MP Canon R6 CR3 raw photos through LR Denoise AI on the above-described PC with the ASUS RTX 3060 12GB (non-Ti). Denoise took 37:30 to process the 223 pics, or a little over 10 seconds per pic. I am extremely pleased, both by the speed and by the quality of the results. I set Denoise AI to 30 instead of the default of 50, which can make skin look a little unnatural in some (but not all) situations. To repeat, the pics look wonderful, which is awesome because I shoot a lot of moving subjects indoors and can't afford to be too prissy-pure about ISO. I come from a sports and journo photo background where the subject and story matter much more than Ansel Adams-level grain quality. In some situations grain is wonderful, as it adds to the desired mood. Just not in little kids' faces!
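For anyone checking the arithmetic on that batch:

# 223 images in 37:30 -> seconds per image
total_s = 37 * 60 + 30   # 2250 s
print(total_s / 223)     # ~10.09 s per image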

New Here ,
Oct 25, 2023

I also set it at 30 or 35, and it improves EVERY photo.  No plastic skin. I have a huge Mac Studio (M2 Ultra, 24 cores, 128GB memory, 60 GPU cores) with an Apple 5K display. My 50MB raw images are taking 12 to 19 seconds to denoise. I'm thinking that should get faster as they tweak the performance, because I can't really throw any more hardware at it, and some folks are getting better numbers. On my previous system, it took 2 to 3 minutes per image, which was not usable.  12 to 19 seconds is perfectly tolerable, but I hope LR learns to take better advantage of this hardware.

Community Expert ,
Oct 25, 2023
quote

My 50MB raw images are taking 12 to 19 seconds to denoise...some folks are getting better numbers. On my previous system, it took 2 to 3 minutes per image, which was not usable.

By @jgiamma

 

From seeing the various threads, there still seem to be a large number of users who have hardware that takes a minute or more per image. Some may be getting better numbers, but times like yours in the 10 to 20 second range are basically as good as it gets on Macs and PCs for now, and are at the top of the pyramid until they can make AI Denoise run faster. My less powerful M1 Pro laptop takes 20 to 70 seconds per image, which is fine for my purposes and budget, but of course I too hope Adobe can shorten that.

Explorer ,
Nov 03, 2023

George, I upvoted you because I hope and pray that when I am over 80 years old, I will still enjoy taking photos and participating in the community.

New Here ,
Dec 06, 2023

Hey, 

Yes, LrC supports AMD GPUs as well. I recently got my 7900 XTX. I can't compare the performance to a 4070 12GB, but I previously had a 3070 8GB.

 

101 DNG files exported to JPEG at 90% quality (EOS R5, 45 MP, 8192x5464); i9-13900K, 128 GB RAM; times in MM:SS:

GPU export, RTX 3070, RAM @ 3600 MHz: 02:42.00
CPU export, RTX 3070, RAM @ 3600 MHz: 04:52.07
GPU export, RX 7900 XTX, RAM @ 2400 MHz: 01:40.87
GPU export, RX 7900 XTX, RAM @ 3600 MHz: 01:15.50
GPU export, RX 7900 XTX, RAM @ 3600 MHz, ReBAR: 01:16.00

 

AI Denoise on 10 of the 45 MP EOS R5 DNG files took 1:50.77 (1 min 50 sec).

VRAM use was around 15-16 GB and power draw was around 400-430 W.
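To make those numbers easier to compare, here is a small sketch that converts the export times above to seconds and prints each configuration's speedup over the RTX 3070 GPU export:

# Export times in seconds, transcribed from the table above.
times = {
    "RTX 3070, GPU export":            2 * 60 + 42.00,
    "RTX 3070, CPU export":            4 * 60 + 52.07,
    "RX 7900 XTX, RAM @ 2400 MHz":     1 * 60 + 40.87,
    "RX 7900 XTX, RAM @ 3600 MHz":     1 * 60 + 15.50,
    "RX 7900 XTX, 3600 MHz + ReBAR":   1 * 60 + 16.00,
}

baseline = times["RTX 3070, GPU export"]
for name, t in times.items():
    print(f"{name}: {t:6.2f} s ({baseline / t:.2f}x vs RTX 3070 GPU export)")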

Participant ,
Mar 03, 2024

I found an interesting test summary:

dslr-forum.de: "KI entrauschen" (German for "AI denoising")

I think it will answer almost all of the "How is the performance with ..." questions 🙂.

Participant ,
Sep 05, 2024

Although this is a 1.5-year-old topic, I need to drop in some information that might give some insight to users upgrading, or thinking of upgrading, their graphics card.

I came up with a small, not-super-accurate test using two files (Canon 45 MP at ISO 6400, Nikon 24 MP at ISO 12800). Both are processed with AI Denoise (value 50). The timer starts when "OK" is clicked and stops when Lightroom Classic's progress bar disappears.

I listed each participant's main data (processor, memory amount, and graphics card), and from the results it became more or less obvious that the amount of memory has virtually no effect on AI Denoise processing. What I do not know: how many SATA devices these users have - if there are too many (or they are in the wrong ports), that will decrease the speed. I tested this on my older computer, which had 6 SATA devices (2 min 7 s process time); after removing most of them, leaving 2 SATA devices, the time dropped to 37 s. So it's better to have a couple of large hard drives than many small ones.

Anyway, a screenshot of the results is at the end of the message.

In short: on an AMD computer the 7900 XTX is pretty much on par with the 4090, just a bit slower. On an Intel computer it seems to lose some of that edge and sits between the 4070 and 4080. I only had one computer of each, so this is not verified in any way. Also, the Intel computer (with the 7900 XTX) contains many SATA devices, which might slow it down (the motherboard is a Z790).


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Community Expert ,
Sep 05, 2024

@Karo Holmberg Logitech 

Disk only matters insofar as writing the finished DNG to disk is included in the total processing time.

 

So naturally, with a spinning hard drive it will take slightly more time than with a fast NVMe.

 

This is a 60MP raw file saved to a spinning 18 TB drive. Green is initiate Denoise, red is complete:

[screenshot: denoise.png]

Participant ,
Sep 06, 2024

This is not completely the case.

If you occupy all of your motherboard's SATA and M.2 ports, its communications will slow down significantly. Here are the times I took from my old computer, which had 2x M.2 and 6x SATA:
- 45 MP: 2 min 7 s
- 24 MP: 2 min 7 s

Yes, the times were identical for both files because the bus couldn't handle the data anymore (probably reserving bandwidth for, or communicating with, all the SATA devices - I don't know exactly, as this is outside my expertise).

After removing all but the necessary SATA devices, the same test (1x M.2, 2x SATA):
- 45 MP: 37 s
- 24 MP: 24 s



--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
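Not from this thread, but one way to sanity-check whether storage (rather than the GPU) is throttling a run like this is to time a sequential write on the drive Lightroom saves the enhanced DNGs to. A rough sketch, with the file path as a placeholder:

# Rough sequential-write throughput test for the Denoise output drive.
import os
import time

path = "throughput_test.bin"       # placeholder: put this on the drive LR writes to
chunk = b"\0" * (8 * 1024 * 1024)  # 8 MiB per write
n_chunks = 128                     # ~1 GiB total

start = time.perf_counter()
with open(path, "wb") as f:
    for _ in range(n_chunks):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())  # ensure the data actually reaches the disk
elapsed = time.perf_counter() - start

mb = len(chunk) * n_chunks / (1024 * 1024)
print(f"{mb:.0f} MB in {elapsed:.1f} s -> {mb / elapsed:.0f} MB/s")
os.remove(path)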
Community Expert ,
Sep 05, 2024

Which M3 was tested at the bottom? The competitive 16.4-second time for the 45-megapixel file suggests it was probably one of the higher-end M3 models with more GPU cores. I would guess the more affordable M3 Macs would be more like 30-40 seconds for 45 MP.

 

My understanding of memory and AI Denoise is that if there is too little memory it will run slower than optimal, but if there is enough memory, adding more will not make it any faster, as you have found. Also, I think graphics memory affects AI Denoise more than system memory. AI Denoise really seems to be all about GPU power primarily, with a secondary factor being any AI-specific accelerators available (if supported). For example, in recent versions of Lightroom Classic, the Apple Neural Engine NPU shortens AI Denoise time a little more, but it’s still the GPU that does the heavy lifting. For Windows, I’m curious how much further the times might fall if Adobe enables support for the NPU in Copilot+ PCs.

Participant ,
Sep 06, 2024

I checked it in the original denoise thread in our group; it was an "M3 Max, 16 core, 128 GB". Hopefully that is enlightening - I don't know enough about Macs to know whether it is one of the higher-end models or not :3



--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
Community Expert ,
Sep 06, 2024

Thanks Karo, that’s what I was expecting given that time result.

 

To put that into context for Mac users, and for anyone else who's interested, the Apple processor tiers work like this, with their impact on AI Denoise:

 

Base (M1, M2, M3): 8 CPU cores; 8 to 10 GPU cores (slowest AI Denoise on Mac)

Pro (M1 Pro, M2 Pro, M3 Pro): 11 to 12 CPU cores; 14 to 18 GPU cores (OK AI Denoise speed)

Max (the Mac in Karo's office): 14 to 16 CPU cores; 30 to 40 GPU cores (fast AI Denoise)

Ultra: 24 CPU cores; 60 to 76 GPU cores (fastest possible AI Denoise on Mac)

 

As shown by Karo’s chart, the Max (and also Ultra) tier has enough GPU cores to achieve AI Denoise speeds competitive with good PC graphics cards.
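As a very rough rule of thumb only - assuming Denoise time scales inversely with GPU core count, which ignores differences in memory bandwidth and chip generation - you could ballpark the other tiers from the ~16 s, 30-core M3 Max result in Karo's chart:

# Crude ballpark: assumes Denoise time is inversely proportional
# to GPU core count, which is at best approximate.
def ballpark_s(measured_s, measured_cores, target_cores):
    return measured_s * measured_cores / target_cores

print(ballpark_s(16.4, 30, 10))  # base tier, 10 GPU cores: ~49 s
print(ballpark_s(16.4, 30, 18))  # Pro tier, 18 GPU cores: ~27 s
print(ballpark_s(16.4, 30, 76))  # Ultra tier, 76 GPU cores: ~6.5 s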

Participant ,
Sep 09, 2024
quote

... 

As shown by Karo’s chart, the Max (and also Ultra) tier has enough GPU cores to achieve AI Denoise speeds competitive with good PC graphics cards.


By @Conrad_C

As far as I know, that information is outdated: LrC AI Denoise has been using the "Neural Engine" built into the Apple Mx chips since 2024-05.

e.g., MacBook Pro M3 Max (14 CPU / 30 GPU cores, 36 GB RAM):

before 2024-05: 57 s

since 2024-05: 48 s

Several results are (as already mentioned above) on dslr-forum.de. Test conditions are in the start post.

Community Expert ,
Sep 09, 2024

Yes, I’m aware of Lightroom Classic using the Apple Neural Engine NPU now. I didn’t mention it because it remains true that the GPU does the bulk of the work, so on the Mac at least, the more powerful the GPU, the less the ANE improves the result.

 

I only have an M1 Pro (fewer GPU cores), so the Apple Neural Engine makes a larger difference in lowering my AI Denoise times, and of course I appreciate that.

 

Now that an NPU is a requirement for Windows Copilot+ PCs, it will be interesting to see how long it takes for Lightroom Classic to support that NPU for reducing AI Denoise times on Windows laptops.

Participant ,
Sep 10, 2024

This data was actually gathered to compare Windows computers, to give people some insight into which graphics card would be best for their budget. Macs were added just out of curiosity. As very few people would swap platforms just because of a GPU, there is no real-life need to compare Macs and PCs in this matter.


--
// Karo Holmberg, Loupedeck/Logitech (PO: Photoshop, Lightroom Classic, Illustrator, Capture One)
LEGEND ,
Oct 02, 2024

It's a useful metric for someone thinking of buying a computer and not knowing which platform to consider.

My M1 Mac mini runs denoise on 30MP 5D Mark IV files in about 11 seconds, reading from and writing to USB 3.0 hard drives.
