12 Minute render in Denoise AI in 12.3

Participating Frequently
April 20, 2023
Answered

  • 7 replies
  • 10392 views

I was interested to try out the new Denoise AI in 12.3. It seemed to work pretty well in some preliminary tests, though one image produced a ton of green artifacts in what started out as a black sky (with a lot of noise).

The major issue for me, though, is that the module reports it will take 12 minutes or more to render the DNG file. I didn't time it exactly, but it was certainly more than 10 minutes. I've watched a demo/tutorial, and that video showed expected times in the 10 to 15 SECOND range. Nothing else I use in LR has major delays.

My computer is 5 years old and won't support Windows 11, but nothing else takes an outrageous amount of time. Exporting DNG files for panoramas might take a few seconds, but nothing like this. Is this a processor issue on my end, or are others seeing render times like this?

This topic has been closed for replies.

7 replies

New Participant
May 12, 2023

Might Adobe publish a list of tested and approved graphics cards that work well with these new LrC AI functions? I happen to be on a Win 10 PC with an AMD Ryzen 7 3700X 8-core processor, 32 GB of RAM, and an Nvidia GeForce GTX 1660 graphics card with 6 GB of RAM. I have recently updated to the latest Studio driver for the card. I am still having occasional issues with Denoise and some of the other AI functions. I often get a runtime error which shuts down the GPU and usually requires a restart of LrC. Also, the Performance tab under Preferences reports I have "Limited graphics acceleration." What is required to get Full Graphics Acceleration and get the most performance from LrC?

Participating Frequently
May 18, 2023

Hello,

 

Not sure if this is the right thread, but all other threads that might be related have been closed:

https://community.adobe.com/t5/camera-raw-discussions/denoise-ai-takes-10-minutes-to-render/td-p/13750965/page/2

https://community.adobe.com/t5/lightroom-classic-discussions/ai-denoise-takes-4-8-minutes/td-p/13746341 
However, I couldn't find any information about intermittent problems that only appear sometimes.

 

I also have intermittent problems with the AI Denoise. The performance drops after denoising some images, from about 10-20 seconds to several minutes. When it slows down, it also starts flickering, as described in some other threads. This doesn't happen as long as it is working fast.

Restarting Lightroom solves the issue until it appears again... I have already read through some threads, and I think the AI Denoise feature is still buggy.
I really hope an update will fix these problems soon.


I run Windows 11 on a Schenker XMG Core notebook with an Nvidia GTX 1660 Ti and the latest GPU drivers installed.
Lightroom also says that it only partially supports my GPU (at the moment). GPU usage is set to Auto.

 

Best regards,
Alex

 

Known Participant
April 21, 2023

I have a 5+ year old PC with 32 GB of RAM and an Intel i3 chip. Topaz DeNoise and Photo AI process the converted JPEG in under 30 seconds. So I assume the delay is also exacerbated (just a guess on my part) by converting to and processing a DNG rather than a JPEG.

Keith Reeder
Participating Frequently
April 22, 2023

Which camera?

 

What GPU?

 

(We need context for your timings to have relevance - RAM and CPU aren't really the issue here.)

Known Participant
April 25, 2023

Sorry. Z9, no GPU, just the onboard Intel graphics chip.

Community Expert
April 20, 2023

As pointed out, it is the GPU that matters, and whether it has dedicated AI/ML cores. On my M1 Max it indeed takes 10-15 seconds for a Z7 raw file, but the GPU cores and the dedicated AI/ML cores on those machines are crazy fast. You're not going to see that with anything more than a few years old, unfortunately. Also, in my testing the quality of the denoise is light years (sorry for the pun) ahead of Topaz DeNoise, or even Photo AI on raw files, with far fewer artifacts. Quite a bit slower, but far better. Adobe's Enhance includes the demosaic step in the calculation, which is one reason why it is slower but much better quality.

D Fosse
Community Expert
April 21, 2023

I have to revise those numbers above slightly (repeated with a stopwatch): 60 MP in 33 seconds.

 

When I get into work later I'll try with the older Quadro P2200 GPU on that machine, I expect several minutes there on the same files.

 

EDIT: Indeed. 3 minutes and 15 seconds, on the very same file.

 

Conclusion: you may not need a very fancy GPU, but you need a new one.

D Fosse
Community Expert
April 21, 2023

Checked some older 36 MP files, and the timings are reduced in proportion. So it seems there is little overhead, and timings are more or less directly proportional to pixel count.
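
As a rough sanity check of that proportionality (my own arithmetic from the figures above, not an official number):

# Sketch: assume Denoise time scales linearly with megapixel count.
seconds_per_mp = 33 / 60                # ~0.55 s per megapixel from the 60 MP / 33 s timing
predicted_36_mp = 36 * seconds_per_mp   # ~20 seconds
print(round(predicted_36_mp), "s")      # prints: 20 s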

Ian Lyons
Community Expert
April 20, 2023

There are GPUs, and then there are GPUs with cores dedicated to machine learning / AI tasks: for example, the Tensor cores on more recent Nvidia GPU cards and the Neural Engine on Apple silicon Macs. The ML/AI cores make a huge difference to the time taken for any of the AI-based tasks in LrC / LrD and ACR.

 

What does the above mean?

 

Take a typical first-generation M1 Mac from late 2020: it has 8 GPU cores and 16 Neural Engine cores, and the time to Denoise a Canon EOS R5 file is around 50 seconds. Now take the most powerful of the M1-generation Macs from mid 2022: the M1 Ultra has 48 GPU cores and 32 Neural Engine cores and can process the same EOS R5 file in 15 seconds.
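
Comparing those two data points directly (my own arithmetic, not Adobe's): the M1 Ultra has six times the GPU cores of the base M1 but is only a bit over three times faster, so the speedup is clearly not a straight multiple of core count.

gpu_core_ratio = 48 / 8                     # 6x the GPU cores on the M1 Ultra
speedup = 50 / 15                           # ~3.3x faster on the same EOS R5 file
print(gpu_core_ratio, round(speedup, 1))    # prints: 6.0 3.3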

 

 

Ian Lyons
Community Expert
April 22, 2023

I need to correct myself, as the information provided regarding Denoise and the Neural Engine is not correct. Currently, the Apple Neural Engine found in M1/M2 Apple silicon Macs is not used by Denoise. It's expected that performance will be even better once Denoise does use the Apple Neural Engine.

 

There is no change to the above comments regarding the use of Tensor cores in Nvidia cards; that is, Tensor cores are used by Denoise.

Ian Lyons
Community Expert
April 25, 2023

"Where do you have this information from? The original article says: 'Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA's TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.'"

Apparently, an issue on Apple's side means that the Neural Engine is not used by Adobe Denoise. When the issue is addressed, Adobe Denoise is ready to take advantage of the Neural Engine.

D Fosse
Community Expert
April 20, 2023

Yes, the GPU is pretty crucial here. I've run it a few times on two machines with different GPU, but otherwise almost identical hardware.

 

One machine has an RTX 3060 with 12 GB, and it takes about 15 - 20 seconds on 45-60 MP raw files.

 

The other has an older Quadro P2200 with 5 GB, and there it takes 2-3 minutes on the same files.

 

Everything works without problems, but the speed difference is dramatic.
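
Putting rough numbers on "dramatic" (my own arithmetic from the timings above):

# Speedup range implied by the two machines' timings on the same files.
rtx_3060_s = (15, 20)                       # RTX 3060 (12 GB): 15-20 seconds
quadro_p2200_s = (120, 180)                 # Quadro P2200 (5 GB): 2-3 minutes
low = quadro_p2200_s[0] / rtx_3060_s[1]     # 6x slower at best
high = quadro_p2200_s[1] / rtx_3060_s[0]    # 12x slower at worst
print(f"roughly {low:.0f}x to {high:.0f}x slower")   # prints: roughly 6x to 12x slower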

Keith Reeder
Participating Frequently
April 20, 2023

Yep, what DJ said.

 

For context, my (Win 10) PC takes about a minute to render Canon R7 cRAW files, and this is a machine with a far-from-cutting-edge Radeon RX 580 (8 GB), so it's probably safe to assume that your GPU is markedly less powerful than that.

 

As a further point of reference, the same machine will spit out a tiff of the same file from On1 Photo Raw 2023 in about 10-15 seconds, and the results are significantly better.

 

(Why Adobe insists on creating a DNG file instead of writing a TIFF or a JPEG baffles me. It means that I can no longer go from raw file to finalised output within Lr in one run, which adds precisely nothing of value to the experience.)

dj_paige (Correct answer)
Brainiac
April 20, 2023

I think it depends on the speed and amount of memory of your GPU. The AI noise reduction task is the most computationally intensive tool in LrC. A 5 year old computer most likely does not have the GPU power to do this in 10-15 seconds.

Participating Frequently
April 20, 2023

Thanks to all. This is the answer I was expecting, but I had hoped for a different result.

My computer runs an Intel i5 @ 2.5 GHz with 20 GB of RAM. Nothing else seems to take a long time to complete inside LR, though it is noticeably slower than what I see others experience in editing demonstrations. Topaz AI is slow, sometimes taking 45 seconds on my R6 raw files, but 13 minutes, with not much else affected, just seemed wrong. I've been considering buying a new machine. Five years does seem to be the life span of a computer, even with no problems encountered.