Inspiring
December 31, 2023
Question

Denoise speed is the reverse of what I expected...

  • December 31, 2023
  • 2 replies
  • 2100 views

This came up because I need to do a presentation on some of the new tools in a week, and wanted to make sure the laptop was up to the challenge of doing Denoise and Gen Fill and Gen Expand and so on.

 

I found an old discussion from April where the answer kept being "get a faster CPU".  In my case, the faster CPU takes longer, and neither computer is even using the CPU for Denoise.

 

Desktop is an i9-12900, 64GB, NVIDIA RTX 4060 Ti 8GB; the laptop is a 4-year-old Dell 7740 with an i9-9880, 64GB, and a Quadro RTX 3000 6GB.  Both systems are Windows 11 Pro running 23H2.  Both have a 25GB cache, and everything lives on NVMe SSDs.

 

Same image on both systems, in the same LR catalog (copied the whole catalog from desktop to laptop.)  Image is from an Alpha 1 in crop sensor mode, so it's about 20 megapixels.  Small.

I ran LR Denoise on the image on the desktop.  It consistently takes 1:53 +/- about 2 seconds - close to 2 minutes.

I ran LR Denoise on the image on the laptop.  It consistently takes 15 seconds +/- about 2 seconds.

 

The results from both look the same, and while processing, both systems ACT the same.  On the desktop the GPU goes to 94-98% busy for the entire time and the CPU sits at around 3%.  On the laptop the GPU goes to 95-99% busy and the CPU sits at around 5-6% the entire time.
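
(For anyone who wants numbers instead of eyeballing Task Manager: here's a rough sketch of mine, nothing Adobe provides, that logs GPU load once a second through NVIDIA's nvidia-smi tool while Denoise runs. It assumes nvidia-smi is on the PATH, which it normally is once the NVIDIA driver is installed.)

# Minimal GPU-load logger (my own sketch, not an Adobe or NVIDIA tool).
# Assumes the NVIDIA driver's nvidia-smi is on the PATH.
import subprocess
import time

def log_gpu_load(seconds: int = 120) -> None:
    """Print GPU core and memory utilization once per second."""
    for _ in range(seconds):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.memory",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        for line in out.splitlines():          # one line per NVIDIA GPU
            gpu_pct, mem_pct = (v.strip() for v in line.split(","))
            print(f"GPU {gpu_pct}% | GPU memory {mem_pct}%")
        time.sleep(1)

if __name__ == "__main__":
    log_gpu_load()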

 

On the desktop I turned OFF the GPU, restarted, and got a popup telling me things would be faster if I turned the GPU on, and I pressed CANCEL!  Ran Denoise on the desktop and the GPU STILL went to 95-98% busy.

 

Everything else in Lightroom (and Photoshop) that I've tried that takes long enough to measure is considerably faster on the desktop.  Generative Fill doing the same thing on the same image is always 30-40% faster on the desktop.  LR Photo Merge is faster on the desktop.  LR Select Subject (mask) is 3.1 +/- .2 seconds on the desktop with the GPU OFF, and 5.3 +/- .2 seconds on the laptop.  With the GPU ON and Lightroom restarted, Select Subject is about 1.4 seconds, so significantly faster, as is my perception of most things when I do a gross comparison of desktop to laptop.
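
(If anyone wants to repeat these timings, a trivial console stopwatch like the sketch below, again just something of mine, is enough to get repeatable +/- numbers: hit Enter when the operation starts and again when it finishes.)

# Tiny console stopwatch for hand-timing Lightroom operations.
import time

def stopwatch() -> float:
    input("Press Enter when the operation starts...")
    start = time.perf_counter()
    input("Press Enter when it finishes...")
    return time.perf_counter() - start

if __name__ == "__main__":
    while True:                         # Ctrl+C to quit
        print(f"Elapsed: {stopwatch():.1f} s")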

 

The one anomaly is Denoise speed.

 


2 replies

Conrad_C
Community Expert
January 1, 2024

I found an old discussion from April where the answer kept being "get a faster CPU".

By @DavePinMinn

 

If you have the link to that discussion, it would be interesting to review. Just about every other discussion about AI Denoise since it was released has talked about how critically important it is to have a recent, powerful GPU, and how unimportant the CPU is.

Inspiring
January 1, 2024

I have no idea how to point y'all at the discussion, but it's "Lightroom Denoise is Remarkably slow" in THIS forum.  It may have been that one that got dozens of replies with "just like everybody else that's complaining about Denoise, you need a faster CPU", OR it may have been one of the other ones I found........

 

In general, if I can find a clue ANYWHERE, I'd rather do that than have to ask a question in here.  Not that y'all aren't great, but my questions are usually small things that are annoyances rather than disasters.

 

But, other than the side trip, does anybody have a useful idea why a 4-year-old, midrange laptop is running Denoise 8 TIMES faster than my desktop?  Is there something magical about an elderly Quadro RTX 3000 from 2019 that it can do Denoise faster than an NVIDIA 4060 Ti?  That would be depressing...

Inspiring
January 3, 2024

I'm not an expert on GPUs either, and I've had trouble even finding anything that compares the Quadro RTX 3000 to the 4060 Ti.  I found one comparison that said this:

The Nvidia Quadro RTX 3000 for laptops is a professional high-end graphics card for big and powerful laptops and mobile workstations. It is based on the same TU106 chip as the consumer GeForce RTX 2070 (mobile) but with reduced shaders, clock speeds and memory bandwidth.

 

But this was 4-5 YEARS ago, when the GeForce RTX 20 series was current.  This isn't a Quadro Max-Q or anything.

 

Is there ANY way the lower-mid range Quadro RTX of 2018-2019 should be faster than a 4060 Ti at running anything, much less Lightroom Denoise?  I don't recall ever seeing anything from Adobe that a generations old Quadro would out-perform a current mid-range GPU, but here are the specs I found...

 

Quadro RTX 3000                              4060 Ti
  • Bus interface: PCIe 3.0 x16              PCIe 4.0 (I presume x16)
  • Max memory size: 6144 MB                 8 GB GDDR6
  • Core clock: 945 MHz                      2310 MHz
  • Memory clock: 1750 MHz                   2250 MHz
  • DirectX: 12.0                            12 Ultimate
  • OpenGL: 4.6                              4.6
  • Max TDP: 80 W                            165 W
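
(If it's useful to anyone comparing cards, the numbers the driver itself reports can be pulled with a short script like this sketch of mine; it assumes nvidia-smi is on the PATH, and the query fields are standard nvidia-smi ones.)

# Dump the installed NVIDIA GPU's reported specs for comparison with a spec sheet.
# Assumes nvidia-smi (installed with the NVIDIA driver) is on the PATH.
import subprocess

QUERY = ("name,memory.total,clocks.max.graphics,clocks.max.memory,"
         "pcie.link.gen.max,pcie.link.width.max")

def gpu_specs() -> list[dict]:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    fields = QUERY.split(",")
    return [dict(zip(fields, (v.strip() for v in line.split(","))))
            for line in out.splitlines()]

if __name__ == "__main__":
    for gpu in gpu_specs():
        for field, value in gpu.items():
            print(f"{field:>22}: {value}")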

 

Are there a lot of Adobe users buying older-generation Quadro GPUs for their laptops?

 

And is there any chance that switching to the Studio drivers for this will make a difference?

 

 


Thanks for the replies.  Eventually, they forced me to think and pushed me in the right direction...

OK, I THINK I have a fix... 

The 4060 has both "game" drivers and "studio" drivers.  When I installed it I looked at the differences and everything I read said there was NO advantage to the studio drivers, and X or Y or Z might be slower.  So, I put the "game ready" drivers on.

Yesterday I downloaded the newest set of Studio drivers, which it appears have been updated a couple times in a very short period, and installed them using a "clean install"...
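
(In case it helps anyone doing the same swap: a quick sanity check is to note the driver version nvidia-smi reports before and after the clean install, with something like the sketch below, which is mine. It won't say Game Ready vs. Studio, but the version number can be matched against NVIDIA's download page.)

# Print the NVIDIA driver version the GPU reports - handy for confirming
# a "clean install" actually switched driver versions.
# Assumes nvidia-smi is on the PATH (it ships with the NVIDIA driver).
import subprocess

def driver_version() -> str:
    return subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

if __name__ == "__main__":
    print("NVIDIA driver version:", driver_version())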

I say I MIGHT have a fix because the installation required a reboot, which I DON'T think had any impact, since I'd rebooted the desktop several times while testing and Denoise never got any faster - always took just under 2 minutes.

Took the same image and ran Denoise again, and the time is in the 10-14 second range depending on whether I wait for the image to show up in Lightroom (10) or until the progress bar, which has been at 100% for several seconds, finally goes away (14)...  Either way, it's a lot faster than almost 2 minutes.  A FULL-SIZED A1 image (51 megapixels) took 28 seconds, as opposed to 4 minutes and 20 seconds before the driver change, OR 40+ seconds on the laptop.
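
(Putting rough numbers on it, using only the timings above and the 12-second midpoint for the 10-14 second range:)

# Quick speedup arithmetic on the Denoise timings quoted above.
def speedup(before_s: float, after_s: float) -> float:
    return before_s / after_s

# 20 MP crop-mode image: ~1:53 (113 s) before the Studio driver, ~12 s after.
print(f"20 MP image: {speedup(113, 12):.1f}x faster")                # ~9.4x
# Full 51 MP image on the desktop: 4:20 (260 s) before vs 28 s after.
print(f"51 MP image: {speedup(260, 28):.1f}x faster")                # ~9.3x
# Desktop vs laptop on the 20 MP image before the fix: 113 s vs 15 s.
print(f"Laptop advantage before the fix: {speedup(113, 15):.1f}x")   # ~7.5x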

Looking back with my 20/20 hindsight, instead of initially re-installing the "game ready" drivers, I should have switched to the Studio drivers, but didn't think of it...

JohanElzenga
Community Expert
December 31, 2023

"I found an old discussion from April where the answer kept being "get a faster CPU".  In my case, the faster CPU takes longer, and neither computer is even using the cpu for Denoise"

No idea what discussions you are referring to, but that advice is clearly wrong. Denoise is very GPU-intensive, not CPU-intensive, as you noticed.

 

-- Johan W. Elzenga