
Denoise speed is the reverse of what I expected...

Enthusiast, Dec 31, 2023


This came up because I need to do a presentation on some of the new tools in a week, and wanted to make sure the laptop was up to the challenge of doing Denoise and Gen Fill and Gen Expand and so on.

 

I found an old discussion from April where the answer kept being "get a faster CPU".  In my case, the faster CPU takes longer, and neither computer is even using the CPU for Denoise.

 

Desktop is an i9-12900, 64GB, NVidia 4060 Ti 8GB; the laptop is a 4-year-old Dell 7740 with an i9-9880, 64GB, and an NVidia Quadro RTX 3000 6GB.  Both systems are Windows 11 Pro running 23H2.  Both systems have a 25GB cache and everything lives on NVMe SSDs.

 

Same image on both systems, in the same LR catalog (copied the whole catalog from desktop to laptop.)  Image is from an Alpha 1 in crop sensor mode, so it's about 20 megapixels.  Small.

I ran LR Denoise on the image on the desktop.  It consistently takes 1:53 +/- about 2 seconds - close to 2 minutes.

I ran LR Denoise on the image on the laptop.  It consistently takes 15 seconds +/- about 2 seconds.

 

The results from both look the same, and while processing, both systems ACT the same.  On the desktop the GPU goes to 94-98% busy for the entire time and the CPU sits at around 3%.  On the laptop the GPU goes to 95-99% busy and the CPU sits at around 5-6% the entire time.
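In case anyone wants to watch the same thing on their own machine, here is a minimal sketch (my own illustration, not something from this thread) that polls nvidia-smi once a second and prints GPU utilization; it assumes an NVIDIA driver is installed and nvidia-smi is on the PATH, which applies to both of these GPUs:

```python
import subprocess
import time

# Poll nvidia-smi once per second and print GPU utilization and VRAM use.
# Start this in a terminal, then kick off Denoise in Lightroom and watch.
# Assumes an NVIDIA driver is installed and nvidia-smi is on the PATH.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # First line only, in case the system reports more than one GPU.
    util, used, total = [s.strip() for s in out.splitlines()[0].split(",")]
    print(f"GPU {util:>3}% busy, VRAM {used}/{total} MiB")
    time.sleep(1)
```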

 

On the desktop I turned OFF the GPU, restarted, and got a popup telling me things would be faster if I turned the GPU on, and I pressed CANCEL!  Ran Denoise on the desktop and the GPU STILL went to 95-98% busy.

 

Everything else in Lightroom (and Photoshop) that I've tried that takes long enough to measure is considerably faster on the desktop - Generative Fill doing the same thing on the same image is always 30-40% faster on the desktop.  LR Photo Merge is faster on the desktop.  LR Select Subject (mask) is 3.1 +/- 0.2 seconds on the desktop with the GPU OFF, and 5.3 +/- 0.2 seconds on the laptop.  With the GPU ON and Lightroom restarted, Select Subject is about 1.4 seconds, so significantly faster, as is my perception of most things when I do a gross comparison of desktop to laptop.

 

The one anomaly is Denoise speed.

 

TOPICS: Windows

Community Expert, Dec 31, 2023


"I found an old discussion from April where the answer kept being "get a faster CPU".  In my case, the faster CPU takes longer, and neither computer is even using the cpu for Denoise"

No idea what discussion you are referring to, but that advice is clearly wrong. Denoise is very GPU intensive, not CPU intensive. As you noticed.

 

-- Johan W. Elzenga

Community Expert, Dec 31, 2023


quote

I found an old discussion from April where the answer kept being "get a faster CPU".

By @DavePinMinn

 

If you have the link to that discussion, it would be interesting to review. Just about every other discussion about AI Denoise since it was released has talked about how critically important it is to have a recent, powerful GPU, and how unimportant the CPU is.

Enthusiast, Dec 31, 2023


I have no idea how to point y'all at the discussion, but it's "Lightroom Denoise is Remarkably slow" in THIS forum.  It may have been that one that got dozens of replies with "just like everybody else that's complaining about Denoise, you need a faster CPU", OR it may have been one of the other ones I found...

 

In general, if I can find a clue ANYWHERE, I'd rather do that than have to ask a question in here.  Not that y'all aren't great, but my questions are usually small things that are annoyances rather than disasters.

 

But, other than the side trip, does anybody have a useful idea why a 4-year-old, midrange laptop is running Denoise 8 TIMES faster than my desktop?  Is there something magical about an elderly Quadro RTX 3000 from 2019 that lets it do Denoise faster than an NVidia 4060?  That would be depressing...

Community Expert, Jan 01, 2024


Your desktop computer may not be using the GPU (typically due to a driver issue). You can check whether the computer is using 'Full Acceleration' by opening LrC Preferences > Performance tab. The text below the GPU name/type should read as shown in the attached screenshot (i.e., 'Your system automatically supports full acceleration').

 

[Screenshot: full acceleration.png]

LEGEND, Jan 01, 2024


"I have no idea how to point y'all at the discussion"

 

Just paste the link. 

 

Link to this thread: https://community.adobe.com/t5/lightroom-classic-discussions/denoise-speed-is-the-reverse-of-what-i-...

 

 

 

LEGEND, Jan 01, 2024


Is it this one?

 

https://community.adobe.com/t5/lightroom-classic-discussions/denoise-takes-several-minutes-to-comple...

 

(It's from April, and it has DJ Paige using the phrase "remarkably slow.")

 

I've been right through it. Nobody's blaming CPU for the problem - it's all about the GPU, as expected.

Enthusiast, Jan 01, 2024


As the original post says - the GPU is essentially 100% busy the entire time.  As it also is on the laptop.  The only difference is that the laptop appears to execute Denoise a whole lot faster than the far newer, far faster desktop.

I've tried several files on the laptop, including 1 full-sized Alpha 1 RAW, and even that 51 megapixel image only took 42 seconds.  On the desktop, the same image was 4:20, about six times as long...

LEGEND, Jan 01, 2024


So that is the thread, then?

 

Where is this happening? "the answer kept being "get a faster CPU"."

 

And isn't "GPU working hard" the preferred situation anyway? There has never been any ambiguity about the fact that AI functions like Denoise are extremely GPU intensive.

Community Expert, Jan 01, 2024


quote

As the original post says - the GPU is essentially 100% busy the entire time.  As it also is on the laptop.  The only difference is that the laptop appears to execute Denoise a whole lot faster than the far newer, far faster desktop.

I've tried several files on the laptop, including 1 full-sized Alpha 1 RAW, and even that 51 megapixel image only took 42 seconds.  On the desktop, the same image was 4:20, about six times as long...


By @DavePinMinn


So that proves that it's indeed the GPU that is doing most, if not all, of the work. That 'get a faster CPU' advice is something only you seem to have found in these forums. I certainly would have reacted if I had ever seen it, because it is nonsense.

 

So why is your laptop faster for Denoise than your desktop? I am no expert in GPUs, but when I entered your GPUs in Google I found this: "The Nvidia Quadro RTX 3000 for laptops is a professional high-end graphics card for big and powerful laptops and mobile workstations." and "The Nvidia GeForce RTX 4060 Ti is a decent mid range graphics card that excels at 1080p gaming, but can also pull off 1440p using DLSS 3. It offers an appropriate performance uplift over and above the vanilla RTX 4060, but it'll cost you $100 more and doesn't completely blow its lower-spec sibling out of the water."

 

That sounds to me like your laptop GPU may indeed be more powerful than your desktop GPU, despite the desktop being newer...

 

-- Johan W. Elzenga

Enthusiast, Jan 01, 2024


I'm not an expert on GPUs either, and I've had trouble even finding anything that compares the Quadro RTX 3000 to the 4060 Ti.  I found one comparison that said this:

The Nvidia Quadro RTX 3000 for laptops is a professional high-end graphics card for big and powerful laptops and mobile workstations. It is based on the same TU106 chip as the consumer GeForce RTX 2070 (mobile) but with reduced shaders, clock speeds and memory bandwidth.

 

But that was 4-5 YEARS ago, when the GeForce 2000 series was current.  This isn't a Quadro Max-Q or anything.

 

Is there ANY way the lower-mid-range Quadro RTX of 2018-2019 should be faster than a 4060 Ti at running anything, much less Lightroom Denoise?  I don't recall ever seeing anything from Adobe saying a generations-old Quadro would out-perform a current mid-range GPU, but here are the specs I found...

 

Quadro RTX 3000                                 4060 Ti
  • Bus Interface: PCIe 3.0 x16                 PCIe 4.0 (I presume x16)
  • Max Memory Size: 6144 MB                    8 GB GDDR6
  • Core Clock(s): 945 MHz                      2310 MHz
  • Memory Clock(s): 1750 MHz                   2250 MHz
  • DirectX: 12.0                               12 Ultimate
  • OpenGL: 4.6                                 4.6
  • Max TDP: 80 W                               165 W
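As a quick sanity check on those numbers, here is a back-of-the-envelope script (my own illustration, not from any Adobe doc; clocks and TDP ignore shader counts and architecture, so this is not a real performance model). On paper it points the opposite way from what I'm seeing:

```python
# Ratio of the spec-sheet numbers listed above (4060 Ti vs Quadro RTX 3000).
# Clocks and TDP ignore shader counts and architecture, so this is only a
# rough sanity check, not a performance model.
quadro = {"core_mhz": 945, "mem_mhz": 1750, "tdp_w": 80}
ti4060 = {"core_mhz": 2310, "mem_mhz": 2250, "tdp_w": 165}

for key in quadro:
    print(f"{key}: 4060 Ti is {ti4060[key] / quadro[key]:.2f}x the Quadro RTX 3000")
# core_mhz: 4060 Ti is 2.44x the Quadro RTX 3000
# mem_mhz: 4060 Ti is 1.29x the Quadro RTX 3000
# tdp_w: 4060 Ti is 2.06x the Quadro RTX 3000
```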

 

Are there a lot of Adobe users buying old generations of Quadro GPUs for their laptops?

 

And is there any chance that switching to the Studio drivers for this will make a difference?

 

 

Enthusiast, Jan 03, 2024


Thanks for the replies.  Eventually, they forced me to think and pushed me in the right direction...

OK, I THINK I have a fix... 

The 4060 has both "game" drivers and "studio" drivers.  When I installed it I looked at the differences and everything I read said there was NO advantage to the studio drivers, and X or Y or Z might be slower.  So, I put the "game ready" drivers on.

Yesterday I downloaded the newest set of Studio drivers, which it appears have been updated a couple times in a very short period, and installed them using a "clean install"...

I say I MIGHT have a fix because the installation required a reboot, which I DON'T think had any impact, since I'd rebooted the desktop several times while testing and Denoise never got any faster - it always took just under 2 minutes.

Took the same image and ran Denoise again, and the time is in the 10-14 second range depending on whether I wait for the image to show up in Lightroom (10) or until the progress bar, which has been at 100% for several seconds, finally goes away (14)...  Either way, it's a lot faster than almost 2 minutes.  A FULL-SIZED A1 image (51 megapixels) took 28 seconds as opposed to 4 minutes and 20 seconds, OR 40+ seconds on the laptop.

Looking back with my 20/20 hindsight, instead of initially re-installing the "game ready" drivers, I should have switched to the Studio drivers, but didn't think of it...
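For anyone repeating this, here is one way to confirm which driver actually ended up installed after the clean install (a sketch of my own, assuming nvidia-smi is on the PATH; it reports only the version number, not the Studio vs. Game Ready branch, so match it against the version on NVIDIA's download page):

```python
import subprocess

# Print the GPU name and installed driver version. nvidia-smi shows the
# version number only, not whether it is the Studio or Game Ready branch,
# so compare against the version listed on NVIDIA's download page.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # e.g. "NVIDIA GeForce RTX 4060 Ti, 546.33" (illustrative)
```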

Community Expert, Jan 03, 2024


Good detective work. That was the next suggestion I was going to make, but you were ahead of the game:

 

I was starting to think that if a powerful new GPU like a 4060 is mysteriously slower than an older GPU, then maybe the explanation was not that the old one is actually faster, but that the new one was being held back in some way. It has been discussed in these forums before that wrong or outdated drivers are a common cause of poor GPU performance, and now it looks like your problem was related to that. There have been several threads here where people were advised to fix a problem by installing the non-game versions of GPU drivers.

Community Expert, Jan 03, 2024


quote

A FULL-SIZED A1 image (51 megapixels) took 28 seconds


By @DavePinMinn

 

You're clearly on the right track, but the 4060 Ti is still a bit slower than it should be. I have that GPU, and it does 60 MP from an a7R V in around 24 seconds. But we're probably not using the same (Studio) driver; I haven't updated mine in a couple of months.

Enthusiast, Jan 03, 2024


I THINK, if I remember correctly, one of the issues with the Studio drivers was that they can make programs like Word and Outlook behave oddly - like typing a letter and not having it appear or the cursor move until you type again.  I haven't seen that yet, but I haven't been doing any Word documents today.

I have a topic over in the NVidia forum to see if there's any way to adjust or tune the GPU, or if there's anything else I can do to make sure the thing is working optimally with the current drivers.

As far as numbers for Denoise, at this point I'm just happy it's not 4 minutes any more, but your system doing it in 24 seconds is great.

Enthusiast, Jan 05, 2024


I spoke too soon...

Yesterday I did some fiddling and ran different things on different images.  And things acted normal.

This morning I took one of the small RAW images, "A", and did some other fiddling, like putting an adaptive preset on it.  Then ran Denoise.  And instead of being around 12 seconds it was 27.  Tried again, and still 27.  Switched to a DIFFERENT image, "B", ran Denoise and it was 12.

Put an adaptive preset on that image "B" and the time jumped to 27 seconds.

Took that off and did a Select Subject and Select Sky mask and it jumped to 38 seconds.

Purged cache, no difference.

Shut down Lightroom and restarted it.  No difference.

Switched to a different catalog and tried an image there "C".  Worked fine.

Restarted the original catalog and ran the same image "A" as originally - still slow.  "B" also still slow. 

Reinstalled the NVidia drivers.  No difference.

Emptied the temp folder.  No change.

 

It appears that once an image gets slow at running Denoise, it's going to STAY slow until a reboot.  It's not back to 2 minutes, but it's bizarre.
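To keep these comparisons straight across restarts and reboots, a tiny stopwatch-and-log helper like this can help (purely illustrative, not from the thread; the denoise_times.csv file name is made up):

```python
import csv
import time
from datetime import datetime
from pathlib import Path

LOG = Path("denoise_times.csv")  # hypothetical log file name

def log_run(image_label: str) -> None:
    # Manual stopwatch: press Enter when you start Denoise in Lightroom,
    # and again when the result appears; the elapsed time is appended to
    # a CSV so runs can be compared across restarts and reboots.
    input(f"[{image_label}] press Enter when you START Denoise... ")
    t0 = time.perf_counter()
    input(f"[{image_label}] press Enter when the result APPEARS... ")
    elapsed = time.perf_counter() - t0
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "image", "seconds"])
        writer.writerow([datetime.now().isoformat(timespec="seconds"),
                         image_label, f"{elapsed:.1f}"])
    print(f"[{image_label}] {elapsed:.1f}s logged to {LOG}")

if __name__ == "__main__":
    log_run("A")  # e.g. the small RAW that went from 12s to 27s
```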

 
