
AI denoise GPU vs CPU time issues

Community Beginner, Dec 06, 2023

Hi,

I've read the threads about the time it takes to process AI Denoise and I'm curious. I'm not a computer guy, so maybe someone can help me understand. I have images from a Canon 5D Mark IV, and the estimated times on random photos range from 40 minutes to 860 minutes. My machine has an Intel i7-3770K CPU @ 3.50 GHz and an Nvidia Quadro K4000 GPU (11 GB memory).

I grabbed a photo with a time estimate of 38 minutes (it actually took 20 minutes), opened Task Manager, and watched my component usage: GPU usage was hovering around 2%, but the CPU shot up to 100% (Lightroom was using 98%). All the threads I've read point to the GPU as the culprit for these long processing times, yet Task Manager showed barely a blip on my GPU while my CPU was maxed out. I would greatly appreciate any thoughts on this. My motherboard is ten years old, and I'm not sure about just plugging in a more modern GPU when I don't fully understand what's happening.
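In case it's useful, here is a rough way to log what the CPU and GPU are doing for the whole run instead of watching Task Manager by eye. This is only a sketch: it assumes Python 3 with the psutil package installed and NVIDIA's nvidia-smi tool on the PATH, neither of which comes from this thread.

# Rough utilization logger: print CPU and NVIDIA GPU load about once a second.
# Assumes Python 3 with psutil installed and nvidia-smi available on the PATH.
import subprocess
import psutil

def gpu_utilization_percent():
    # Ask nvidia-smi for GPU core utilization, returned as e.g. "2"
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

while True:
    cpu = psutil.cpu_percent(interval=1)   # averaged over the 1-second wait
    gpu = gpu_utilization_percent()
    print(f"CPU {cpu:5.1f}%   GPU {gpu}%")

Run it in a console, start Denoise, and the printed lines show whether the load really sits on the CPU.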

 Thanks in advance for any help.

TOPICS: Windows

Community Expert, Dec 06, 2023

I can see that happening if the GPU is not supported for Lightroom Classic/Camera Raw graphics acceleration. If the GPU can't be used, Lightroom has to push all the work to the CPU, overwhelming it. I looked up the release date of the Nvidia Quadro K4000 GPU, and it looks like it was released a decade ago. The CPU appears to have been released 11 years ago.

 

I'll have to leave it to the more expert PC users to advise on putting a newer GPU on that motherboard, but given its age, it's likely that to make the upgrade worthwhile, other components would also need to be updated so the rest of the system stays balanced and doesn't bottleneck a current high-performance GPU.

 

As an example (although they're Macs), I have a 4-core Core i5 from 2018, and it denoises a 60-megapixel image in about three minutes. On a 2021 Mac with an Apple Silicon processor, it's just over a minute per 60 MP image. The graphics hardware in the latest high-end PCs and Macs does AI Denoise in under 20 seconds per image.
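To put those per-image times into batch terms, here is a quick back-of-the-envelope comparison. The 500-image batch size is purely illustrative, and the per-image seconds are just the approximate figures mentioned above.

# Back-of-the-envelope batch estimates from the approximate per-image times above.
# The 500-image batch size is only an illustration.
images = 500
per_image_seconds = {
    "2018 4-core Core i5 (~180 s/image)": 180,
    "2021 Apple Silicon (~70 s/image)": 70,
    "current high-end GPU (~20 s/image)": 20,
}
for machine, seconds in per_image_seconds.items():
    hours = images * seconds / 3600
    print(f"{machine}: roughly {hours:.1f} hours for {images} images")

The gap between roughly 25 hours and under 3 hours for the same batch is why hardware age matters so much here.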

 

AI Denoise is one of those features we have only now because, until about three years ago, hardware that could run it quickly wasn't widely available. So almost anything more than a few years old is terrible at AI Denoise.

Community Beginner, Dec 07, 2023

Thank you for your reply, Conrad. I appreciate it.

Explorer, Aug 01, 2024

Please note that AI Denoise does not use the CPU, only the GPU.

The GPU likely needs its thermal pad replaced: the coolant has evaporated, hence the overheating. A build-up of dust likely doesn't help either. Also check that the memory hasn't been fried. Take it to a service agent.

Engaged, Dec 06, 2023
Community Beginner, Dec 07, 2023

Thank you for your help. I'm grateful for your time.

LEGEND, Dec 07, 2023

Your GPU only has 3 GB, not 11 (ideally you want at least 8 GB), and is otherwise pretty under-specced:

[Screenshot: Quadro K4000 specification table]

(https://www.techpowerup.com/gpu-specs/quadro-k4000.c1841)

 

My AMD RX 580 is by no means a particularly impressive GPU, but it takes c. 45 seconds for a Canon R5 file.

 

Look at the difference in specs. Even if you aren't "a computer guy", more is better - especially around RAM and cores - where Lr is concerned:

[Screenshot: Radeon RX 580 specification table]

(https://www.techpowerup.com/gpu-specs/radeon-rx-580.c2938)

Community Beginner, Dec 07, 2023

Hi Keith, thank you for pointing that out. I guess I'm confused by what my Task Manager shows: 3.0 GB dedicated GPU memory and 8.0 GB shared GPU memory, for a total of 11.0 GB. I'm probably misinterpreting that information because, as you said, the spec sheet for my card says 3 GB. Thanks for taking the time to help.
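For anyone else puzzled by the same numbers: the 8 GB of "shared GPU memory" is ordinary system RAM that Windows allows the GPU to borrow, so the card's own VRAM is only the 3 GB of dedicated memory, and that is the figure that matters here. A rough way to confirm what the card itself reports, assuming Python 3 and nvidia-smi on the PATH (the printed line is just an example):

# Print only the card's own (dedicated) VRAM. The "shared GPU memory" shown in
# Task Manager is system RAM borrowed by Windows, not memory on the card.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True)
print(out.stdout.strip())   # e.g. "Quadro K4000, 3072 MiB" -- dedicated VRAM only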

LEGEND, Dec 07, 2023
Community Beginner, Dec 07, 2023

Hi,

Thanks for the link. I have the most current driver installed (474.64). What confuses me is that during processing I'm only seeing 2% of my GPU memory being used. But after some help, I've looked at the Performance tab and see I can only tick the "Use GPU for display" radio button; "Use GPU for image processing" is greyed out. So while my card meets the minimum specs (3 GB memory, DirectX 12, etc.) and Lightroom sees it, it apparently doesn't like what it sees 🙂. Thank you for taking the time to help.
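Full "Use GPU for image processing" acceleration has stricter requirements than basic display acceleration, so a greyed-out option usually means the card falls short of something Lightroom checks for. One rough way to see what the card actually reports to Windows, and compare it against Adobe's current requirements, is to dump a DirectX report. This is only a sketch: it assumes Windows with Python 3, and that your dxdiag version includes a "Feature Levels" line in its text output.

# Dump a DirectX diagnostic report and print each adapter's name and the
# Direct3D feature levels it reports (Windows only; dxdiag takes a few seconds).
import os
import subprocess
import tempfile

report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
subprocess.run(["dxdiag", "/t", report], check=True)

with open(report, errors="ignore") as f:
    for line in f:
        if "Card name" in line or "Feature Levels" in line:
            print(line.strip())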

Explorer, Aug 01, 2024

This is a big problem, and it is your GPU, as you said: it's overheating. What has likely happened is that the thermal pad inside the GPU has evaporated, so you've got no coolant. You need to get the thermal pad replaced, check that the memory isn't fried, and remove any build-up of dust. Take it to a service agent.

 

Forgive me if you've seen this solution posted multiple times, but there are so many posts on this subject that I thought it might help to post it again so it can be found more easily. Just trying to help. Sorry.
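If you want to test the overheating theory before sending the card in, a minimal temperature watch you can leave running while Denoise works might look like the sketch below. It assumes an NVIDIA card with nvidia-smi available on the PATH; nothing here is specific to Lightroom.

# Watch GPU temperature and load while Denoise runs, to check the overheating
# theory. Assumes an NVIDIA card with nvidia-smi available on the PATH.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    temp_c, util = out.stdout.strip().split(", ")
    print(f"GPU temperature: {temp_c} C   load: {util}%")
    time.sleep(2)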
