Hi,
I've read the threads about the time it takes to process AI Denoise, and I'm curious. I'm not a computer guy, so maybe someone can help me understand. I have images from a Canon 5D Mark IV, and the estimated times on random photos range from 40 minutes to 860 minutes. My machine has an Intel i7-3770K CPU @ 3.50 GHz and an NVIDIA Quadro K4000 GPU (11 GB memory).
I grabbed a photo with a time estimate of 38 minutes (it took 20 minutes), but I opened Task Manager and watched my component usage: my GPU usage was hovering around 2%, while my CPU shot up to 100% (Lightroom was using 98%). All the threads I've read point to the GPU as the culprit for these long processing times, but my Task Manager showed barely a blip on my GPU while my CPU was maxed out.
I would greatly appreciate any thoughts on this. My motherboard is ten years old, and I'm not sure about just plugging in a more modern GPU when I don't fully understand what's happening.
Thanks in advance for any help.
What is your GPU driver version?
and see:
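If you're not sure where to find it, one quick way (assuming the standard NVIDIA driver package, which normally installs the nvidia-smi command-line tool) is to open a Command Prompt and run:

nvidia-smi --query-gpu=driver_version --format=csv,noheader

The header of plain nvidia-smi output shows the driver version as well.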
I can see that happening if the GPU is not supported for Lightroom Classic/Camera Raw graphics acceleration. If the GPU can't be used, Lightroom has to push all the work to the CPU, overwhelming it. I looked up the release date of the NVIDIA Quadro K4000, and it looks like it was released a decade ago; the CPU appears to be 11 years old.
I'll have to leave it to the more expert PC users to advise on putting a newer GPU on that motherboard, but given the age, yes, it's likely that for that to be worth doing, other components would also need to be updated to keep the entire system balanced and avoid bottlenecking a current high-performance GPU.
As an example (although they're Macs), I have a 4-core Core i5 from 2018, and it denoises a 60-megapixel image in three minutes. On a 2021 Mac with an Apple Silicon processor, it's just over a minute per 60 MP image. The graphics hardware in the latest high-end PCs and Macs does AI Denoise in under 20 seconds per image.
AI Denoise is one of those features we have only now because, until a few years ago, hardware that could run it fast was not widely available. So almost anything more than a few years old is terrible at AI Denoise.
Thank you for your reply, Conrad. I appreciate it.
Please note that AI Denoise does not use the CPU, only the GPU.
Your GPU likely needs its thermal pad replaced. Why? The thermal compound has dried out, hence the overheating. There's also likely a build-up of dust, which doesn't help. Also check that the memory hasn't been fried. Take it to a service agent.
You may want to look at https://helpx.adobe.com/lightroom-cc/kb/lightroom-gpu-faq.html#sys-req-gpu-image-processing
Thank you for your help; I'm grateful for your time.
Your GPU only has 3 GB, not 11 (ideally you want at least 8 GB), and is otherwise pretty under-specced:
(https://www.techpowerup.com/gpu-specs/quadro-k4000.c1841)
My AMD RX 580 is by no means a particularly impressive GPU, but it takes c. 45 seconds for a Canon R5 file.
Look at the difference in specs. Even if you aren't "a computer guy", more is better where Lr is concerned, especially RAM and cores.
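Incidentally, the 11 GB figure Task Manager reports is the 3 GB of dedicated VRAM on the card plus 8 GB of "shared GPU memory", which is just system RAM that Windows lets the GPU borrow when it runs out; only the dedicated memory counts for these requirements. If you want to confirm what the card itself reports, the same nvidia-smi tool mentioned earlier in the thread can query it from a Command Prompt:

nvidia-smi --query-gpu=name,memory.total --format=csv

That prints the card name and its actual on-board memory, e.g. something like "Quadro K4000, 3072 MiB".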
Copy link to clipboard
Copied
Hi Keith, thank you for pointing that out. I guess I was confused by what my Task Manager shows: 3.0 GB dedicated GPU memory and 8.0 GB shared GPU memory, for a total GPU memory of 11.0 GB. I was misinterpreting that information because, as you said, the spec sheet for my card says 3 GB. Thanks for taking the time to help.
Hi,
Thanks for the link. I have the most current driver installed (474.64). What confuses me is that during processing I'm only seeing 2% of my GPU memory being used. But after some help, I've looked at the Performance tab and see I can only tick the "Use GPU for display" option; "Use GPU for image processing" is greyed out. So while my card meets the minimum specs (3 GB memory, DirectX 12, etc.) and Lightroom sees it, it apparently doesn't like what it sees 🙂 Thank you for taking the time to help.
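One possible explanation, and this is an assumption rather than something Adobe states about this exact card: the full "Use GPU for image processing" option generally needs more than baseline DirectX 12 support; it depends on the GPU's DirectX 12 feature level, and Kepler-era cards like the Quadro K4000 only expose feature level 11_0 even though they nominally "support" DirectX 12. You can see what your card exposes by exporting a DxDiag report from a Command Prompt:

dxdiag /t dxdiag.txt

Then open dxdiag.txt and look for the "Feature Levels:" line under Display Devices; newer cards list 12_0 or 12_1 there, which is the kind of capability the greyed-out option is checking for.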
This is a big problem, and it is your GPU, as you said: it's overheating. What has likely happened is that the thermal pad inside your GPU has dried out, so it's no longer carrying heat away. You need to get the thermal pad replaced, check that the memory isn't fried, and remove any build-up of dust. Take it to a service agent.
Forgive me if you've seen this solution posted multiple times, but there are so many posts on this subject that I thought posting it again might make it easier to find. Just trying to help. Sorry.