Participating Frequently
May 27, 2020
Question

Lightroom doesn't use my GPU?

  • 11 replies
  • 1512 views

Hello,

I've started to use the "Enhance Details" feature more often with my Fuji X files to increase detail (or really to fix the X-Trans rendering) of the images for a client.

I noticed that my PC fans started running like crazy, so I installed NZXT CAM, a hardware monitoring app, and saw that my CPU was doing all the work alone, reaching 90°C to 98°C in the process. The GPU wasn't doing anything.

I thought that was weird, and saw that indeed the GPU option in LR preferences is set to display only; the other option, to use it for processing, is greyed out. My graphics card is a GTX 770. It isn't exactly top of the line, I know, but it should be able to give LR a good boost.

I use another processing-intensive tool, the Topaz Denoise AI plugin, which does use my GPU, and I can see on the monitoring app that the work is divided and the CPU doesn't get so overloaded.
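For anyone who wants a second check besides NZXT CAM, here is a minimal Python sketch that asks the NVIDIA driver's `nvidia-smi` tool for per-GPU utilization while an export or Enhance Details run is in progress. Assumptions: `nvidia-smi` ships with the NVIDIA driver and is on the PATH; the function simply returns None when it isn't found.

```python
import shutil
import subprocess

def gpu_utilization():
    """Return a list with the utilization percentage of each NVIDIA GPU,
    as reported by nvidia-smi, or None if the tool is not available."""
    if shutil.which("nvidia-smi") is None:
        return None  # no NVIDIA driver/tool on this machine
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # One line per GPU, each a bare integer percentage.
    return [int(line) for line in out.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    util = gpu_utilization()
    if util is None:
        print("nvidia-smi not found; is the NVIDIA driver installed?")
    else:
        print("GPU utilization (%):", util)
```

Running this in a loop while Lightroom is busy should show whether the GPU is actually being touched at all.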

So is my GPU not really supported or is this a bug? If it is, any fix for this?

Using Windows 10 64bit, Intel i7-4770 3.40GHz CPU, 16GB RAM, Nvidia GTX 770 2GB VRAM
Nvidia drivers up-to-date.

This topic has been closed for replies.

11 replies

Known Participant
June 1, 2020

I exported about 785 pictures; here is a screenshot of the Resource Monitor and Task Manager, hope this explains it.

 

GoldingD
Legend
May 27, 2020

If your rig is a laptop, the following might raise power usage high enough that the fans kick in, so it might be something to skip; a desktop should not have issues. Here are two tips for performance.

 

First up, for NVIDIA GPUs:

https://www.winhelp.info/boost-lightroom-performance-on-systems-with-nvidia-graphics-chip.html

Note that the option to Use Graphics Processor may behave better after the above.

 

Next up, see tip 8 in the following:

https://au.pcmag.com/windows-10-1/5180/11-tips-to-speed-up-windows-10

 

 

GoldingD
Legend
May 27, 2020

Is this a desktop computer? The GPU implies it is, but your mention of fans ramping up sounds more like laptop behavior.

 

Have you checked for dust bunnies in your computer, around the CPU, GPU, and RAM? Have you cleaned lint from the fan inlets/outlets? Have you cleaned lint/dust at the PSU inlet?

 

GoldingD
Legend
May 27, 2020
Displays: 1) 1920x1200

 

At less than 4K, the option to use Graphics Processor may accomplish nothing and may cause odd issues.

 

And, yes your GPU is very old and limited as far as Lightroom Classic is concerned. I suspect it only presents you with Basic acceleration, not Advanced, as such probably accomplishing nothing for Enhance Details.

 

Participating Frequently
June 4, 2020

Only now have I been able to get back to this thread, sorry.

 

Yes, I want to stay away from Windows updates until there are reports of all the problems being fixed.

I updated the Nvidia drivers and turned the settings to "maximum performance", but I still can't use the GPU in LR.

What I find odd is that my card is listed as a supported GPU for LR, and while it may not be top of the line, it's still more than capable of giving a huge boost to any graphics application. A good example is the Topaz Denoise AI plugin that I use: very intensive processing, but my GPU gives it a major boost. So if LR really can't use my GPU, I just see it as bad programming and support.

Regarding cleaning the dust, I've done that, and the fans don't go all that crazy on other intensive apps or games.

GoldingD
Legend
May 27, 2020

Enhance Details is a very, very resource-intensive process. I'm not at all surprised it would spike CPU usage.

 

DdeGannes
Community Expert
May 27, 2020

I certainly agree with you here David. I downloaded some sample raw files from the new X-T4, and doing Enhance Details on one of the files took a full 8 seconds to render and display. (I have not been able to log in to the site for over 6 hours.)

 

Regards, Denis: iMac 27” mid-2015, macOS 11.7.10 Big Sur; 2TB SSD, 24 GB Ram, GPU 2 GB; LrC 12.5,; Lr 6.5, PS 24.7,; ACR 15.5,; (also Laptop Win 11, ver 24H2, LrC 15.0.1, PS 27.0; ) Camera Oly OM-D E-M1.
GoldingD
Legend
May 27, 2020

Warning re the MS update: apparently the new Windows 10 update is causing issues, so you may want to hold off on it for a couple of weeks.

 

see: https://www.bleepingcomputer.com/news/microsoft/microsoft-is-investigating-ten-windows-10-2004-known-issues/

 

 

 

GoldingD
Legend
May 27, 2020

Your GPU driver needs to be updated; a new one came out today. Mind you, the one you have is not too old, and the new one may not fix anything for Lightroom Classic. I suspect it was released to support the brand-new Windows 10 OS update that came out today.

 

You do not currently get this through MS Updates.

 

You should have an NVIDIA utility app called GeForce Experience; use it to check your GPU driver for an update, and select a Custom install so as to force a clean install.

 

Ian Lyons
Community Expert
May 27, 2020

Please read https://helpx.adobe.com/lightroom-classic/kb/lightroom-gpu-faq.html#Troubleshoot. Also, ensure that DirectX 12 is installed and fully functional on your computer.

Known Participant
May 27, 2020

This does not apply to Windows 10, because W10 already comes with DX12 as standard.
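For completeness, here is a hedged sketch of how one could confirm the DirectX version from a script on Windows, using the built-in `dxdiag` diagnostic tool. Assumptions: `dxdiag /t` finishes writing the plain-text report before the process exits (the commonly observed behavior, though it can take a while); the function returns None on non-Windows systems.

```python
import os
import platform
import subprocess
import tempfile

def directx_version():
    """On Windows, write a dxdiag report and extract the 'DirectX Version'
    line from it. Returns None on other operating systems."""
    if platform.system() != "Windows":
        return None
    report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
    # dxdiag ships with Windows; /t writes a plain-text report to the file.
    subprocess.run(["dxdiag", "/t", report], check=True)
    with open(report, errors="ignore") as fh:
        for line in fh:
            if "DirectX Version" in line:
                return line.split(":", 1)[1].strip()
    return None
```

On a Windows 10 machine this should report "DirectX 12", matching the point above that W10 already ships with it.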

 

 

DdeGannes
Community Expert
May 27, 2020

You may wish to check and see if your NVIDIA card driver has been updated for Windows 10.

See the screen capture of the specification for your card on the manufacturer's web site. Also have a look at the Display Support section.

 

Participating Frequently
May 27, 2020

Thanks DdeGannes

You mean to install the latest Windows 10 updates? 

I've actually been avoiding it, since there are numerous reports of issues with the latest ones. Could it be from this?

Here a screenshot of the update history:

 

And a screenshot of the update uninstall list:


 

If this is the problem, man... I'll have to choose between risking frying my CPU or risking messing up the OS with possible data loss...

Known Participant
May 27, 2020

Dear ricardoc87675909

If you do enable GPU acceleration in the Preferences, it only supports some functions during editing, and very minor ones at that, such as the graduated filter, spot removal, etc. Even that is speculation.

 

Adobe fails to hear consumer feedback asking for the GPU to be used when rendering these images out.

 

If I export about 100 images from RAW after post-editing, LR almost fries the CPU: utilization is pushed to 100% while the GPU sits idle.

I am not sure which part of "GPU" Adobe's engineers don't understand. Why can't they just offload the processing to the GPU when it has the computing power to do it? We use high-end cards like the Quadro P2000.

 

They just keep taking the money and not listening to any consumer feedback.

Participating Frequently
May 27, 2020

Really? How long has this been happening?
Weird, since Photoshop uses my GPU fine... It seems that it's only LR and ACR that can't.

Known Participant
May 27, 2020

The way PS uses the GPU is nothing compared to other applications such as Premiere Pro or After Effects.

But yes, for LR, if the rendering (basically exporting) could also be shifted to the GPU, performance would increase dramatically. Sadly, no one pays attention to the forums, or to our issues.

 

And even worse, I am in India, so we get absolutely no regard as customers.