johnrellis
Genius
October 28, 2021
Open for Voting

P: Allow users to disable the GPU for AI masking

  • October 28, 2021
  • 24 replies
  • 5521 views

Update 6/23/2024: LR 13.2 broke the mechanism for disabling the use of the GPU by AI commands:

https://community.adobe.com/t5/lightroom-classic-ideas/p-allow-users-to-disable-the-gpu-for-ai-masking/idc-p/14698393#M22221

 

Update 10/14/2023: LR 13 (finally) provides a mechanism for disabling the use of the GPU for AI masking commands:

https://community.adobe.com/t5/lightroom-classic-ideas/p-allow-users-to-disable-the-gpu-for-ai-masking/idc-p/14158013#M20035

 

The AI masking commands should obey the setting of Preferences > Performance > Use Graphics Processor, allowing users to disable its use. That will let users on older, incompatible hardware continue to use AI masking (*), and it will greatly accelerate troubleshooting when users have incompatible GPUs or out-of-date drivers. In my quickie testing on Windows, these commands take 1-2 seconds with a GPU and 5-10 seconds without (so still quite usable). (+)

 

Both the CoreML library (Mac) and the ONNX Runtime (Windows) allow the client to specify whether the CPU or the GPU is used to execute models.
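To illustrate the point about the ONNX Runtime: its public APIs let the client choose the execution providers per session, with the CPU provider as a fallback. This is a sketch using the Python bindings (Adobe would use the C/C++ API, but the provider mechanism is the same); `CPUExecutionProvider` and `DmlExecutionProvider` (DirectML, the GPU provider on Windows) are real ONNX Runtime provider names, while `model.onnx` is a placeholder.

```python
def pick_providers(use_gpu: bool) -> list:
    """Return the execution-provider list to pass to an ONNX Runtime session."""
    providers = ["CPUExecutionProvider"]  # CPU fallback, always last
    if use_gpu:
        # DirectML is ONNX Runtime's GPU execution provider on Windows.
        providers.insert(0, "DmlExecutionProvider")
    return providers

# A client would then create the inference session with, e.g.:
#   import onnxruntime as ort
#   session = ort.InferenceSession("model.onnx", providers=pick_providers(use_gpu))
```

So honoring the Use Graphics Processor preference would amount to passing a CPU-only provider list when the preference is off.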

 

Providing the option Use GPU for AI Masking would be exactly in line with the other existing GPU options:

- Use GPU For Display
- Use GPU For Image Processing
- Use GPU For Export
- Use GPU To Apply Dehaze And Clarity Adjustments While Generating Previews

 

(*) LR 12 no longer lets users with less than 2 GB of VRAM use People masking. Older Macs, e.g. Mac Minis and MacBook Airs, have only 1.5 GB of VRAM and can't be upgraded. Intel has an explicit "legacy drivers" policy, under which it no longer fixes most bugs in drivers for its older graphics hardware. The other manufacturers probably have de facto policies like that as well.

 

People masking also fails for users with iMacs with Radeon 5700 graphics processors. There appears to be a bug in the graphics driver that AMD hasn't fixed, or has fixed but Apple hasn't yet included in Mac OS.

 

(+) It's easy to demonstrate that AI masking runs correctly but more slowly without a GPU. On an Intel Mac, boot into Safe Mode and run LR -- Activity Monitor shows that no GPU is available. On Windows, run LR in a Parallels virtual machine, which won't use the host computer's GPU.

24 replies

johnrellis
Genius
August 19, 2025

"Denoise takes a long time, almost 10 minutes. I tried disabling GPU acceleration etc, but Lightroom still uses the integrated GPU."

 

Adobe inexplicably refuses to have LR's AI commands obey the setting of Preferences > Performance > Use Graphics Processor. The mechanism described here works for some of LR's AI commands:

https://helpx.adobe.com/camera-raw/kb/acr-gpu-faq.html#lens-blur

 

but it doesn't work for Denoise (I retested just now) and some other of LR's AI commands.
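As the thread notes, the only switch Adobe has exposed is a marker file named DisableGPUInference.txt. A minimal sketch of creating it, where `settings_dir` is a placeholder for the OS-specific Camera Raw settings folder documented in the helpx article (the assumption here, based on the filename, is that the file's mere presence is the switch and its contents don't matter):

```python
from pathlib import Path

def disable_gpu_inference(settings_dir: str) -> Path:
    """Create the empty DisableGPUInference.txt marker file.

    settings_dir is a placeholder: the actual folder is OS-specific and
    documented in the helpx article linked above.
    """
    marker = Path(settings_dir) / "DisableGPUInference.txt"
    marker.touch(exist_ok=True)  # an empty file suffices
    return marker
```

Deleting the file would presumably re-enable GPU inference for the commands that still honor it.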

Participant
August 19, 2025

Is this still a problem? I have a desktop computer with a Ryzen 9 9950X 16-core processor and I am using the CPU's integrated graphics processor. So I have all the CPU power but no GPU power 🙂

This setup is fine for me for everything else except Lightroom. Denoise takes a long time, almost 10 minutes. I tried disabling GPU acceleration etc, but Lightroom still uses the integrated GPU. 

My old laptop denoises images faster with the CPU than my desktop with the iGPU. 

Participating Frequently
June 17, 2025

Here is my solution (many thanks to Noell Orridge from Adobe).

Adjust GPU Acceleration: In ACR Preferences, try unchecking Use GPU for image processing

 

Now it works perfectly. Here are my parameters:

Asus ExpertBook, Intel 255H with 140T iGPU

All drivers up to date (June 2025)

Test data: 45 MPixel, ISO 6400 with a Canon R5

Test performed three times, on five images each from different image series, with Bridge/Camera Raw AI Denoise

Battery operation: 100 to 120 seconds/image

Power connection: 72 to 76 seconds/image

Many thanks for this great tip.

Participating Frequently
June 16, 2025

Is there now perhaps another option to disable the GPU besides DisableGPUInference.txt?

johnrellis
Genius
June 23, 2024

Unfortunately, LR 13.2 broke the mechanism for disabling use of the GPU by LR's AI commands:

https://community.adobe.com/t5/lightroom-classic-discussions/help-instructions-for-disabling-use-of-gpu-by-ai-commands-don-t-work/m-p/14571912 

 

Adobe hasn't acknowledged this as a bug or corrected the Help article (it's either a bug or the Help article is wrong).

johnrellis
Genius
October 14, 2023

LR 13 (finally) provides a mechanism for disabling the use of the GPU for AI commands:

https://helpx.adobe.com/camera-raw/kb/acr-gpu-faq.html#lens-blur

 

Though the help article implies this will disable the use of the GPU for Denoise, Lens Blur, and Masking, my testing indicates it only works for AI masking, not Denoise or Lens Blur (tested on LR 13.0.1 / Mac OS 13.5.1 / Apple M2 Max).

Participant
May 28, 2023

Maybe Adobe wants to exclude those of us who have 3-year-old MacBooks, so we have to look elsewhere for editing our photos. Bye 👋

Adesculpture
Known Participant
February 27, 2023

I have a 2020 MacBook Pro 13". I bought the highest-spec processor and RAM. There was no option to extend the VRAM. At barely 3 years old, it does not meet the minimum spec. I think that is unacceptable.

Adesculpture
Known Participant
February 23, 2023

I have a 2020 MBP fully maxed out on CPU and RAM. I couldn't buy more VRAM; it was not an option. So the best I could buy 3 years ago is already not good enough. I suspect the problem can be resolved with some thought. I'm now worried that each update will lazily make my expensive computer redundant.

johnrellis
Genius
January 10, 2023

"a minor but significant frustration"

 

Unfortunately, it's not so minor for some, in two common instances:

 

- LR has decided that an older GPU is compatible with AI masking, but the manufacturer has stopped providing updates to its graphics drivers (e.g. Intel). So the user can't use AI masking at all.

 

- LR thinks the GPU is compatible with AI masking but then decides that there isn't enough memory for People masking. So the user can use other AI masking but not People masking. This is common with three- and four-year-old MacBook Airs that have 1.5 GB of VRAM.

 

If the user has a desktop computer, they can upgrade the GPU for a couple hundred dollars, but not if they have a laptop.