
P: Allow users to disable the GPU for AI masking

LEGEND, Oct 28, 2021

Update 10/14/2023: LR 13 (finally) provides a mechanism for disabling the use of the GPU for AI masking commands:

https://community.adobe.com/t5/lightroom-classic-ideas/p-allow-users-to-disable-the-gpu-for-ai-maski...

 

The AI masking commands should obey the setting of Preferences > Performance > Use Graphics Processor, allowing users to disable its use. That would let users on older, incompatible hardware continue to use AI masking (*), and it would greatly accelerate troubleshooting when users have incompatible GPUs or out-of-date drivers. In my quick testing on Windows, these commands take 1-2 seconds with a GPU and 5-10 seconds without (so still quite usable). (+)

 

Both the CoreML library (Mac) and the ONNX Runtime (Windows) allow the client to specify whether the CPU or the GPU is used to execute models.

 

Providing the option Use GPU for AI Masking would be exactly in line with the other existing GPU options:

- Use GPU For Display
- Use GPU For Image Processing
- Use GPU For Export
- Use GPU To Apply Dehaze And Clarity Adjustments While Generating Previews

 

(*) LR 12 no longer lets users with less than 2 GB of VRAM use People masking. Older Macs, e.g. Mac Minis and MacBook Airs, have only 1.5 GB of VRAM and can't be upgraded. Intel has an explicit "legacy drivers" policy, under which it no longer fixes most bugs in drivers for its older graphics hardware. The other manufacturers probably have de facto policies as well.

 

People masking also fails for users with iMacs with Radeon 5700 graphics processors. There appears to be a bug in the graphics driver that AMD hasn't fixed, or has fixed but Apple hasn't yet shipped in macOS.

 

(+) It's easy to demonstrate that AI masking runs correctly, just more slowly, without a GPU. On an Intel Mac, boot into Safe Mode and run LR -- Activity Monitor shows there is no GPU available. On Windows, run LR in a Parallels virtual machine, which doesn't use the GPU of the host computer.

Idea - No status
TOPICS: macOS, Windows
Views: 2.9K

19 Comments
LEGEND, Oct 29, 2021

This would also allow people with older graphics chipsets that lack newer drivers to use Select Sky and Subject.

LEGEND, Oct 13, 2022

In another thread, Adobe employee @simonsaith wrote, "To use or not to use the GPU for ML inference for a particular GPU/driver version is not within Lr's control. The platform OS will do the check at runtime."

 

This is incorrect. LR uses the CoreML library on Mac and the ONNX Runtime on Windows, and both allow the client to control whether the GPU is used to execute a model.

 

Mac: MLModelConfiguration allows the client to limit execution to the CPU via "computeUnits".  

 

Win: onnxruntime.InferenceSession() allows the client to specify the "providers" option listing CPUExecutionProvider only.
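To make the Windows half concrete, here's a minimal sketch of how a client restricts inference to the CPU. The helper name and the model filename are my own invention; the provider names and the providers= parameter of InferenceSession are real ONNX Runtime identifiers:

```python
def providers_for(use_gpu: bool):
    """Build an ONNX Runtime provider list from a user preference.

    With use_gpu=False, only CPUExecutionProvider is listed, so the
    runtime never touches the graphics driver. DmlExecutionProvider
    is ONNX Runtime's DirectML (GPU) provider on Windows.
    """
    if use_gpu:
        # GPU first, CPU as fallback
        return ["DmlExecutionProvider", "CPUExecutionProvider"]
    return ["CPUExecutionProvider"]

# Usage sketch (model filename is hypothetical):
# import onnxruntime
# session = onnxruntime.InferenceSession("subject_mask.onnx",
#                                        providers=providers_for(use_gpu=False))
```

Wiring a preference checkbox to this list is the entire mechanism; no hardware detection beyond what the runtime already does is required.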

 

 

Explorer, Oct 23, 2022

I agree entirely. Lots of people with otherwise perfectly serviceable systems don't have advanced graphics cards (which are bought more commonly for gaming than anything else). We can all appreciate the Lightroom functions that do make use of GPU acceleration, but they should still work, even at a slower pace, without it.

LEGEND, Nov 19, 2022

To show how technically simple it is to control whether AI models execute on the CPU or GPU, I wrote a short Python script using the same Core ML library used by LR. It loads two instances of the open-source MobileNetV2 image-classification model, one allowed to use the GPU and the other not, and then alternates execution of the two models. The key lines of the script are:

import coremltools as ct

modelGPU = ct.models.MLModel("MobileNetV2.mlmodel",
    compute_units=ct.ComputeUnit.ALL)

modelCPU = ct.models.MLModel("MobileNetV2.mlmodel",
    compute_units=ct.ComputeUnit.CPU_ONLY)

 

A prediction using the GPU takes about 9 msecs, while a prediction using the CPU takes 39 msecs. When the GPU model is running, Activity Monitor shows % CPU at about 50% and % GPU at 50%. But when the CPU model is running, % CPU is about 175% and % GPU is 0%.

 

You can download the entire script and instructions from here:

https://www.dropbox.com/s/0edo6jm2s1sgrc6/coreml-gpu-cpu.2022.11.14.zip?dl=0 

 

That .zip includes a screen recording of the script in action.

 

 

 

LEGEND, Jan 02, 2023

People with three-year-old MacBooks may have only 1.5 GB of VRAM, and People masking is completely unavailable to them:

https://www.lightroomqueen.com/community/threads/people-mask-unavailable.46901/ 

 

I guess Adobe would rather these people have no People masking at all than have it work slowly.

Community Expert, Jan 02, 2023

Agreed. This doesn't affect me, but lots of people on the forum with supposedly subcritical hardware that is still very new could be given a choice to risk crashing the program. Even a popup with "are you sure? your hardware might not support it and you might experience crashes" would be fine.

LEGEND, Jan 02, 2023

"...could be given a choice to risk crashing the program. Even if it gives a popup with "are you sure, your hardware might not support it and you might experience crashes" would be fine"

 

I'm not aware of any LR crashes when AI masking is used on computers without a GPU -- are you? On many older computers, LR's own self-tests automatically disable the use of the GPU for AI masking, and it still works correctly. I do a large amount of testing, including with AI masking, on a Windows 10 virtual machine that doesn't have access to a GPU, and I've never experienced a crash.

 

The CoreML library (Mac) and ONNX Runtime (Windows) used to execute the machine-learning models underlying AI masking are supposed to compute the same results on the CPU and GPU.
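"Same results" here means agreement within floating-point tolerance -- different backends can differ in the last bits of the mantissa. A small illustrative check (the mask values below are made up; real ones would come from a CPU-only and a GPU-enabled model instance):

```python
import numpy as np

# Simulated outputs of the same mask model on two backends; real values
# would come from two InferenceSession / MLModel instances.
mask_cpu = np.array([0.91, 0.12, 0.87], dtype=np.float32)
mask_gpu = mask_cpu + np.float32(1e-6)  # tiny backend-dependent drift

# Bit-for-bit equality is too strict; agreement within a small
# tolerance is the practical definition of "same results".
assert np.allclose(mask_cpu, mask_gpu, atol=1e-4)
```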

Community Expert, Jan 02, 2023

Ah, I misunderstood what you meant -- I thought you meant we should be able to force GPU use. Yes, I completely agree the AI functions should be available when they can only run on the CPU too. There is no technical reason for them not to, only performance.

Community Beginner, Jan 03, 2023

I am working on a three-year-old MacBook Pro and can't fully use the masking options -- my GPU is an Intel Iris Plus Graphics 655 with 1536 MB, and it's deemed inadequate. So frustrating.

Community Beginner, Jan 09, 2023

I am working on a 2018 MacBook Pro ... I have 1.5 GB of VRAM and was using Select People without any crashes ... Now it's grayed out ... this is very frustrating ...

LEGEND, Jan 09, 2023

"I am working on a 2018 Macbook Pro ... I have 1.5GB of VRAM and was using Select People without any crashes .... Now it's grayed out ... this is very frustrating ..."

 

Agreed. Adobe has decided that your computer's graphics processor is incompatible with LR 12.1, and they stubbornly refuse to provide an option to disable its use in AI masking. Very bizarre.

 

Your options are to roll back to LR 12.0.1 and never install future upgrades, or buy a new computer.

Community Beginner, Jan 09, 2023

At this juncture, buying a new computer is not in the cards ... I've been working around this to a small degree, using the brush and various custom selections ... Select Subject can work ... sometimes ...

--
Charles Levin

Advocate, Jan 09, 2023

I support this request.

I think we should be given a clear "Use CPU for AI" option in the Performance panel.

Said option would be checked automatically if the GPU is not supported for AI (and one could also check it if GPU performance is poor).

 

 

Community Beginner, Jan 10, 2023

Is anybody from Adobe going to respond to this? It's quite reasonable and would end a minor but significant frustration. 

LEGEND, Jan 10, 2023

"a minor but significant frustration"

 

Unfortunately, it's not so minor for some, in two common instances:

 

- LR has decided that an older GPU is compatible with AI masking, but the manufacturer has stopped providing updates to its graphics drivers (e.g. Intel). So the user can't use AI masking at all.

 

- LR thinks the GPU is compatible with AI masking but then decides that there isn't enough memory for People masking. So the user can use other AI masking but not People masking. This is common with three- and four-year-old MacBook Airs that have 1.5 GB of VRAM.

 

If the user has a desktop computer, they can upgrade the GPU for a couple hundred dollars, but not if they have a laptop.

Explorer, Feb 23, 2023

I have a 2020 MBP fully maxed out on CPU and RAM. I couldn't buy more VRAM; it was not an option. So the best I could buy three years ago is already not good enough. I suspect the problem could be resolved with some thought. I'm now worried that each update will lazily make my expensive computer redundant.

Explorer, Feb 27, 2023

I have a 2020 MacBook Pro 13". I bought the highest spec processor and RAM. There was no option to extend the VRAM. At barely 3 years old it does not meet the minimum spec. I think that is unacceptable.

New Here, May 28, 2023

Maybe Adobe wants to exclude those of us who have three-year-old MacBooks, so we have to look elsewhere for editing our photos. Bye 👋

LEGEND, Oct 14, 2023

LR 13 (finally) provides a mechanism for disabling the use of the GPU for AI commands:

https://helpx.adobe.com/camera-raw/kb/acr-gpu-faq.html#lens-blur

 

Though the help article implies this will disable the use of the GPU for Denoise, Lens Blur, and Masking, my testing indicates it works only for AI masking, not Denoise or Lens Blur (tested on LR 13.0.1 / macOS 13.5.1 / Apple M2 Max).
