Update 6/23/2024: LR 13.2 broke the mechanism for disabling the use of the GPU by AI masking commands:
https://community.adobe.com/t5/lightroom-classic-ideas/p-allow-users-to-disable-the-gpu-for-ai-maski...
Update 10/14/2023: LR 13 (finally) provides a mechanism for disabling the use of the GPU for AI masking commands:
https://community.adobe.com/t5/lightroom-classic-ideas/p-allow-users-to-disable-the-gpu-for-ai-maski...
The AI masking commands should obey the setting of Preferences > Performance > Use Graphics Processor, allowing users to disable its use. That will let users on older, incompatible hardware continue to use AI masking (*), and it will greatly accelerate troubleshooting when users have incompatible GPUs or out-of-date drivers. In my quick testing on Windows, these commands take 1-2 seconds with a GPU and 5-10 seconds without (so still quite usable). (+)
Both the Core ML framework (Mac) and the ONNX Runtime (Windows) allow the client to specify whether the CPU or the GPU is used to execute models.
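As a rough illustration (a Python sketch, not Adobe's actual code -- the model file names and the choice of DirectML as the Windows GPU provider are my assumptions), both runtimes expose this as a per-session or per-model setting:

    import onnxruntime as ort
    import coremltools as ct

    # Hypothetical model file for illustration; LR's real models are internal.
    MODEL = "subject_mask.onnx"

    # ONNX Runtime (Windows): providers are tried in order, so listing the
    # CPU provider alone forces CPU execution, while listing a GPU provider
    # first prefers the GPU with CPU fallback.
    cpu_only = ort.InferenceSession(MODEL, providers=["CPUExecutionProvider"])
    prefer_gpu = ort.InferenceSession(
        MODEL, providers=["DmlExecutionProvider", "CPUExecutionProvider"])

    # Core ML (Mac, here via the coremltools bindings): compute_units either
    # restricts execution to the CPU or allows the GPU / Neural Engine too.
    cpu_model = ct.models.MLModel("subject_mask.mlmodel",
                                  compute_units=ct.ComputeUnit.CPU_ONLY)
    any_model = ct.models.MLModel("subject_mask.mlmodel",
                                  compute_units=ct.ComputeUnit.ALL)

So honoring the Use Graphics Processor preference would amount to choosing one configuration or the other when the AI models are loaded.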
Providing the option Use GPU For AI Masking would be exactly in line with the other existing GPU options:
- Use GPU For Display
- Use GPU For Image Processing
- Use GPU For Export
- Use GPU To Apply Dehaze And Clarity Adjustments While Generating Previews
(*) LR 12 no longer lets users with less than 2 GB of VRAM use People masking. Older Macs, e.g. Mac minis and MacBook Airs, have only 1.5 GB of VRAM and can't be upgraded. Intel has an explicit "legacy drivers" policy, under which it no longer fixes most bugs in drivers for its older graphics hardware. The other manufacturers probably have similar de facto policies.
People masking also fails for users with iMacs that have Radeon 5700 graphics processors. There appears to be a bug in the graphics driver that AMD hasn't fixed, or that AMD has fixed but Apple hasn't incorporated into macOS.
(+) It's easy to demonstrate that AI masking runs correctly but more slowly without a GPU. On an Intel Mac, boot into Safe Mode and run LR; Activity Monitor shows that no GPU is available. On Windows, run LR in a Parallels virtual machine, which won't use the host computer's GPU.
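In the same vein, you can confirm what ONNX Runtime itself sees in such an environment (a sketch, assuming the onnxruntime Python package is installed):

    import onnxruntime as ort

    # Lists the execution providers available on this machine. With a working
    # GPU this typically includes a GPU provider (e.g. DmlExecutionProvider on
    # Windows); in a CPU-only VM it reports only CPUExecutionProvider.
    print(ort.get_available_providers())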