Participant
August 13, 2023
Question

People detection GPU accelerated?

  • 2 replies
  • 288 views

I've been using the AI-based "People Detection" extensively; it's such a useful feature that makes adjusting photos much easier, especially for portrait editing.

However, I've noticed this feature does not seem to be powered by the GPU. I've tested the same feature on an M1 MacBook Air (8GB RAM) and on a Windows setup with a Ryzen 5 3600 CPU + 32GB RAM + RTX 4060 Ti GPU.

The latter configuration has far more GPU computing power than the M1 MacBook Air, yet detection takes around 3 seconds on the MacBook Air and around 5 seconds on the Windows box, roughly proportional to the horsepower of the respective CPUs. So I'm fairly confident that "People Detection" runs only on the CPU and ignores the more powerful GPU available.
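For anyone who wants to reproduce this comparison, here's a minimal stdlib-only Python timing harness (just a sketch; the `run_detection` callable is a hypothetical stand-in, since Lightroom exposes no public API for triggering the mask, so in practice you'd time it with a stopwatch):

```python
import statistics
import time

def time_operation(run_detection, runs=5):
    """Time a callable several times and return the median latency in seconds.

    Using the median over several runs smooths out one-off spikes
    (thermal throttling, background tasks) that a single measurement
    would pick up.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        run_detection()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)
```

Comparing medians from both machines for the same image is a fairer test than a single run on each.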

If done right, the GPU should provide an order-of-magnitude speedup for this feature, making the detection latency negligible. That would be a night-and-day difference in user experience, so please add GPU support for "People Detection"; a lot of people would appreciate your efforts, sincerely.

This topic has been closed for replies.

2 replies

johnrellis
Genius
August 13, 2023

I agree with Jao's comments. The AI models used by LR run particularly well on the M1/M2 processors; for example, see this crowd-sourced benchmark for LR Denoise:

https://www.lightroomqueen.com/community/threads/my-recent-gpu-experience-gigabyte-nvidia-rtx-4070-ti-12gb.47572/page-2#post-1315545 


The evidence that LR People masking uses the GPU includes:


1. Observe LR using Task Manager > Performance (Windows) and Activity Monitor > Window > GPU History (Mac) (other views in those apps also provide GPU usage information).


2. Start Windows in Safe Mode With Networking, which disables the GPU for use by apps. The AI masking commands then run 3-5x slower on the CPU than on a typical GPU.


3. There are many, many reports in this forum of People masking failing due to out-of-date buggy graphics drivers -- updating the drivers corrects the problems.


4. This Adobe help article details problems with People masking on GPUs with less than 2 GB graphics memory:

https://helpx.adobe.com/camera-raw/kb/select-people-mask-crash.html 


Community Expert
August 13, 2023

You're underestimating how good the GPU and machine learning cores are on Apple Silicon, even on a base M1. The people masks in the AI masking are definitely done on the GPU cores. Also note that you really cannot compare these machines just based on frequency. What matters is the number of cores, their efficiency (i.e., how specialized they are for the task), and the bandwidth they have to memory. The M1 will be faster in some tasks and slower in others.

victl (Author)
Participant
August 15, 2023

Thank you for pointing this out to me. After reading the benchmark provided by @johnrellis, I realize LrC probably does use the GPU after all.

But I think there is still room for improvement. Several seconds for a segmentation task on a 24-megapixel image sounds unreasonable to me. I know of segmentation AI models that can pixel-wise segment a 2-megapixel image within 0.02 s on an RTX 3060. If the talents at Adobe can optimize the people mask to return a result within 1 second, it would make a huge difference in user experience.
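To put rough numbers on that, here's a back-of-envelope estimate (assuming inference time scales roughly linearly with pixel count, which ignores differences in model architecture and memory bandwidth):

```python
def scaled_latency(ref_mp, ref_seconds, target_mp):
    """Naively scale inference latency by pixel count.

    ref_mp / ref_seconds: a known reference point (megapixels, seconds);
    target_mp: the image size we want an estimate for.
    """
    return ref_seconds * (target_mp / ref_mp)

# 2 MP in 0.02 s on an RTX 3060 -> naive estimate for a 24 MP image:
estimate = scaled_latency(ref_mp=2, ref_seconds=0.02, target_mp=24)
print(f"{estimate:.2f} s")  # prints "0.24 s"
```

Even this crude linear scaling suggests sub-second latency should be achievable on comparable hardware, so multi-second detection times leave a sizable gap.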