Participant
November 11, 2025
Answered

Choosing the Right CPU/GPU to speed up AI functions in LRC

  • November 11, 2025
  • 2 replies
  • 408 views

Hi all,

I'm using a four-year-old laptop with an 11th-generation Intel CPU for LRC. The AI-driven features, especially noise reduction (Denoise), are incredibly slow. I've decided to buy a new, future-proof laptop, but which CPU/GPU should I get? In the past LRC didn't use the GPU, but it now relies on it for the AI-powered features. At the same time, some of the latest processors, like the Intel Core Ultra 7 258V, offer built-in AI acceleration (an NPU) without a discrete GPU.
I'm curious about your experiences: which is the best solution? Does anyone use an Intel Core Ultra 7 258V with LRC? Would it be enough for the AI-supported functions? Or should I go for a strong CPU/GPU combination like the Asus ProArt 16 with an AMD Ryzen AI 9 HX 370 + RTX 5080? Or should I switch to a MacBook Pro, which is also an option for me?

 

Regards,

Jani

Correct answer: Sameer K

2 replies

Sameer K
Community Manager
November 11, 2025

Hey, @János36437550ltsr. Welcome to the Lightroom Classic Community. I'll help you figure this out. I won't specify what brand or platform you should pick. 

 

I'd like to point out a few things: Lightroom's Denoise, and most other AI models, currently don't utilize the NPU to accelerate AI workflows on Windows devices. As @dj_paige shared, a separate (discrete) GPU is the well-accepted, well-tested option among our power users on Windows. For the Mac path, you may want to wait for our experts to chime in.
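
If you want to see for yourself which accelerators an AI runtime on your machine can actually use, here is a rough, purely illustrative sketch. It uses ONNX Runtime only as a stand-in and says nothing about how Lightroom itself is implemented:

```python
# Illustrative only: list the execution providers an ONNX Runtime install can see.
# Requires one of: pip install onnxruntime / onnxruntime-directml / onnxruntime-gpu
import onnxruntime as ort

providers = ort.get_available_providers()
print("Available execution providers:", providers)

# Typical results on Windows:
#   'CUDAExecutionProvider'  -> NVIDIA discrete GPU (onnxruntime-gpu build)
#   'DmlExecutionProvider'   -> any DirectX 12 GPU, integrated or discrete
#   'CPUExecutionProvider'   -> plain CPU fallback
# An NPU only shows up if the build ships a dedicated provider for it
# (e.g. 'OpenVINOExecutionProvider' or 'QNNExecutionProvider'), which is one
# reason many AI features still target the GPU rather than the NPU today.
```

In short, a discrete GPU is visible to practically every runtime, while NPU support is still runtime-specific.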


Sameer K
(Type '@' followed by my name to mention me when you reply)

dj_paige
Legend
November 11, 2025

@Sameer K 

what is an NPU? And why is it important? (Or did you mean GPU?)

Participant
November 12, 2025

Neural Processing Unit: an on-chip coprocessor, a dedicated unit for accelerating AI and machine learning tasks.
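
As a purely illustrative example of why that matters (nothing to do with Lightroom's internals), this is roughly how an application picks an accelerator for a model, and why an NPU sits idle unless the software explicitly targets it:

```python
# Illustrative sketch using PyTorch as a stand-in for "some AI runtime".
# Note there is no NPU branch: stock PyTorch has no NPU backend, so an NPU
# is not used unless the application ships a runtime that supports it.
import torch

if torch.cuda.is_available():               # NVIDIA discrete GPU
    device = torch.device("cuda")
elif torch.backends.mps.is_available():     # Apple Silicon GPU on a Mac
    device = torch.device("mps")
else:
    device = torch.device("cpu")             # plain CPU fallback

print("Running on:", device)
x = torch.randn(1, 3, 512, 512, device=device)   # dummy image tensor
```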

dj_paige
Legend
November 11, 2025

Either a good Mac or a good Windows system would be appropriate here; both will have sufficient speed to perform the AI tasks. Let's not get into Mac vs. Windows debates. On the Windows side, a relatively new Ryzen 7, Ryzen 9, or Intel i9 together with a separate GPU such as an Nvidia RTX 40xx or 50xx ought to work. On the Mac side, the M3 or M4 chips should do the job.

 

To "future proof" the system (if such a thing is possible) you want the best hardware you can afford. If you are on a tight budget, that's not going to allow any "future proof" and will provide mediocre performance.

Participant
November 12, 2025

Hi @dj_paige ,

The Windows side is clear now. Unfortunately I don't have my own experience with a Mac, but I'm really curious how it performs with LR AI tasks. I'm considering getting a MacBook Pro M5 for LR and PS use and keeping my Windows laptop for the rest of my work (I'm an IT Data Architect). Maybe a Mac with the M5 chip could be a bit more "future proof" than a Windows laptop.

 

Cheers,

János