Participant
May 29, 2024
Question

Integrated Arc Graphics Performance in Lightroom/Lightroom Classic

  • 3 replies
  • 10321 views

Has anyone with a laptop running the new Intel Core Ultra 7 155H or Core Ultra 9 185H chips checked how Lightroom performs on them, and can their integrated Arc graphics handle any GPU acceleration tasks in LR? Or is that something I still need a dedicated GPU for?

 

I've been using Lightroom on iPadOS and Android, and while it works, the workflow is clunky and slow and lacks features that the desktop Lightroom programs have (like the AI Denoise feature). Lightroom mobile also doesn't seem to recognize the image stacks I made in the desktop versions.

 

So I've been considering getting a laptop for photo editing (among other things) and would like to know whether those new Core Ultra chips and their integrated Arc graphics are a good fit and can handle at least some of the GPU-accelerated tasks that would normally be offloaded to a battery-sucking dedicated GPU.

3 replies

Inspiring
November 8, 2024

I've had my Lenovo laptop running the Ultra 9 185H with 32 GB of RAM since April, and I am now seriously considering shifting to a Mac. It doesn't crash, but it is annoyingly slow for something said to be good at running these Adobe programs with their new AI features. And since I don't game or do any heavy video work, I'd prefer not to need a machine with a dedicated GPU, as that would easily push me into roughly the same price-to-performance range.
And now, with the benchmark reviews coming out on the M4 chips, I am really considering taking the leap.

Participant
January 3, 2025

Hello,

No improvement still? There have been some updates to LR in recent weeks.

I want to move from my Mac (a Mac mini M2) to Windows and am looking for the ideal hardware to support my LR needs.

I'm dealing with 50 photos at a time, all 20 MP, with light editing and exporting. So it's not heavy usage 🙂

I have an old Dell OptiPlex micro PC with some 8th-gen Intel chip in it. While the Mac exports in 40 seconds, the OptiPlex takes 2 min 40 sec, so I could live with it. That's without any GPU acceleration. But the interface of LR Classic is awfully slow; it's even hard to navigate from one picture to the next.

I'm amazed how many bad reviews there are of the 185H and LR in general.

I looked at the NUC 14 Pro+, which would fit my needs, but it only has the Arc GPU.

So, anybody out there with a 185H laptop and no dedicated GPU: is the interface really that bad?

I'm puzzled how Adobe can screw this up so much on Windows, while on the Mac it's like a Finder window, it's so fast 🙂

Even my 8th-gen Intel can do everything else very fast, but the Adobe LRc interface behaves like it's from another world. I simply don't get what programming technique can lead to such a bad user experience in the LRc GUI. I've never seen another piece of software with such a laggy interface.

I don't want to buy anything bigger than a NUC though.

Please comment 🙂

Cheers

 

Inspiring
January 5, 2025

It seems the updates have helped a bit, yes, at least interface-wise overall. And culling/flying through the library does appear to be smoother. I haven't been in any deep sessions for a few weeks, though, so it's hard to tell if this is actually the case when I up the load.
Loading an image into PS from LR is still oddly slow.

I just ran some noisy high-ISO images through Denoise, and the initial load-up is faster, but it's still approximately 2.5-3 minutes of processing for a 24.5 MP RAW image. That's disappointing to me. Maybe my expectations are too high and I will simply need a stronger machine; I honestly don't know.

Perhaps the 2nd-gen Intel Core Ultra (285H) will be better. I think it's started to roll out. Might be worth a look for you.
Generally, mini PCs need to be top tier (and priced accordingly) to be worth anything, in my experience and opinion. Apple appears to be a lot better in that game, especially in the last few years. Their machines are also designed for creatives, and they do a lot to have them run creative software more effortlessly.
Yes, their specs-to-price ratio is another world from PCs'. But their price-to-performance is better, at least with regard to the Adobe suite and the like. And on specs-to-performance, PCs cannot compare; for some reason Apple can get so much more out of the hardware.

Hahaha, I can't believe I am starting to sound a bit like an Apple fanboi, lol.
Never thought I'd see the day.

Participant
June 4, 2024

I'm currently running the Core Ultra 7 155H in my Dell Inspiron Plus 14" and so far have had nothing but problems with it. Crashing when using the generative tools, crashing when exporting, crashing when denoising, and sometimes just crashing when loading in images. At the moment, with these new chips, Lightroom is completely useless.

Participant
August 21, 2024

I'm running the Core Ultra 9 185H in an ASUS Zephyrus G16. The app is slow and messy. I didn't encounter any crashes, but it runs painfully slowly. At the moment my old laptop (i7-9750H) is way faster.

Participant
September 22, 2024

Hey! Have you found any resolutions? I just bought this laptop, and my first time editing was so disappointing: laggy, freezing, crashing. The CPU ran very hot. I have G-Helper and have tried various configurations, but I could be missing the mark.

D Fosse
Community Expert
May 29, 2024

It looks very promising, but I'd give it a couple of months to see how it plays out, and to let them iron out the inevitable initial bugs. Jumping on a brand new platform is always risky.

 

More specifically, a GPU using shared system memory will require a lot more RAM than you would normally need, probably around double.

Participant
May 30, 2024

Hey D Fosse:

I agree. I won't be buying for a little while longer, which is also why I'm asking around: I want to hear from anyone who does have a laptop with these Core Ultra chips how they perform, whether the integrated Arc graphics on those chips can be selected for GPU tasks like processing and exporting, and if so, how the performance is. If it's not very good, I'm always open to a machine with a dedicated GPU.