Has anyone with a laptop running the new Intel Core Ultra 7 155H or 9 185H chips checked how Lightroom performs on them, and whether their integrated Arc graphics can handle any GPU-acceleration tasks in LR? Or is that something I still need a dedicated GPU for?
I've been using Lightroom on iPadOS and Android, and while it works, the workflow is kind of clunky and slow and lacks features that the desktop Lightroom programs have (like the AI Denoise feature), and Lightroom mobile doesn't seem to recognize the image stacks I made in the desktop versions.
So I've been considering getting a laptop for photo editing (among other things) and would like to know if those new Core Ultra chips and their integrated Arc graphics are a good fit and can handle at least some of the GPU-accelerated tasks that would normally be offloaded to a battery-hungry dedicated GPU.
It looks very promising, but I'd give it a couple of months to see how it plays out, and to let them iron out the inevitable initial bugs. Jumping on a brand new platform is always risky.
More specifically, a GPU using shared system memory will require a lot more RAM than you would normally need, probably around double.
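Here's a rough back-of-envelope sketch of what that means in practice. The figures below are illustrative assumptions (not measurements), just to show how shared graphics memory eats into the RAM budget:

```python
# Illustrative RAM budgeting for an integrated (shared-memory) GPU vs. a discrete one.
# All numbers are assumptions for the sake of the example, not measured values.

app_and_os_need_gb = 14   # assumed footprint of Windows + Lightroom on a large catalog
gpu_vram_need_gb = 8      # assumed working memory Lightroom would like for GPU acceleration

# Discrete GPU: VRAM lives on the card, so system RAM only has to cover the OS and app.
ram_needed_with_discrete_gpu = app_and_os_need_gb

# Integrated GPU: the GPU's working memory is carved out of the same physical RAM.
ram_needed_with_integrated_gpu = app_and_os_need_gb + gpu_vram_need_gb

print(f"With a discrete GPU:    ~{ram_needed_with_discrete_gpu} GB of system RAM")
print(f"With an integrated GPU: ~{ram_needed_with_integrated_gpu} GB of system RAM")
```

In other words, a machine that would be comfortable at 16 GB with a discrete card can end up tight once the iGPU claims its share, which is roughly where the "around double" rule of thumb comes from.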
Hey D Fosse:
I agree; I won't be buying for a little while longer, which is also why I'm asking around to see if anyone who does have a laptop with these Core Ultra chips can say how they perform, whether the integrated Arc graphics on those chips can be selected for GPU tasks like processing and exporting, and if so, how the performance is. If it's not very good, I'm always open to a machine with a dedicated GPU.
I'm currently running the Core Ultra 7 155H in my Dell Inspiron Plus 14" and so far have had nothing but problems with it. Crashing when using the generative tools, crashing when exporting, crashing when denoising, and sometimes just crashing when loading in images. At the moment, with these new chips, Lightroom is completely useless.
I'm running the Core Ultra 9 185H in an Asus Zephyrus G16. The app is slow and messy. I didn't encounter any crashes, but it runs painfully slow. At the moment my old laptop (i7-9750H) is way faster.
Hey! Have you found any resolutions? I just bought this laptop, and my first time editing was so disappointing. Laggy, freezing, crashing. The CPU ran very hot. I have G-Helper and have tried various configurations, but I could be missing the mark.
I've had my Lenovo laptop running the Core Ultra 9 185H with 32 GB of RAM since April, and I am now seriously considering shifting to a Mac. It doesn't crash, but it is annoyingly slow for something said to be good at running these Adobe programmes with their new AI features. And since I don't game or do any heavy video work, I'd prefer not to need a machine with a dedicated GPU, as that will easily put me up in roughly the same price-to-performance range.
And now, with the benchmark reviews coming out on the M4 chips, I am really considering taking the leap.
Hello,
Still no improvement? There have been some updates to LR in recent weeks.
I want to move from the Mac (mini M2) to Windows and am looking for the ideal hardware to support my LR needs.
I'm dealing with 50 photos at a time, all 20 MP, light editing and exporting. So it's not heavy usage 🙂
I have an old Dell OptiPlex micro PC with an 8th-gen Intel in it; while the Mac exports in 40 sec, the OptiPlex takes 2 min 40 sec, which I could live with. That's without any GPU acceleration. But the interface of LR Classic is awfully slow; it's even hard to navigate from one picture to the next.
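Put another way, per photo (just simple arithmetic on the photo count and timings above):

```python
# Per-photo export time from the figures above: 50 photos of 20 MP each.
photos = 50
mac_mini_m2_sec = 40            # Mac mini M2 export time
optiplex_8th_gen_sec = 160      # 2 min 40 sec on the old OptiPlex, CPU only

print(f"Mac mini M2: {mac_mini_m2_sec / photos:.1f} s per photo")
print(f"OptiPlex:    {optiplex_8th_gen_sec / photos:.1f} s per photo")
print(f"The OptiPlex is about {optiplex_8th_gen_sec / mac_mini_m2_sec:.0f}x slower")
```

So even 4x slower exporting would be tolerable at my volume; it's the interface lag that isn't.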
I'm amazed at how many bad reviews there are of the 185H and of LR in general.
I looked at the NUC 14 Pro+, which would fit my needs, but it only has the Arc GPU.
So, anybody out there with a 185H laptop and no dedicated GPU: is it really so bad regarding the interface?
I'm puzzled how Adobe can screw this up so much on Windows while on the Mac it's like a Finder window, it's so fast 🙂
Even my 8th-gen Intel can do anything else very fast, but the Adobe LrC interface behaves like it's from another world. I simply don't get what programming technique can lead to such a bad user experience in the LrC GUI. I've never seen another piece of software with such a laggy interface.
I don't want to buy anything bigger than a NUC though.
Please comment 🙂
Cheers
It seems the updates have helped a bit, yes, at least overall interface-wise. And culling/flying through the library does appear to be smoother. I haven't been in any deep sessions for a few weeks, so it's hard to tell whether this is actually the case when I up the load.
Loading an image in PS from LR is still oddly slow.
I just ran some noisy high-ISO images through Denoise, and the initial load-up is faster, but it's still approximately 2.5-3 minutes of processing for a 24.5 MP raw image. This is disappointing to me. Maybe my expectations are too high and I simply need a stronger machine; I honestly don't know.
Perhaps the 2nd-gen Intel Core Ultra (285H) will be better. I think it's started to roll out. Might be worth a look for you.
Generally, mini PCs need to be top tier (and priced accordingly) to be worth anything, in my experience and opinion. Apple appears to be a lot better at that game, especially in the last few years. Their machines are also designed for creatives, and they do a lot to have them run creative software more effortlessly.
Yes, their specs-to-price ratio is another world from PCs. But their price-to-performance is better, at least with regard to the Adobe suite and the like. And on specs-to-performance, PCs can't compare; for some reason Apple can get so much more out of the hardware than PCs can.
Hahaha, I can't believe I'm starting to sound a bit like an Apple fanboi. lol.
Never thought I'd see the day.
The Intel GPUs aren't really sufficient for Photoshop.
You say 2.5 minutes for Denoise on 24 MP. An Nvidia RTX 3060/4060 does this in less than 10 seconds.
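Put as rough throughput, using the times quoted in this thread (single anecdotal runs, not a controlled benchmark):

```python
# Rough AI Denoise throughput from the times quoted in this thread.
# Single anecdotal data points, not controlled benchmarks.

arc_igpu_sec, arc_igpu_mp = 165.0, 24.5   # midpoint of "2.5-3 minutes" for a 24.5 MP raw
rtx_sec, rtx_mp = 10.0, 24.0              # "less than 10 seconds" for a similar-sized raw

arc_rate = arc_igpu_mp / arc_igpu_sec     # megapixels processed per second
rtx_rate = rtx_mp / rtx_sec

print(f"Arc iGPU:      ~{arc_rate:.2f} MP/s")
print(f"RTX 3060/4060: ~{rtx_rate:.1f} MP/s")
print(f"Roughly {rtx_rate / arc_rate:.0f}x faster on the discrete RTX card")
```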
Guys, are we talking about classic denoise or the new AI-based one? The latter is incredibly slow; even on my M2 Mac it's multiple minutes. I'm assuming classic denoise takes less than 10 seconds, as RTX doesn't have an AI neural-engine CPU component. What do you think?
RTX has Tensor cores specifically for AI. The older GTX line didn't, and those GPUs take several minutes with Denoise.
I haven't noticed any speed differences between "old" and "new" Denoise, with RTX 3060 and 4060 (two machines). My raw files are 45 to 60 MP, and take 15 to 25 seconds on the 4060, a few seconds more on the 3060.
(By "new" Denoise I mean the ACR technology preview, where it works directly on the raw file, re-adjustable, without creating a separate DNG. This option is not yet available in LrC).
I've come to that realisation.
I just had the impression, when acquiring this setup, that the new Arc graphics would be sufficient.
I assume Intel is perhaps aiming to do what Apple does with its silicon chips. But maybe that just won't happen, at least not presently. It's just mind-boggling to me how Apple can run Adobe programs so well on an integrated GPU when PCs just can't.
I don't know how well the new generation of Intel Core Ultras runs, though, compared to the first.
At the moment there isn't a big market for high-end integrated GPUs on the Windows side, with Nvidia RTX being so reliable and readily available, and at lower cost for similar performance.
If Intel decides to push this, which they might, they really need to create an upscale market for it by offering complete hardware packages. That's a lot simpler if you already have fierce brand loyalty, as Apple has.
Apple made the choice to develop high-end GPUs integrated with the CPU. That's not without costs and drawbacks and it's not an "ideal" solution.