Hi!
I'm considering buying a thin-and-light laptop equipped with a CPU from Intel's new Core Ultra line, the Core Ultra 5 125H.
The laptop specifically is the Lenovo IdeaPad Pro 5. I mainly do event/concert photography and want to carry my laptop with me, so I can shave some editing time off during travel. My current laptop is quite bulky and has a not-so-colour-accurate screen.
Just wondering if anybody has used Lightroom with this or another CPU from the Ultra series, and what their experiences are?
https://www.linkedin.com/pulse/intel-hopes-reinvent-pc-core-ultra-soc-bob-o-donnell/
" For example, Intel said that Adobe’s full Creative Suite and the open-source Blender application will be among many applications supported with its first-generation AI accelerator"
This is from late 2023 so also wondering if anything came of this.
I never buy any hardware that hasn't been on the market for at least 6 months to settle bugs and compatibility issues. So I'd wait a while.
Still, this looks promising. I've always been very skeptical about Intel's integrated GPUs. They've both been underpowered and prone to all kinds of problems. But this just might be a significant step up, if we're to believe the blurbs. It even has a new "NPU" (neural processing unit). So this might be a more realistic competitor to Apple silicon for laptop use.
Maybe we can finally get rid of all the dreaded dual-GPU laptops. I'm sure they'll breathe a sigh of relief at Adobe over that.
Note that a laptop will always have suboptimal cooling; that's just mechanics and physics, so you never get full nominal performance. All components will be throttled down when push comes to shove. Also be very wary of all the crapware that laptop manufacturers inevitably fill their machines with; that's what causes most of the problems. It's where the vendors can put their own stamp on the product, and they grab the chance. There are many layers to peel through before you get down to the operating system.
" For example, Intel said that Adobe’s full Creative Suite and the open-source Blender application will be among many applications supported with its first-generation AI accelerator"
By @BalazsMudrak
That quote does need some qualification, for a few reasons. How much an AI accelerator helps will vary widely across Creative Cloud apps, simply because the use of AI varies widely: some apps use it, some don’t, and for the apps that have AI features, some people use those features a lot and some don’t. The impact of an AI accelerator is greatest for those who use AI features heavily. Although I haven’t used an Intel Core Ultra, the following discussion puts it and its AI accelerator into a general context compared to what else is out there.
For Lightroom Classic, AI is currently used mostly for AI masks and especially AI Denoise. You do event/concert photography, so you may need to denoise many low-light images that way. If so, then what you need to know is that AI Denoise depends primarily on the GPU. If it’s extremely important to you that large shoots can be AI denoised as quickly as possible, you want a laptop with the best discrete laptop GPU you can afford. That laptop also needs an excellent cooling system, so that potential performance isn’t throttled by high temperatures during bulk processing.
What about that AI coprocessor? My understanding (if it’s correct) is that it does help, but to a lesser degree; right now, the processing power and graphics memory available to the GPU matter more. A concern is that the Intel AI coprocessor is first generation, while Nvidia and Apple are on their later generations.
And finally, that’s a friendly article about Intel developments. Keep in mind that many of the concepts in it, including an integrated SoC, efficiency cores, smaller process, and AI accelerator, represent things Intel is playing catch-up on compared to Apple, AMD, and now Qualcomm. Apple converted the Mac line to all of those things almost four years ago, and so they have a very power-efficient (and therefore cool and quiet) line of laptops with GPUs that perform quite well in Lightroom Classic, even on battery.
(Also, there is no “Creative Suite,” at least not any more; for the last 12 years it’s been Adobe Creative Cloud. Yeah, I know that’s a “grammar police” comment, but lots of people still call it Creative Suite.)
Well, let's give it a few months and we'll see how it performs in the real world.
Even if it's not likely to be a real competitor to Nvidia RTX, it may still be good enough for most laptop uses and hopefully break the dual GPU trend, which is causing so much trouble with Lightroom and Photoshop.
Anything that can clean up the rather messy laptop market is good.
Oh, one more important consideration with integrated GPUs: don't skimp on RAM. An integrated GPU uses shared system RAM, and it can eat up a lot. Take the amount you would "normally" need and double it; for example, if 16 GB would otherwise be enough, get 32 GB.
@D Fosse @Conrad_C Thank you both for your replies, they were really insightful.
I ended up going down a different route and went with the Asus Zenbook S 13 with a Ryzen 7 7840U, and I'm very happy with my decision. Editing on it is a blast, and I haven't experienced any performance issues. (Also, the battery life is craaaazy.) So yeah, just leaving this here for other people who might be in the same boat as me! 🙂
What about AI Denoise? How fast is it?
This thread has been talking about the CPU, but AI Denoise mostly uses the graphics hardware/GPU.
If you replied to BalazsMudrak because you want to know how AI Denoise is on an Asus Zenbook 13, it looks like it depends on whether you order it with Intel Xe integrated graphics (slow Denoise) or Nvidia discrete graphics (somewhat faster Denoise). It will probably still be slower than a desktop with a powerful Nvidia GPU.
Hopefully BalazsMudrak has tested Denoise and can tell you directly, but for Denoise the rule is: the more powerful the GPU, the faster Denoise runs, and the CPU does not matter at all.