
Does an Intel NUC need a separate graphics card?

New Here, Apr 24, 2023

I need to upgrade from my 5-year-old laptop and am considering the Intel NUC, which does not have a graphics card but uses Intel Iris Xe graphics in its processor. I can't tell if this meets the minimum or recommended requirements for LR and PS. Can you advise whether it does, or whether a separate graphics card will be necessary? I think the relevant specs are below.

 

Processor: 2.1 GHz Core i7
RAM: 32 GB
Hard Drive: 1 TB SSD
Graphics Coprocessor: 28W Intel® Iris™ Xe Graphics, up to 1.4 GHz

 

TOPICS: Windows

Community Expert, Apr 24, 2023

Not recommended. You should look at something like an Nvidia RTX 3050 or better.

 

Also note that dual GPUs can conflict and should be avoided. This is a common problem with laptops, and the integrated GPU may need to be disabled.
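If you aren't sure which GPUs a Windows machine is actually running, one quick check is to list the video controllers it reports. Here is a minimal Python sketch of one way to do that (my own illustration; it assumes Python 3 on Windows and uses the wmic tool, which is deprecated but still ships with Windows 10):

# List every video controller Windows reports, to spot dual-GPU setups.
# wmic is deprecated (PowerShell's Get-CimInstance is the modern route)
# but still present on Windows 10.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,AdapterRAM", "/format:list"],
    text=True,
)

for line in output.splitlines():
    line = line.strip()
    if line:  # wmic pads its list output with blank lines
        print(line)

# Note: AdapterRAM is reported in bytes and, being a 32-bit value, caps
# at 4 GB on many systems -- treat it as a rough indicator only.

If the output shows both an Intel (or AMD) integrated controller and a discrete card, that's the dual-GPU situation described above.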

New Here, Apr 24, 2023

Thanks. That's what I feared.

Explorer, Jun 02, 2023

You could look at Mini PCs with the following combo:
AMD Ryzen 9 6900HX (a CPU without integrated GPU)
AMD Radeon RX 6600M (a discrete GPU soldered onto the motherboard)

AMD's product page for the 6900HX: https://www.amd.com/en/products/apu/amd-ryzen-9-6900hx

AMD's product page for the RX 6600M: https://www.amd.com/en/products/graphics/amd-radeon-rx-6600m

Just as an example, there are YouTube reviews of such mini PCs.

Explorer, Jun 02, 2023

quote
AMD Ryzen 9 6900HX (a CPU without integrated GPU)

Of course I meant to write "AMD Ryzen 9 6900HX (a CPU with integrated GPU)". The iGPU is a Radeon 680M. Sorry about this.

Community Expert, Jun 02, 2023

This reply isn’t only for you, but general background for others who might read this. In the past, Intel integrated graphics were OK but not ideal: they worked fine for general photo editing, like basic adjustments and simple masking. And Intel Xe is a modest improvement over what we used to have to put up with before.

 

What changed everything was higher megapixel cameras (40 to 100 megapixels), having to drive higher resolution displays such as 4K and up, and most of all, the introduction of AI/machine learning features. Those have raised the bar for GPUs considerably. Now, Intel integrated graphics are often not good enough to use the latest AI features such as people masking, and can be very slow for the popular new AI Denoise feature. If you expect to use those features often, the latest discrete GPU hardware from Nvidia (or the graphics hardware integrated into the Apple Silicon SoC) can complete those tasks many times faster than older hardware.
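To put rough numbers on that (this is my own back-of-the-envelope arithmetic, not Adobe's figures), here is a small Python sketch of how fast a single uncompressed image buffer grows with megapixels; AI features typically juggle several such buffers at once:

# Rough size of one float32 RGB buffer at various resolutions.
# Illustrative numbers only, not Adobe's requirements.
def buffer_mb(megapixels, channels=3, bytes_per_channel=4):
    return megapixels * 1e6 * channels * bytes_per_channel / 2**20

for mp in (24, 45, 61, 100):
    print(f"{mp:>3} MP -> about {buffer_mb(mp):,.0f} MB per full-res buffer")

At 61 MP that works out to roughly 0.7 GB for one buffer, so a few working copies add up quickly, which is part of why these features strain modest GPUs.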

Explorer, Jun 04, 2023

Great information about Adobe CC's high-end requirements.
But the following could have been more detailed:

quote
..... If you expect to use those features often, the latest discrete GPU hardware from Nvidia (or the graphics hardware integrated into the Apple Silicon SoC) can complete those tasks many times faster than older hardware.
By @Conrad_C

"....., the latest discrete GPU hardware from Nvidia (or the graphics hardware integrated into the Apple Silicon SoC)" is very vague, not helpful, and will not age well, just like the system requirements published by Adobe. So please do not take this as an attack.

I'm just asking you to be more specific and name one Nvidia GPU and one Apple Silicon iGPU that fulfill those high-end demands you mention.

Community Expert, Jun 04, 2023

No problem, it’s a reasonable question. The only reason I was vague is that it depends on one’s expectations and budget.

 

There is a great thread over at The Lightroom Queen forums (My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)) about the performance of the very demanding AI Denoise on different GPUs, using the same 60 megapixel raw test file one user posted. People download the file and try it on their systems, and post their results there for you to look at. The best times seem to be achieved (on Windows) with an Nvidia GeForce RTX 4070, and (on macOS) on Apple Silicon M2 Max. Those achieve times of around 10 to 20 seconds for that test.

 

My own older and lower-spec M1 Pro MacBook Pro runs the test in about one minute, which I am happy with on my budget, because on PCs and Intel Macs with older graphics the same test takes from several minutes to over half an hour for the same single image. The plain M1 takes around two minutes.

 

If you aren’t going to be running features as intensive as AI Denoise on 60-megapixel images all the time, then you don’t need to pay what that kind of GPU power costs. Instead you can aim for a more affordable GPU, like an RTX 3060 or the one in an Apple M1. Those should work well enough for Lightroom Classic AI masking, and Photoshop AI features such as Object Selection and Neural Filters.

Explorer, Jun 04, 2023

Thanks! This is golden information. And one more thanks for providing a reference, with backup too (The Lightroom Queen Forums).

Community Expert, Jun 05, 2023

I have an RTX 3060, and it does 60 MP in 33 seconds. I sent a file to a friend with an RTX 3090, and he reported back 13 seconds.

 

So that's the range you can take your pick from.

 

By contrast, I have another machine with an older card - a Quadro P2200 from 2019. It denoises the same file in two minutes and 30 seconds.
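For a rough sense of scale, here is a quick Python sketch that normalizes those three results against the slowest card (using only the times reported in this thread; your hardware and files will vary):

# Times reported above for the same 60 MP AI Denoise test.
times_s = {
    "RTX 3090": 13,
    "RTX 3060": 33,
    "Quadro P2200": 150,  # two minutes and 30 seconds
}

baseline = times_s["Quadro P2200"]
for gpu, secs in times_s.items():
    print(f"{gpu:13s} {secs:4d} s  ({baseline / secs:4.1f}x vs. the P2200)")

So on this one task the 3090 comes out roughly 11x faster than the 2019-era Quadro, and the 3060 about 4.5x.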

Explorer, Jun 06, 2023
Thanks very much. I'll probably go for the RTX 3060 Ti OC 8GB GDDR6X (by Asus) mentioned in the Lightroom Queen thread.
