
PC desktop for editing

New Here, May 19, 2024


1 Correct answer: Community Expert, May 20, 2024 (full reply below)
LEGEND, May 20, 2024

Let me just say that a large (4K) monitor with Lightroom Classic 13.2 requires a lot of computer power and a strong GPU. If you're on a tight budget, stick with 1920x1080 monitors. Even with a 1920x1080 monitor, I would get the fastest CPU and the best GPU you can afford.
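As a rough illustration of why 4K is so much more demanding (a sketch added here for clarity, not part of the original reply), the pixel-count arithmetic looks like this:

# Rough pixel-count comparison: how much more work a GPU does
# driving a 4K display versus a 1920x1080 display.
resolutions = {
    "1920x1080 (Full HD)": (1920, 1080),
    "2560x1440 (QHD)": (2560, 1440),
    "3840x2160 (4K UHD)": (3840, 2160),
}

base = 1920 * 1080  # Full HD pixel count used as the baseline

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} megapixels, "
          f"{pixels / base:.1f}x the pixels of Full HD")

# 4K works out to roughly 4x the pixels of Full HD, which is why driving a
# 4K screen in Lightroom Classic calls for a correspondingly stronger GPU.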

 

I don't think these computers are powerful enough and LrC 13.2 will be slow.

 

Even worse, the HP listing doesn't mention what graphics card it comes with and doesn't specify what CPU it contains. The Dell also doesn't specify what CPU it uses; "eighth gen i7" isn't specific enough, but those chips were launched around 2017, which puts them far behind modern CPUs in speed. This suggests to me that both computer ads are trying to disguise the actual (lack of) power of their hardware.

 

Avoid these.

Community Expert, May 20, 2024


The first link, the Asus PA279, should be an above-average choice for a photo editing display. No problem there. Its features are similar to the photo editing displays labeled “PA” by NEC and “SW” by BenQ.

 

The HP workstation seems like an old basic office workstation, not a graphics PC. The Xeon CPU has several major problems. It was first released around 2014, so it’s already a whole decade old. Also, for any multimedia work, many (all?) Xeon CPUs lack the Intel coprocessors that accelerate things such as video editing. For media work on Intel, the Intel Core series is preferred over Intel Xeon. Its NVIDIA Quadro M6000 graphics card was released around nine years ago and may be too weak for many current Lightroom Classic GPU-accelerated features. This computer seems like a terrible choice that is already very old and outdated.

 

The Dell workstation also has old components. The CPU was released 7 years ago, and one website says “This is a fairly old CPU that is no longer competitive with newer CPUs.” The Nvidia GTX 1060 6GB graphics card is about 8 years old.

 

All of this info I found using web searches. Those computers might run Photoshop OK for basic work, but the more you want to run the latest features, such as AI Denoise in Lightroom Classic and generative AI in Photoshop, the more you want current hardware, like a 13th gen Intel CPU and an NVIDIA GPU in the RTX 3000 or 4000 series. It isn't always necessary to get hardware that recent, but in the last few years there have been many advances in CPUs and GPUs that were not present in earlier generations, and many current graphics apps take advantage of those advances to make GPU acceleration and AI work much faster than on hardware from just 4 or 5 years ago.

New Here, May 20, 2024


Thank you.
Hmm, I currently just do light editing to my photos, light exposure adjustments etc.; the biggest thing I will probably do is use "general aware fill" (I think it's called that) in Photoshop.

I don't generally use AI, Denoise, or other big demanding tasks.
So the best out of the 2 would be the Dell?

It will definitely be better than the specs I have on my all-in-one. 🙂

Community Expert, May 20, 2024


I agree with the others. This is old and obsolete hardware and a terrible investment. Even if it works reasonably well today (which I honestly doubt), you may run into problems with the very next update.

 

The GPU is particularly critical, and that's for basic operation, not just the advanced functions. Both the GTX 1060 and the Quadro M6000 are borderline now. They will not work particularly well.

LEGEND, May 20, 2024


quote

Thank you.
Hmm, I currently just do light editing to my photos, light exposure adjustments etc.; the biggest thing I will probably do is use "general aware fill" (I think it's called that) in Photoshop.

I don't generally use AI, Denoise, or other big demanding tasks.
So the best out of the 2 would be the Dell?

It will definitely be better than the specs I have on my all-in-one. 🙂

By @defaultzpr9k1gna9vx

 

Today you only do "light editing"; tomorrow you might want to expand and do more with LrC, and you won't be able to, because the hardware isn't capable.

 

I agree with @D Fosse, this is a poor investment. Money is better spent on a new computer: skip the 4K monitor, get a 1920x1080 monitor, and put the money saved toward a better computer, such as this one: https://www.cyberpowerpc.com/system/Infinity-8000-Gaming-PC (I don't know if it is available in your country).

New Here, May 20, 2024


Thank you.

Yeah, true that.

I think it's because I am getting too eager to start using a better setup without too big of a budget.

I can afford one but not the other, so I will hold off until I can afford both, so I am future-proofed and don't need to upgrade again.

Yeah, I am not too concerned about 4K; it's just that this monitor had the specs I wanted and it happened to be 4K, so I might look around and see if I can find a 2K one.

I have not seen that particular tower, but I have seen similar ones, so I will look at them, pick one, and wait a bit till I can get both.

Thanks again 🙂

Community Expert, May 20, 2024


Actually, 4K is no advantage for photography.

 

You need to get a feel for the pixel structure at 100% view, to properly assess sharpness and noise. You don't really get that on 4K. The ideal screen pixel density for photography is somewhere around 100-110 pixels per inch, and that's what you get with a traditional 27 inch screen at 2560 x 1440 pixels. That's why so many photographers pick this panel size.
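To put numbers on that, here is a small Python sketch (added as an illustration; it is not from the original reply) that computes pixels per inch for a few common 27-inch panels, using the standard formula of diagonal resolution in pixels divided by diagonal size in inches:

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ("27-inch 1920x1080", 1920, 1080, 27),
    ("27-inch 2560x1440", 2560, 1440, 27),
    ("27-inch 3840x2160 (4K)", 3840, 2160, 27),
]

for name, w, h, d in panels:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# Rounded output:
#   27-inch 1920x1080       ->  82 PPI
#   27-inch 2560x1440       -> 109 PPI  (within the ~100-110 PPI range mentioned above)
#   27-inch 3840x2160 (4K)  -> 163 PPI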

 

It's a different story for vector art and text. That's where you get maximum payoff for 4K. So if you're a heavy user of InDesign and Illustrator, it's worth it.

 

I would also assume it's useful to see 4K video at native resolution.
