
Denoise AI very slow in LrC, Lr, and Camera Raw

Explorer, May 06, 2023


Hello,

The new Denoise AI takes 13 minutes per photo in all three raw photo editors. I installed some updates from Intel, and the time came down to 8 minutes per photo. All the videos I've seen on YouTube take about 15 to 25 seconds to complete. I'm using a Dell Inspiron 3670 with a 500 GB SSD and a 1 TB HDD, running Windows 11 Home, version 22H2.
Processor: Intel(R) Core(TM) i5-8400 CPU @ 2.80GHz
Installed RAM: 24.0 GB
System type: 64-bit operating system, x64-based processor
My system is pretty old and I don't want to purchase a graphics card. Does anybody have any ideas? Any help would be appreciated.

TOPICS: Windows
Views: 3.5K

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Community Expert, May 07, 2023


If you use an old system and don't want to purchase a better graphics card, then the only idea I have for you is to make a lot of coffee for while you wait for Denoise to do its job.

 

-- Johan W. Elzenga

Community Expert, May 07, 2023


The parts that accelerate AI Denoise the most are good, recent graphics hardware and machine-learning acceleration. The PC graphics hardware that runs AI Denoise fastest tends to have both, such as an Nvidia graphics card with Tensor Cores; those cards tend to be very recent. When you see an AI Denoise result on YouTube that's under 1 minute, chances are the poster's graphics hardware is of that type, and less than 2 years old.

 

Although I use Macs, the differences are similar. My old laptop has an 8th generation Intel Core i5 with integrated graphics, a similar processor to yours. For a 16 megapixel raw photo, it does AI Denoise in about 3 minutes. My current laptop, about 1.5 years old with graphics on an Apple Silicon processor, can AI Denoise the same photo in about 18 seconds.
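For a rough sense of the gap described above, the speedup works out to about 10x. A minimal sketch using only the two times quoted in this post (illustrative arithmetic, not a benchmark):

```python
# Rough speedup comparison using the times quoted above (illustrative only).
old_laptop_seconds = 3 * 60   # ~3 minutes on an 8th-gen Intel Core i5 with integrated graphics
new_laptop_seconds = 18       # ~18 seconds on an Apple Silicon laptop

speedup = old_laptop_seconds / new_laptop_seconds
print(f"Approximate speedup: {speedup:.0f}x")  # → Approximate speedup: 10x
```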

LEGEND, May 07, 2023


This has been discussed numerous times already - a site search would have saved you the need to post at all: if you want reasonably quick Denoise AI processing, you need a graphics card that's up to the job.

 

Adobe unofficially recommends a GPU with 8 GB+ of memory.

LEGEND, May 07, 2023


See: https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified

 

Need for speed. Denoise is by far the most advanced of the three Enhance features and makes very intensive use of the GPU. For best performance, use a GPU with a large amount of memory, ideally at least 8 GB. On macOS, prefer an Apple silicon machine with lots of memory. On Windows, use GPUs with ML acceleration hardware, such as NVIDIA RTX with TensorCores. A faster GPU means faster results.

Author of "Color Management for Photographers" and "Photoshop CC Color Management" (Pluralsight)

Explorer, May 22, 2023


Cores, tensor, and all of those terms are way over my head. So it would be very helpful if you or someone could provide, or link to, a list of Nvidia and AMD cards that would deliver reasonably fast AI performance on a Win10 x64 machine, ideally under $600.

Community Expert, May 22, 2023


Nvidia RTX 30-series. From 3050 (cheap) to 3090 (expensive). Take your pick.

 

They all work well for this, but the more expensive models are faster.

Explorer, May 23, 2023


Thanks for the recommendation. Now I'm wondering if the "Ti" designation for Nvidia means anything more than just stepped-up performance, and if it is worth the extra money. My work is almost all still photo editing of large 45-60 MB files in Nikon raw. I don't see any failures in my older GPU (an ancient GeForce), but now that AI is so prevalent and likely to become even more so, I don't mind spending $500-$600; I just want to be sure I'm spending on the right cost/performance balance.



Jim Smith
60833 Willow Creek Loop, Bend, OR 97702

Phone: (Cell): 949-212-1578
Email:
JSmithDisp1@BendBroadband.com

Community Beginner, Mar 26, 2024


Everything Adobe says seems to make sense, but these terms (Tensor Cores, neural processing...) are just planned obsolescence for old machines. I usually use Blender, and since version 4.0 it has integrated automatic denoising after rendering, which lets you launch renders at low sample counts. At the same time, users in a post similar to this one say that Topaz denoises quickly on the same machine specs. This is not normal; I think they're pulling the wool over our eyes. It's well known how the technology works. I have two CPUs with 6 cores each (24 threads), 64 GB of RAM, and a Radeon 5700 XT with 8 GB of VRAM, and I expected not 10 seconds to denoise, but not 10 minutes for a 6 megapixel photograph either. Come off it!

New Here, Jun 05, 2023


Honestly, I would die for those 13 minutes right now... for me it takes 80 minutes per photo and I don't know how to fix it. I downloaded Photoshop on my new laptop and it still takes 50 minutes. I'm going to freak out...

LEGEND, Jun 05, 2023


quote

Honestly, I would die for those 13 minutes right now... for me it takes 80 minutes per photo and I don't know how to fix it.

By @Luna303012227mu6

How to fix it has already been discussed in this thread and solutions presented.

Explorer, Jun 05, 2023


As you can see from the replies to my first post on the subject, there are numerous reminders that this has been discussed, including some scolding me for not doing thorough research. But in the spirit of helping, I'll tell you that my times went from 50-65 minutes to 12-20 seconds for full 60-85 MB raw files when using LrC Denoise, after I replaced my desktop graphics card with an Nvidia 3060 Ti ($429 at Best Buy). I get similar times with both the Topaz and ON1 denoising apps.

New Here, Jun 20, 2023


I noticed the following on my system (Intel i7-6800K, GeForce GTX 1060, Windows 10, Canon CR files).

When I open a CR file that does not have an XMP file of the same name next to it, AI denoising takes 1 minute. An XMP sidecar with the same name "spoils" Photoshop, and all subsequent AI denoising attempts take 60 minutes. The only cure is rebooting the operating system.

Community Expert, Jun 20, 2023


Right from the start, it has been recommended to run Denoise on unadjusted files straight from the camera. This is apparently why. You may not notice a difference with a new GPU optimized for AI/machine learning, but on an old GPU like the GTX 1060 it may make a significant difference.

New Here, Jul 18, 2023


So I guess it's perfectly OK from the point of view of Adobe to let people blunder into a massive fail when they select to denoise with AI on an older system. I guess no one thought it would be worthwhile to display a warning against proceeding without a high-powered GPU. Do you realize your tone comes across as haranguing to people who are just trying to use the product? Do you think a little more sympathy is in order?

LEGEND, Jul 18, 2023


"Do you think a little more sympathy is in order?"

 

Not really. 

 

You're going over ground that has been covered innumerable times already - it's nobody else's fault that you haven't put the effort into researching the ins and outs of AI Denoise from a system-requirements point of view.

LEGEND, Jul 18, 2023


quote

So I guess it's perfectly OK from the point of view of Adobe to let people blunder into a massive fail when they select to denoise with AI on an older system. I guess no one thought it would be worthwhile to display a warning against proceeding without a high-powered GPU.

By @Hume31149445vdol

Adobe provides, right there on the Lightroom Classic screen, an estimate of how long Denoise will take. Some people report that the estimate is over an hour. In some cases, Adobe also displays a statement that Denoise is not available. That sounds like exactly what you are asking for and claim Adobe doesn't provide.

Community Expert, Jul 18, 2023


quote

…a warning against proceeding without a high-powered GPU.

By @Hume31149445vdol

The time estimate that's already in there is a lot more appropriate than putting up some blunt, alarmist warning, because it's a more nuanced decision than that. One person might think 2 minutes is OK because they are only going to process one image; another might find 2 minutes per image unacceptable because they need to batch process 200 of them. The time estimate provides full disclosure that lets each person decide whether it's worth continuing in their own situation.
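The batch-size point above is easy to put in numbers. A small sketch (the `batch_minutes` helper is illustrative, not part of Lightroom) showing why the same per-image time can be fine for one photo and unworkable for a batch:

```python
# Estimate total Denoise time for a batch, given a per-image time.
def batch_minutes(per_image_minutes: float, image_count: int) -> float:
    """Total processing time in minutes for the whole batch."""
    return per_image_minutes * image_count

# The scenarios from the post above, at 2 minutes per image:
print(batch_minutes(2, 1))    # one image: 2 minutes, probably acceptable
print(batch_minutes(2, 200))  # 200 images: 400 minutes (~6.7 hours), probably not
```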
