lucieneb30222720
Participant
April 25, 2023
Question

Denoise AI takes 10 minutes to render

  • April 25, 2023
  • 2 replies
  • 8179 views

I was so excited to update ACR, but Denoise AI is unusable for me. I'm running Windows 11 with 64 GB RAM, two 1 TB solid state drives (one of them empty), and my GPU is an NVIDIA GTX 1060 with 8 GB of memory. ACR is set to use the GPU. PS and Bridge run fast, but the new Denoise AI feature takes 10 minutes to render. I thought it was a mistake when the display said 10 minutes, but it's real. Ten minutes is unusable, not to mention that the result, after waiting 10 minutes, was poorer than Topaz Denoise, which renders in less than 20 seconds. What's going on? Have I gotten a defective update? I've been reading that other users are running into the same problem, but one user reported a much inferior GPU to mine. Do I have to buy a server to run ACR Denoise? If so, it's really unusable.

2 replies

deejjjaaaa
Inspiring
April 26, 2023

Hi All

I noticed a post from Rikk Flohr (apparently an "Adobe Employee") saying that for Adobe's AI NR, a "6.5 Years old home-built system with GTX970 - 4 GB Card takes <90 seconds for a Canon 5D Mark IV file."

Got me curious. I have a discrete NVIDIA GTX 1070 Ti with 8 GB and the most recent NVIDIA drivers (not the gaming versions) installed. The card works without issues in ACR and DxO PL6.* (DeepPRIME XD), yet Adobe's NR takes 10-20 times longer than DxO's NR - and my card is "better" than Rikk Flohr's, so where might the problem be? I was assuming that Adobe's AI NR uses Tensor Cores on NVIDIA GPUs and is way, way slower without them, but the Adobe employee's claim that his far older card (also without Tensor Cores) can do AI NR on a 30 MP raw file in 90 seconds contradicts that.

I downloaded an ISO 12800 Canon 5D IV raw from Imaging Resource to a RAM disk (not even an SSD - a RAM disk).

ACR time = 8 minutes for AI NR (for comparison, Adobe's Enhance Details AI demosaic takes 2 seconds).

So what am I missing?!

The topic with the claim was locked: https://community.adobe.com/t5/camera-raw-discussions/denoise-ai-takes-10-minutes-to-render/td-p/13750965



D Fosse
Community Expert
April 27, 2023

@deejjjaaaa 

 

Here's what I suspect are typical timings for an *old* but otherwise decent GPU.

 

The one in question is a Quadro P2200, 5 GB, released in 2019, but originally released as the P2000 in 2017.

 

36 MP - 1:56

43 MP - 2:15

60 MP - 3:12

 

Other megapixel sizes can be extrapolated from those numbers, since time seems roughly proportional to megapixels. 30 MP, for instance, should land at about 90 seconds.

 

This is all around 6× slower than my other machine with an RTX 3060, which finishes 60 MP in 33 seconds.
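The "roughly proportional to megapixels" observation above can be sanity-checked with a few lines of arithmetic. This is just a sketch using the Quadro P2200 timings quoted in this thread; the per-megapixel rate is derived from those three data points, not anything Adobe publishes.

```python
# Quadro P2200 Denoise timings from the post above: megapixels -> seconds
# (1:56, 2:15, 3:12 converted to seconds).
timings = {36: 116, 43: 135, 60: 192}

# Average seconds per megapixel across the three measurements (~3.2 s/MP).
rates = [secs / mp for mp, secs in timings.items()]
avg_rate = sum(rates) / len(rates)

def estimate_seconds(megapixels: float) -> float:
    """Extrapolate Denoise time for this GPU, assuming linear scaling in MP."""
    return avg_rate * megapixels

# A 30 MP file extrapolates to roughly 96 seconds, close to the ~90 s
# figure Rikk Flohr reported for a Canon 5D Mark IV file.
print(round(estimate_seconds(30)))
```

The near-constant seconds-per-megapixel rate is what supports the linear extrapolation; if the rates differed wildly between file sizes, the 90-second prediction for 30 MP would not follow.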

D Fosse
Community Expert
April 27, 2023

Again, there was a watershed, of course - but the Adobe employee's 90-second timing shows that his GPU came after it, and mine is even newer than his. So mine should clearly be fully functional and deliver faster than 90 seconds on the same raw files from a Canon 5D Mark IV - but it does not... I fully expected mine to be slower without Tensor Cores, but his card doesn't have them either, so Tensor Cores are not the reason. I am narrowing this down to a very clear case: GTX 970/4 GB (his - older, less capable, yet AI NR is fast) vs GTX 1070 Ti/8 GB (mine - newer, more capable, same product line, yet AI NR is slow). Please do not bring RTX (with Tensor Cores) or Quadro into it...


Has anyone ever mentioned how big these files are? Megapixels matter here.

 

AFAIK the Quadro is old technology, fully comparable to a GTX x70/x80 from the same period. Quadros were never particularly powerful in terms of computing power - just considered more sturdy and reliable.

lucieneb30222720
Participant
April 25, 2023

So I was right. I need a server or something that costs over $10K. Thanks a lot. As I said, the feature is unusable.

TheDigitalDog
Inspiring
April 25, 2023
quote

So I was right. I need a server or something that costs over $10K. Thanks a lot. As I said, the feature is unusable.


By @lucieneb30222720

No, you were not right. 

For $10K, you can buy three MacBook Pro M1s like mine, which processes a 24.4 MP raw in 12 seconds.

And no, the product isn't unusable as you define your issue. It works; it's too slow for you. Big difference. 

But yes, your old and outdated GPU is why. You can wait 10 minutes, skip the new feature(s), or work with a different product.

For others who may view this, the bottom line: 

Eric Chan says you probably ought to use a GPU with 8 GB of memory for Denoise.

 

Author "Color Management for Photographers" & "Photoshop CC Color Management/pluralsight"