I was interested to try out the new Denoise AI in 12.3. It seemed to work pretty well in some preliminary tests, though one image produced a ton of green artifacts in what started out as a black sky (with a lot of noise).
The major issue for me, though, is that the module reports it will take 12 minutes or more to render the DNG file. I didn't time it, but it was certainly more than 10 minutes. I've watched a demo/tutorial, and that video showed expected times in the 10-to-15 SECOND range. Nothing else I use in LR has major delays.
My computer is 5 years old and won't support Windows 11, but nothing else takes an outrageous amount of time. Exporting DNG files for panoramas might take a few seconds, but nothing like this. Is this a processor issue for me, or are others experiencing this sort of delay?
I think it depends on the speed and amount of memory of your GPU. AI noise reduction is the most computationally intensive task in LrC. A 5-year-old computer most likely does not have the GPU power to do this in 10-15 seconds.
Checked some older 36 MP files, and the timings scale down in proportion. So it seems there is little fixed overhead, and timings depend more or less directly on pixel count.
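To put rough numbers on that, here is a small Python sketch of the linear model; the baseline figures are placeholders, not measurements, so plug in a timing you've actually observed on your own machine:

# Rough linear model: AI Denoise time scales with pixel count,
# with negligible fixed overhead (per the observation above).
# The baseline numbers here are hypothetical placeholders.
def estimate_denoise_seconds(megapixels: float,
                             baseline_mp: float = 36.0,
                             baseline_seconds: float = 300.0) -> float:
    """Estimate denoise time by scaling a measured baseline linearly."""
    return baseline_seconds * (megapixels / baseline_mp)

if __name__ == "__main__":
    for mp in (24.0, 36.0, 45.7):  # 45.7 MP is the Nikon Z9 sensor
        print(f"{mp:5.1f} MP -> ~{estimate_denoise_seconds(mp):5.0f} s")

If your own 36 MP baseline is faster or slower, the estimates shift proportionally, which matches what I'm seeing.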
I have a 5+ year old PC with 32 GB of RAM and an Intel i3 chip. Topaz DeNoise AI and Photo AI process the converted JPEG in under 30 seconds. So I assume the delay is also exacerbated by converting to and processing a DNG rather than a JPEG.
Which camera?
What GPU?
(We need context for your timings to have relevance - RAM and CPU aren't really the issue here.)
Sorry. Nikon Z9; no discrete GPU, just the onboard Intel graphics chip.
Might Adobe publish a list of tested and approved graphics cards that work well with these new LrC AI functions? I happen to be on a Windows 10 PC with an AMD Ryzen 7 3700X 8-core processor, 32 GB of RAM, and an Nvidia GeForce GTX 1660 graphics card with 6 GB of VRAM. I have recently updated to the latest Studio driver for the card. I am still having occasional issues with Denoise and some of the other AI functions: I often get a Runtime Error that shuts down the GPU and usually requires a restart of LrC. Also, the Performance tab under Preferences reports I have "Limited graphics acceleration." What is required to get full graphics acceleration and the most performance from LrC?
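For what it's worth, one quick way to confirm which card, VRAM amount, and driver version Windows is actually exposing is to query the NVIDIA driver directly. This is just a sketch, assuming nvidia-smi (which ships with the NVIDIA driver) is on your PATH:

import subprocess

# Ask the NVIDIA driver for the GPU name, total VRAM, and driver version.
# Assumes an NVIDIA card with nvidia-smi available on PATH.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,driver_version",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())

If the reported driver version or VRAM doesn't match what you expect, that would be worth ruling out before blaming LrC itself.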
Hello,
Not sure if this is the right thread, but all the other threads that might be related have been closed...
https://community.adobe.com/t5/lightroom-classic-discussions/ai-denoise-takes-4-8-minutes/td-p/13746...
However, I couldn't find any information about intermittent problems that only appear some of the time.
I also have intermittent problems with AI Denoise. The performance drops after denoising a number of images, from about 10-20 seconds per image to several minutes. When it slows down, it also starts flickering, as described in some other threads. The flickering does not appear as long as it works fast.
Restarting Lightroom solves the issue until it appears again... I have already read through some threads, and I think the AI Denoise feature is still buggy.
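In case it helps anyone reproduce this, here is a rough Python sketch of how I could watch VRAM usage while denoising, to see whether memory climbs until the slowdown and only clears after a restart. It assumes nvidia-smi from the NVIDIA driver is on PATH; stop it with Ctrl+C:

import subprocess
import time

# Poll VRAM usage every 10 s while running AI Denoise in LrC.
# If used memory climbs steadily and never drops back, that would be
# consistent with the slowdown-until-restart behaviour described above.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(10)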
I really hope for an update that will fix these problems soon.
I run Windows 11 on a Schenker XMG Core notebook with an Nvidia GTX 1660 Ti and the latest GPU drivers installed.
Lightroom also says that it only supports my GPU partially (at the moment). GPU usage is set to Auto.
Best regards,
Alex