I was interested to try out the new Denoise AI in 12.3. It seemed to work pretty well in some preliminary tests, though one image produced a ton of green artifacts in what started out as a black sky (with a lot of noise).
The major issue for me, though, is that the module reports it will take 12 minutes or more to render the DNG file. I didn't time it, but it was certainly more than 10 minutes. I've watched a demo/tutorial video that showed expected times in the 10 to 15 SECOND range. Nothing else I use in LR has major delays.
My computer is 5 years old and won't support Windows 11, but nothing else takes an outrageous amount of time. Exporting DNG files for panoramas might take a few seconds, but nothing like this. Is this a processor issue for me, or are others experiencing this sort of time issue?
I think it depends on the speed and amount of memory of your GPU. The AI noise reduction task is the most computationally intensive tool in LrC. A 5 year old computer most likely does not have the GPU power to do this in 10-15 seconds.
Thanks to all. This is the answer I was expecting, though I'd hoped for a different result.
My computer runs an Intel i5 @ 2.5 GHz with 20 GB of RAM. Nothing else seems to take a long time to complete inside LR, though it is noticeably slower than what I see others experience in editing demonstrations. Topaz AI is slow, sometimes taking 45 seconds on my R6 RAW files, but 13 minutes with not much else affected just seemed wrong. I've been considering buying a new machine. Five years does seem to be the life span of a computer, even with no problems encountered.
Yep, what DJ said.
For context, my (Win 10) PC takes about a minute to render Canon R7 cRAW files - this is a machine with a far-from-cutting-edge Radeon RX 580 8 GB - so it's probably safe to assume that your GPU is markedly less powerful than that.
As a further point of reference, the same machine will spit out a tiff of the same file from On1 Photo Raw 2023 in about 10-15 seconds, and the results are significantly better.
(Why Adobe insists on creating a DNG file in preference to writing a tiff or a jpeg baffles me. It means that I can no longer go from RAW file to finalised output within Lr in one run - which adds precisely nothing of value to the experience.)
Yes, the GPU is pretty crucial here. I've run it a few times on two machines with different GPUs but otherwise almost identical hardware.
One machine has an RTX 3060 with 12 GB, and it takes about 15 - 20 seconds on 45-60 MP raw files.
The other has an older Quadro P2200 with 5 GB, and here it takes 2 - 3 minutes on the same files.
Everything works without problems, but the speed difference is dramatic.
There are GPUs, and then there are GPUs with cores dedicated to machine learning / AI tasks - for example, Tensor cores on more recent NVIDIA GPU cards and the Neural Engine on Apple Silicon Macs. The ML / AI cores make a huge difference to the time taken for any of the AI-based tasks in LrC / LrD and ACR.
What does the above mean?
Take a typical first generation M1 Mac from late 2020 - it has 8 GPU cores and 16 Neural Engine cores. Time to Denoise a Canon EOS R5 file is around 50 seconds. Now take the most powerful of the M1 generation Macs from mid 2022 - the M1 Ultra has 48 GPU cores and 32 Neural Engine cores, and can process the same EOS R5 file in 15 seconds.
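Just to put those two figures side by side - a back-of-the-envelope calculation on the numbers quoted above, nothing measured beyond them:

```python
# Rough scaling check on the M1 vs M1 Ultra figures quoted above.
m1 = {"gpu_cores": 8, "denoise_seconds": 50}         # base M1, Canon EOS R5 file
m1_ultra = {"gpu_cores": 48, "denoise_seconds": 15}  # M1 Ultra, same file

core_ratio = m1_ultra["gpu_cores"] / m1["gpu_cores"]           # 6.0x the GPU cores
speedup = m1["denoise_seconds"] / m1_ultra["denoise_seconds"]  # ~3.3x faster

print(f"{core_ratio:.1f}x the GPU cores gives roughly {speedup:.1f}x the speed")
# Far from linear - memory bandwidth and the rest of the pipeline matter too,
# so GPU core count alone doesn't tell the whole story.
```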
I need to correct myself, as the information provided re Denoise using the Neural Engine is not correct. Currently, the Apple Neural Engine found in M1/M2 silicon Macs is not used by Denoise. It's expected that performance will be even better when Denoise does use the Apple Neural Engine.
There is no change to the above comments regarding the use of Tensor cores in NVIDIA cards. That is, Tensor cores are used by Denoise.
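For anyone wondering whether their own NVIDIA card has Tensor cores at all: they arrived with compute capability 7.0 (Volta), so a quick hardware-side check is to read the card's compute capability. A minimal sketch, assuming PyTorch with CUDA support is installed - this only inspects the GPU and says nothing about what Lightroom itself does with it:

```python
# Quick check: does this NVIDIA GPU have Tensor cores?
# (Hardware-side only - it doesn't tell you how Lightroom uses the card.)
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    # Tensor cores first appeared with compute capability 7.0 (Volta);
    # RTX-class consumer cards (Turing and later) are 7.5 and up.
    has_tensor_cores = major >= 7
    print(f"{name}: compute capability {major}.{minor}, "
          f"Tensor cores: {'yes' if has_tensor_cores else 'no'}")
else:
    print("No CUDA-capable GPU detected.")
```

By that measure, the RTX 3060 mentioned earlier (compute capability 8.6) has Tensor cores while the Quadro P2200 (6.1, Pascal) does not, which is consistent with the timing gap reported above.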
Where did you get this information? The original article says:
Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.
Apparently, an issue on Apple's side means that the Neural Engine is not used by Adobe Denoise. When the issue is addressed, Adobe Denoise is ready to take advantage of the Neural Engine.
@Ian Lyons' information is correct regarding the Neural Engine on Apple devices.
Here is a long thread on the DxO forum that has been going for 6+ months about color problems when using the Apple Silicon Neural Engine, but not when using the Apple Silicon GPU:
https://feedback.dxo.com/t/deepprime-xd-introduces-purple-overlay-chromatic-aberration/29191/17
Is this the Neural Engine problem that Adobe found and is waiting for Apple to fix?
DxO even has a big red notice about the Apple Neural Engine bug.
I don't know the specifics, only that there is a bug related to the Apple Neural Engine that Adobe avoids by not using the Neural Engine.
I don't use DxO products and, until you posted the link, had no knowledge of the discussion.
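For context on what "not using the Neural Engine" can look like in practice: apps whose models run through Apple's Core ML can restrict which compute units Core ML is allowed to use. A minimal sketch using the coremltools Python package - purely illustrative, since Adobe hasn't said whether Denoise is built on Core ML, and "denoise.mlmodel" is a made-up file name:

```python
# Illustrative only: restricting a Core ML model to CPU + GPU so the
# Neural Engine is never used. "denoise.mlmodel" is a hypothetical model,
# not anything shipped by Adobe or DxO.
import coremltools as ct

model = ct.models.MLModel(
    "denoise.mlmodel",
    compute_units=ct.ComputeUnit.CPU_AND_GPU,  # skip the Neural Engine
)
# The default, ct.ComputeUnit.ALL, lets Core ML schedule work on the
# Neural Engine as well - which is where the Ventura bug would bite.
```

That kind of switch is presumably what lets a vendor route around a Neural Engine bug while waiting for an OS-level fix.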
DxO has been waiting 6+ months for Apple to fix this Neural Engine bug. Apparently it was introduced in Ventura, because people in the DxO forum say it works properly in Monterey.
Apple is notorious for not fixing bugs; they just hang around for years, release after release. I hope this is not one of those cases. I assume Adobe is higher profile and has more pull with Apple to get Neural Engine bugs fixed, though. Fingers crossed.
Has anyone heard any update on this problem with the Apple Neural Engine that prevents Adobe from using it with Lightroom Denoise AI?
On another forum, someone with DxO DeepPrime NR reported that using the Neural Engine on his Apple Silicon Mac is 4 times faster than using the GPU! Maybe if Lightroom Denoise AI could use the Neural Engine, there would be a similar speed-up for us.
Since Adobe Denoise doesn't currently use the Neural Engine, anyone putting a figure on the speed-up would be guessing.
In April (up above), you @Ian Lyons reported here that Lightroom is unable to use the Neural Engine because of a problem on the Apple side. @Rikk Flohr: Photography then confirmed that info. Did you determine that the Neural Engine is not used based on your own experimentation? How or where did you and Rikk get this info? It completely contradicts the official Adobe statement written by Eric Chan in his article:
Denoise demystified
https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified
Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.
Have you seen any updated info about this? I have not seen anything from Adobe or Apple that provides any additional info, but if someone has any, please share it. Thank you.
"I have not seen anything from Adobe"
I am an Adobe employee. You've been given updated info by Ian, which I confirmed.
Speaking generally, there are sometimes late-breaking issues that cause delays in the deployment of various subsets of features. There is nothing in the statement you quoted above that precludes the existence of an issue or a pending resolution.
Ventura 13.4 was released yesterday. I can find no mention of any Neural Engine bug fixes. If this problem has not been addressed, this is the 7th Ventura update without a fix.
Apple releases macOS Ventura 13.4 with new sports-related features
https://9to5mac.com/2023/05/18/macos-ventura-13-4-now-available/
When is Adobe going to officially comment on this situation for Mac users of Lightroom and Photoshop?
You've already been told - Rikk is Adobe, so his comments are official Adobe comments.
At least do him the simple courtesy of reading what he writes.
Adobe and DxO have found bugs in the Neural Engine, so they are unable to use it. DxO PhotoLab could use it up through Monterey, but they found a bug in Ventura over 6 months ago, so they warn people to use the much slower GPU. Adobe has also found a bug (unknown whether it is the same one DxO found or another), so they have disabled the code in Lightroom that uses the Neural Engine for all versions of macOS. Both Adobe and DxO say they have been waiting for Apple to fix the problems. There have been 6 Ventura updates since DxO found the problem, but none of those updates included a fix from Apple.
Adobe info concerning the Neural Engine problem(s):
Eric Chan, one of the longtime senior Adobe software engineers, wrote this article about Denoise AI:
Denoise demystified
https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified
Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.
But then Ian Lyons wrote:
https://community.adobe.com/t5/lightroom-classic-discussions/denoise-ai-in-12-3/m-p/13739400
Currently, the Apple Neural Engine found in M1/M2 silicon Macs is not used by Denoise. It's expected that performance will be even better when Denoise does use the Apple Neural Engine.
There is no change to the above comments regarding the use of Tensor cores in NVIDIA cards. That is, Tensor cores are used by Denoise.
Apparently, an issue on Apple's side means that the Neural Engine is not used by Adobe Denoise. When the issue is addressed, Adobe Denoise is ready to take advantage of the Neural Engine.
And then Rikk Flohr confirmed it:
Ian Lyons' information is correct regarding the Neural Engine on Apple devices.
Rikk Flohr - Customer Advocacy: Adobe Photography Products
Here is what DxO tells their users (see the red notice mentioned above).
PC/NVIDIA/Windows users do get the full performance from Lightroom/Photoshop and DxO PhotoLab. Unfortunately, we Mac users get much lower performance because the Neural Engine cannot be used. Apple Silicon DxO users report that using the Neural Engine is 4 times faster than using the GPU. DxO DeepPrime and Lightroom Denoise AI need lots of computing power, so not being able to use the very fast Neural Engine is a big disadvantage for Mac users.
@Rikk Flohr: Photography: Why did you create a duplicate post for me here and then delete my new thread?
Why is Adobe trying to keep this Neural Engine issue hidden, buried deep inside an unrelated thread? This should be a separate thread, so I created one. Please restore my new thread and feel free to delete all my posts in this unrelated thread. Thank you.
Your original post on this issue was made in this thread. Your duplicate posts will be merged here as well. There is no nefarious agenda to bury your posts. This is the way duplicate posts are handled on the forum.
I see you are also copying and pasting the same post on lightroomforums.net. I will leave it to the Community Experts and Legends to advise you on the netiquette of this cross-posting practice.
As this thread is about your response to an Apple issue, I would recommend posting in an Apple forum. Have you? What did they say?
Lastly, I would recommend you review the Terms of Use link at the bottom of the forum. It may guide you toward better results in your posting.
As pointed out, it is the GPU that matters, and whether it has dedicated AI/ML cores. On my M1 Max it indeed takes 10-15 seconds for a Z7 raw file, but the GPU cores and dedicated AI/ML cores are crazy on those machines. You're not going to see that with anything more than a few years old, unfortunately. Also, the quality of the denoise is light years (sorry for the pun) ahead of Topaz Denoise, or even Photo AI on raw files, in my testing, with far fewer artifacts. Quite a bit slower, but far better. Adobe's Enhance includes the demosaic step in the calculations, which is one reason it is slower but produces much better quality.
I have to revise those numbers above slightly (repeated with a stopwatch): 60 MP at 33 seconds.
When I get into work later, I'll try the older Quadro P2200 GPU on that machine; I expect several minutes there on the same files.
EDIT: Indeed. 3 minutes and 15 seconds, on the very same file.
Conclusion: you may not need a very fancy GPU, but you need a new one.