My new Mac mini M2 Pro with 32 GB of memory is taking 45 seconds to denoise a 43 MP image. That seems excessive. Has anyone had experience running Denoise on an Apple M2 Pro?
Todd
That might be slightly slower than expected, but maybe not out of line. The M2 Pro is only the second level up from the base M2; the M2 Max and M2 Ultra are the real high performers for AI Denoise.
From observing many reports, it generally seems to work out like this for a high-megapixel raw file:
Intel Macs and PCs that are more than a few years old: Several minutes per frame
New-ish PC graphics cards, and mid-level Apple Silicon Macs: 1 minute per frame or more
The most expensive PC graphics cards and Apple Silicon Macs: 10 to 25 seconds per frame
All of those times are influenced by the number of pixels in the frame (more is slower) and, on the Mac, by the number of GPU cores in your specific configuration (more is faster).
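If you want a rough way to compare configurations, here is a crude back-of-the-envelope sketch in Python (purely illustrative): it assumes denoise time scales linearly with megapixels and inversely with GPU core count, which is a simplification; real results also depend on the GPU generation, thermals, and the Lightroom version.

def estimate_denoise_seconds(megapixels, gpu_cores, ref_seconds, ref_megapixels, ref_cores):
    # Scale a time measured on one configuration to another, assuming
    # time ~ megapixels / GPU cores (a crude approximation).
    return ref_seconds * (megapixels / ref_megapixels) * (ref_cores / gpu_cores)

# Example: taking 45 s for a 43 MP file on a 16-core GPU as the reference point,
# a 60 MP file on a 30-core GPU comes out around 33 s by this crude model.
print(round(estimate_denoise_seconds(60, 30, ref_seconds=45, ref_megapixels=43, ref_cores=16)))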
Some references for you to compare with other configurations:
The graph below is from a video by ArtIsRight on YouTube. He has been running consistent tests as Mac M1/M2/M3 models have been released. The link below should jump to the point in the video where the test in the graph was done. The red marks on the table are his, made during his presentation.
https://youtu.be/SI8qCkGmI-w?si=RPwfABiua7hV0Lma&t=811
There is also a very long thread about AI Denoise and GPUs at the Lightroom Queen forums; it includes a table comparing various GPUs, an M2 Max among them. Everyone in the thread runs AI Denoise on their Mac or PC using the same 60 megapixel test file provided at the beginning of the thread. You can try that file too and see how your results compare with theirs.
"with 32 MB of memory"
Not really relevant. What are your GPU details?
Adobe hasn't yet enabled the use of the Neural Engine on Apple Silicon processors, so Denoise is not using the full speed of these chips. That said, I have an M1 Max and it denoises 45 MP images (Z7 camera) in 15-20 seconds. I'm surprised an M2 Pro would be 3x slower, but the M1 Max has many more GPU cores than the M2 Pro does, so I guess it's possible. I also see that the biggest update between the M1 and M2 is the speed of the Neural Engine, which, as said before, Adobe unfortunately doesn't use yet. Topaz uses the Neural Engine for its denoise and that might be why it is so much faster, but I prefer the quality of Adobe's AI denoise.
Apple M2 Pro with 10‑core CPU, 16-core GPU, 16‑core Neural Engine
By @rtcary
GPU cores are what really matter for AI Denoise right now. Art's chart shows an M2 Pro with a 19-core GPU turning in a time of 40 seconds for his 36MP test file. Based on that, your 45 seconds for a slightly larger 43MP image on an M2 Pro with slightly fewer GPU cores (16) seems consistent with what he got.
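As a crude sanity check, assuming time scales with megapixels and inversely with GPU cores (only an approximation, not how Denoise actually schedules work):

# Scale Art's 19-core M2 Pro data point (40 s for 36 MP) to a 16-core M2 Pro and a 43 MP file.
art_seconds, art_megapixels, art_gpu_cores = 40, 36, 19
your_megapixels, your_gpu_cores = 43, 16
print(art_seconds * (your_megapixels / art_megapixels) * (art_gpu_cores / your_gpu_cores))  # ~56.7 s

So 45 seconds on the 16-core M2 Pro is actually a little better than that naive scaling would predict.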
If you were hoping for significantly faster AI Denoise performance and are thinking about exchanging the Mac, what you would need is more GPU cores. Art's chart shows it isn't a matter of having "the latest": M1 vs. M2 vs. M3 matters less than the number of GPU cores. To cut your AI Denoise time by at least a third compared with a 16-core M2 Pro, the chart suggests looking at the M1/M2/M3 configurations that finish his test file in 20–30 seconds. If you could find a great holiday blowout deal on one of those configurations, maybe that would be a way to go.
Or, if that would cost too much, you can wait, because it's also possible that Adobe will find ways to cut AI Denoise time on the same hardware. That's plausible because AI Denoise is currently slower than some of its direct competition, and because bugs in the Apple Neural Engine are preventing its use for AI Denoise, a task it would otherwise be a natural fit for. If Adobe optimizes AI Denoise to be more competitive, and if Apple fixes its Neural Engine bugs, AI Denoise could get faster, but as users we don't know when that might happen or by how much.
Also to put the M2 Pro in context, I ran AI Denoise on the 60MP test file I mentioned earlier through my old 2018 Intel MacBook Pro, and it took three minutes. Apple Silicon has been a major boost for Mac GPUs.
For a basis of comparison, I have a Mac Studio with an M2 Ultra chip. As a sports photographer who shoots in a lot of low-light gyms and fields and works on a deadline, LR Denoise performance is important to me. File size is a major factor in the speed equation. I use Nikon Z9 High Efficiency RAW large files which, because of the Tico compression magic, are only 22 MB from a 46MP camera. Uncompressed RAW doubles the file size and -- nearly doubles the LR Denoise speed. On average with my 22 MB RAW files and an M2 Ultra, I can Denoise an image at the default 50 in ~17 seconds. I did a test batch of 100 pictures and that took ~22 minutes. You might want to enable GPU acceleration in Lightroom's preferences if your Mac is spec'ed to take advantage of that feature. I find it helps a little -- not as much as I hoped, but every little bit helps.
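For what it's worth, the batch arithmetic works out to a bit less per frame than the single-image run:

single_image_seconds = 17        # my single-image time at the default 50
batch_minutes = 22               # my 100-image batch
print(batch_minutes * 60 / 100)  # 13.2 s per frame in the batch

so batch runs seem to amortize a little per-image overhead, at least in my case.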
High Efficiency RAW large files which because of the Tico Compression magic are only 22 MB from a 46MP camera. Uncompressed RAW doubles the file size and -- nearly doubles the LR Denoise speed.
By @Mike364044071xvr
That's very counter-intuitive - everywhere else in life, any compression increases processing time (it has to be decompressed before it can be processed).
Not that I doubt it, just curious (I don't use Nikon cameras). One explanation could be that decompression happens after initial processing, although I have no idea how that could work.
EDIT - oh, OK, doubles speed, not doubles time. Then everything is fine and the world is back on track 😉