I am considering an upgrade of my current PC. The alternatives are 1) a Windows-based setup with an i7-13700 CPU and an RTX 3060, or 2) an M2/M2 Pro based Apple setup.
I have seen some numbers for RTX 3060 performance with Adobe Denoise on 50-megapixel photos (under 1 minute), but have not found many M2 / M2 Pro numbers.
Any experiences on Adobe Denoise on M2 / M2 Pro CPUs?
From what I've seen, numbers for RTX 3060 and M2 are pretty much the same.
For Apple, the Neural Engine is currently bypassed by Adobe because of bugs. Once that is sorted out, it should pick up speed.
For Nvidia, timings improve dramatically as you move up to the higher and more expensive model numbers. I have an RTX 3060, and I get 33 seconds for 60 MP files from an a7R V. I sent a file to a friend with an RTX 3090, and he reported back 13 seconds, or 2.5x as fast.
My MacBook Pro M2 Max processed a 60 MP raw from a Sony a7R IV in 17 seconds, while an RTX 3060 did it in 30 seconds.
In the Lightroom Queen forum, Linwood Ferguson is collecting numbers on graphics-card Denoise performance using the same photo:
That's a bit faster than other M2 timings I've seen, adjusted for megapixel count. But that's fine; all the better.
The implied point was that an RTX 3060 is a very modestly priced GPU, about the price of a 4TB NVMe drive, just to put it in perspective. I bought it before I knew about Denoise and future Lr/ACR developments; had I known, I would have gone for a 3070 or 3080, if not a 3090.
It's all a cost/benefit tradeoff, and you decide where the break-even point is. My advice to anyone is an RTX 3060 and up, because it works; then you can go up as far as you're comfortable with.
My Mac Studio M1 Ultra has a 48-core GPU. The estimated time for a 45 MP Canon EOS R5 file is 10 seconds; the actual time is about 15 seconds.
I dug into older files from older cameras. Here's the breakdown for the RTX 3060:
60 MP (a7R V) - 33 seconds
42 MP (a7R III) - 23 seconds
36 MP (D810) - 19 seconds
12 MP (D700) - 8 seconds
I have assumed that the time is directly proportional to MP count, but perhaps it isn't a strictly linear relationship; I haven't pulled out my calculator for this.
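As a quick sanity check on the four timings above, a simple least-squares fit of time against megapixels (a rough sketch of my own, not anything from Adobe) suggests roughly half a second per megapixel plus a second or so of fixed overhead, so the relationship is very close to linear:

```python
# Hand-rolled least-squares fit t = slope * MP + intercept
# over the RTX 3060 timings listed above.
timings = {60: 33, 42: 23, 36: 19, 12: 8}  # megapixels -> seconds

mps = list(timings)
secs = [timings[m] for m in mps]
n = len(mps)
mean_mp = sum(mps) / n
mean_s = sum(secs) / n
slope = sum((m - mean_mp) * (s - mean_s) for m, s in zip(mps, secs)) / \
        sum((m - mean_mp) ** 2 for m in mps)
intercept = mean_s - slope * mean_mp

print(f"~{slope:.2f} s per MP, ~{intercept:.1f} s fixed overhead")
# -> ~0.52 s per MP, ~1.2 s fixed overhead
```

The predicted time for a 60 MP file comes out around 32.5 seconds, right in line with the measured 33.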
Anyway, as I've said before: turning this into a chase for seconds is pretty meaningless. Whether it takes 24 seconds or 28 seconds doesn't really matter in the grand scheme of things. It's a little while; it's all perfectly fine, it all works.
It gets much more painful when you're dealing with times that are orders of magnitude slower: three minutes, ten minutes, forty minutes. That's what you get with GPUs from, say, 2019 and earlier. The newer GPUs will all handle this well.
"I have assumed that this is directly proportional to MP count"
It is linear:
My current system: MacBook Pro 16" 2022, 32-core GPU | 16-core Neural Engine / 64 GB
A 24.2 MP Canon (R6 Mark II) file takes 12 seconds from start to finish. Part of that is saving the new DNG, but most of the processing is GPU-related.
My understanding is that M2 isn't going to be much of a factor, and that this will all get even faster (M1 or otherwise) once Apple/Adobe fix the bug that D Fosse mentioned.
I have seen some numbers of RTX 3060 performance with Adobe Denoise on 50 megapixel photos (< 1 minute), but have not found so many M2 / M2 Pro numbers.
By @Stein99
Even my base M1 Pro, a decidedly midrange previous-generation Apple Silicon processor, can do AI Denoise in 20–40 seconds (on a 24 megapixel Sony image). The other reports here, with the M2 and higher-end M1 (Max/Ultra), seem to get AI Denoise times down below 15 seconds, even with the Apple Neural Engine not being used because of that bug.
With AI Denoise under 15 seconds possible on both Mac and Windows, the decision should probably be made more on which OS you feel more comfortable with and what other software you need to run. For example, if you also do a lot of graphics or video editing, a Mac is fine; if you also do a lot of 3D and gaming, then the Nvidia GPU becomes a real advantage.
Not exactly the same hardware, but you might want to look at the RTX 3070 Ti. About 10 days ago, I did this test:
Today, before my new Nvidia RTX 3070 Ti card arrived, I did a carefully timed test on a 10,000 ISO image with lots of noise. The rig is an i7-8700K, 64 GB RAM and Win 10 Pro, with the noise reduction setting at 59. The first test was with the GTX 1660 Super in the computer, the second with the new RTX 3070 Ti:
1660 Super - expected time 1:00, actual time 1:10
3070 Ti - expected time 10 sec, actual time 11 sec
The exact same image was used for the comparison.
The 3070 Ti times are very similar to what I'm getting on the same image on my MBP 16" 64 GB M1 Max, which I principally use for travel.
The files are 46 MP Z9 images.
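Since the cards in this thread were tested on different cameras, normalizing each result to seconds per megapixel makes them roughly comparable (a quick sketch of my own using the timings reported above):

```python
# Denoise timings from this thread: card -> (megapixels, seconds).
results = {
    "GTX 1660 Super": (46, 70),   # 46 MP Z9, 1:10
    "RTX 3070 Ti":    (46, 11),
    "RTX 3060":       (60, 33),   # 60 MP a7R V
    "RTX 3090":       (60, 13),
}

# Normalize to seconds per megapixel for a camera-independent comparison.
for card, (mp, sec) in results.items():
    print(f"{card:>15}: {sec / mp:.2f} s/MP")
```

The 1660 Super lands around 1.5 s/MP, while the 3060 is roughly 0.55 s/MP and the 3070 Ti and 3090 sit near 0.22–0.24 s/MP, which matches the observation that the 3070 Ti and the M1 Max are in the same ballpark.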
MAY 17, 2023 3:18 AM PDT
Thanks for all input on this topic!