A little benchmark testing in search of the best GPU. I was interested in Lightroom CC AI Denoise processing times with different Nvidia GPUs.
Test Method:
Images: DPReview's Sony A7R V 60MP RAW files. Five cycles per test.
System: Intel i7-6700K @ 4.0GHz, 32GB DDR4-2400 RAM, WD Black 1TB M.2 SSD, Windows 10, 27" 1440p display, Antec P190 case (16 years old) with dual PSUs: 550W + 650W (GPU only) = 1200W total
GTX 1060 6GB GDDR5: 1 pic: 159s | 10 pics: 1569s | Power: idle 108W, average 234W, peak 279W
RTX 3060 OC 12GB GDDR6: 1 pic: 32.08s | 10 pics: not tested | Power: not tested
RTX 3060 Ti 8GB GDDR6X: 1 pic: 26.90s | 10 pics: not tested | Power: not tested
RTX 3070 OC 8GB GDDR6: 1 pic: 25.18s | 10 pics: 221.73s | Power: idle 117W, average 378W, peak 585W
RTX 4060 Ti 8GB GDDR6: 1 pic: 26.97s | 10 pics: 247.62s | Power: idle 108W, average 288W, peak 369W
RTX 4070 12GB GDDR6X: 1 pic: 20.08s | 10 pics: 180.2s | Power: not tested
RTX 4070 OC 12GB GDDR6X: 1 pic: 19.74s | 10 pics: 175.61s | Power: idle 117W, average 324W, peak 414W
RTX 4070 Ti OC 12GB GDDR6X: 1 pic: 17.29s | 10 pics: 148.81s | Power: idle 117W, average 369W, peak 441W
RTX 4080 OC 16GB GDDR6X: 1 pic: 13.88s | 10 pics: 120s | 422 pics (my own 60MP images): 5330s | Power: idle 126W, average 423W, peak 576W | Task Manager: CPU average 58%, memory 40%, GPU 7%, power usage high
Besides the Denoise process speeding up on the higher-end GPUs, the refresh speed of the 60MP image also improves. When using the masking brush at 100% zoom and navigating around the 60MP image, the redraw is almost instantaneous with the RTX 4070 and above, while the other cards constantly take a second or even a few seconds to refresh from the pixelated preview, which makes the whole editing experience much more fluid and pleasant. Even though some GPUs consume less wattage, they also take much longer to process, so that advantage disappears, especially since I often process 50-200+ images at a time.
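To make that power-versus-time trade-off concrete, here is a rough back-of-the-envelope sketch (my own illustration in Python, not part of the original test) that turns the 10-image batch times and average wall-power figures above into energy per image. It assumes the average draw is constant over the whole batch and does not subtract the idle baseline, so it estimates total system energy rather than GPU-only energy.

```python
# Rough energy-per-image estimate from the 10-image batch results posted above.
# Assumption: average wall power is roughly constant across the batch;
# idle baseline is NOT subtracted, so this is total system energy, not GPU-only.

results = {
    # GPU name: (10-image batch time in seconds, average wall power in watts)
    "GTX 1060 6GB":     (1569.00, 234),
    "RTX 3070 OC 8GB":  ( 221.73, 378),
    "RTX 4060 Ti 8GB":  ( 247.62, 288),
    "RTX 4070 OC 12GB": ( 175.61, 324),
    "RTX 4070 Ti OC":   ( 148.81, 369),
    "RTX 4080 OC 16GB": ( 120.00, 423),
}

for gpu, (batch_seconds, avg_watts) in results.items():
    joules = batch_seconds * avg_watts      # energy used for the 10-image batch
    wh_per_image = joules / 3600 / 10       # convert joules to watt-hours per image
    sec_per_image = batch_seconds / 10
    print(f"{gpu:18s} {sec_per_image:7.1f} s/image  {wh_per_image:5.2f} Wh/image")
```

Run against the numbers above, the GTX 1060 works out to roughly 10 Wh per image, while the RTX 4080 comes in around 1.4 Wh per image, so the faster cards actually use less energy per image despite their higher instantaneous draw.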
I hope the raw data will be helpful to anyone who needs it.
Awesome info! Thank you for posting your tests!