In the most recent Lightroom Classic, I'm not able to get the GPU used for AI Denoise. Lightroom is using the GPU for exports and editing, the system info shows the GPU is found (Radeon RX 6800 XT), and it's using Metal. But when I run AI Denoise, all of my CPU cores run at full throttle and the GPU doesn't spin up. It used to take 3-4 seconds per image (24MP Fujifilm RAF); now it takes over a minute while my CPU is maxed out. I also see that the GPU is only pulling ~30W (basically idle), so I know it's not doing anything here. Is anyone else experiencing this? It seems like a bug.

Lightroom Classic version: 13.0.2 [ 202311290958-8ff975ea ]
License: Creative Cloud
Language setting: en-US
Operating system: Mac OS 13
Version: 13.6.1 [22G313]
Application architecture: x64
Logical processor count: 16
Processor speed: 3.4GHz
SqLite Version: 3.36.0
Built-in memory: 65,536.0 MB
Real memory available to Lightroom: 65,536.0 MB
Real memory used by Lightroom: 11,559.2 MB (17.6%)
Virtual memory used by Lightroom: 62,404.5 MB
Memory cache size: 29.5MB
Internal Camera Raw version: 16.0 [ 1677 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1102MB / 32767MB (3%)
Camera Raw real memory: 1358MB / 65536MB (2%)
Standard Preview Size: 1440 pixels
Displays: 1) 3840x2160
Graphics Processor Info:
Metal: AMD Radeon RX 6800 XT
Init State: GPU for Export supported by default
User Preference: Auto