With "Use GPU for image processing" checked, Lightroom uses more than 75% of available VRAM and does not release it after completing edits. Specific issue: transferring RAW files to DXO Photolab for DeepPrime noise reduction causes processing times to go from 7 seconds for a 20MB file to six or seven times that.
Lightroom version: 12.1
OS: Windows 10 22H2
GPU: AMD RX 5500, 4GB VRAM
To reproduce the issue:
1) With "Use GPU for image processing" and "Use GPU for export" in Lightroom unchecked, send file to Photolab via plug-in.
2) Process the file in Photolab with GPU enabled, apply DeepPrime, and export a DNG back to Lightroom. Note the export time (in my case, 7 seconds).
3) Check "Use GPU for image processing" in Lightroom and make adjustments to any file in the Develop module. Return to the Library module.
4) Export a new file via plug-in to Photolab. Repeat step 2. Export time back to Lightroom (or to another program, or to disk) is now 42 seconds or longer.
Windows performance monitor shows Lightroom using around 2GB VRAM with just "Use GPU for display" checked. With "Use GPU for image processing" enabled, VRAM usage is 3GB or more and is not released until GPU image processing is disabled or until Lightroom is closed.
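For anyone who wants to watch this outside the Performance Monitor GUI: the same data is exposed through the Windows "GPU Process Memory" performance counters, so a short PowerShell loop can log per-process dedicated VRAM while you run the repro steps. This is only a sketch; the sample interval and count are arbitrary, and counter instance names (which embed the process ID, e.g. "pid_1234_luid_...") vary by system.

```shell
# Sketch: sample per-process dedicated VRAM every 5 seconds, 12 samples,
# and show the top five consumers each time. Values are reported in bytes,
# so convert to MB for readability.
Get-Counter '\GPU Process Memory(*)\Dedicated Usage' -SampleInterval 5 -MaxSamples 12 |
  ForEach-Object {
    $_.CounterSamples |
      Where-Object { $_.CookedValue -gt 0 } |
      Sort-Object CookedValue -Descending |
      Select-Object -First 5 InstanceName,
        @{ Name = 'DedicatedMB'; Expression = { [math]::Round($_.CookedValue / 1MB) } } |
      Format-Table -AutoSize
  }
```

Matching the "pid_..." instance for Lightroom.exe against Task Manager before and after step 3 should show the same jump from roughly 2GB to 3GB+ that Performance Monitor reports, and whether it ever drops back.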
I don't believe what you're seeing is a bug. Instead, it's a reminder of how VRAM-hungry LrC is when the GPU is operating in Image Processing mode, and it's hungrier still when full acceleration is enabled.
Note that 4GB of VRAM is the minimum for Image Processing acceleration, and 8GB is the minimum when full acceleration is enabled. You may also wish to scroll down to the Mac requirements and see that for shared-memory systems (i.e. M1/M2) the minimum for full acceleration is doubled to 16GB.
It appears, from reading other threads on this topic, that the issue is not so much the amount of VRAM available (in some cases 8GB or more) but that Lightroom seizes the lion's share and doesn't release it when another application calls for it, even when Lightroom is not actually using it. For instance, in Windows performance monitor, you can see VRAM usage climb when Photolab grabs whatever it can while running DeepPrime, then revert to baseline when the operation is finished. They're different programs with different code, but Lightroom's behavior here hardly seems optimal.