
Lightroom not releasing VRAM with "Use GPU for image processing" enabled

New Here, Jan 01, 2023


With "Use GPU for image processing" checked, Lightroom uses more than 75% of available VRAM and will not release it after completing edits. Specific issue: transferring RAW files to DXO Photolab for DeepPrime noise reduction results in processing times going from 7 seconds for a 20MB file to six or seven times that.

 

Lightroom version: 12.1

OS: Windows 10 22H2

GPU: AMD RX 5500, 4GB VRAM

 

To reproduce the issue:

1) With "Use GPU for image processing" and "Use GPU for export" in Lightroom unchecked, send file to Photolab via plug-in.

2) Process the file in Photolab with GPU enabled, apply DeepPrime, and export a DNG back to Lightroom. Note the export time (in my case, 7 seconds).

3) Check "Use GPU for image processing" in Lightroom and make adjustments to any file in the Develop module. Return to the Library module.

4) Export a new file via plug-in to Photolab. Repeat step 2. Export time back to Lightroom (or to another program, or to disk) is now 42 seconds or longer.

 

Windows performance monitor shows Lightroom using around 2GB VRAM with just "Use GPU for display" checked. With "Use GPU for image processing" enabled, VRAM usage is 3GB or more and is not released until GPU image processing is disabled or until Lightroom is closed.
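For anyone who wants to log this outside of Performance Monitor: Windows 10 exposes per-process GPU memory counters that `typeperf` can dump to CSV. Below is a minimal Python sketch of parsing that output; the counter path is the standard Windows 10 one, but the sample CSV line, the PID, and the instance name are made-up placeholders, not values from my machine.

```python
# Sketch: parse typeperf CSV output of the per-process GPU memory counter.
# On Windows you would first run something like:
#   typeperf "\GPU Process Memory(*)\Dedicated Usage" -sc 1 -o vram.csv -y
# Instance names embed the process ID; counter values are in bytes.
import csv
import io

# Placeholder sample of typeperf's CSV format: a timestamp column,
# then one column per counter instance (values in bytes).
sample = io.StringIO(
    '"(PDH-CSV 4.0)","\\\\PC\\GPU Process Memory(pid_1234_luid_0_phys_0)'
    '\\Dedicated Usage"\n'
    '"01/01/2023 12:00:00.000","3221225472"\n'
)

reader = csv.reader(sample)
header = next(reader)   # counter instance names
row = next(reader)      # one sample of values

for name, value in zip(header[1:], row[1:]):
    gb = float(value) / 2**30   # bytes -> GiB
    print(f"{name}: {gb:.2f} GB")
```

Logging a sample before and after toggling "Use GPU for image processing" makes the jump from ~2GB to 3GB+ easy to capture in a file rather than eyeballing the Performance Monitor graph.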

Topics: Windows

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
New Here, Jan 01, 2023


I meant to post this as a bug, but for some reason it got sent to Discussions instead.

Community Expert, Jan 01, 2023


You stated: "3) Check 'Use GPU for image processing' in Lightroom and make adjustments to any file in the Develop module. Return to the Library module."

As far as I am aware, when you make a change in LrC Preferences you need to exit and restart the application for the change to take effect.

 

Regards, Denis: iMac 27" mid-2015, macOS 11.7.10 Big Sur; 2TB SSD, 24GB RAM, 2GB GPU; LrC 12.5, Lr 6.5, PS 24.7, ACR 15.5; Camera OM-D E-M1

Community Expert, Jan 01, 2023


I don't believe what you're seeing is a bug. Instead, it's a reminder of how VRAM-hungry LrC is when the GPU is operating in Image Processing mode, and it's even hungrier when full acceleration is enabled.

 

You can check the LrC System Requirements at https://helpx.adobe.com/uk/lightroom-classic/system-requirements.html

 

Note that 4GB of VRAM is the minimum for Image Processing acceleration, and 8GB is the minimum when full acceleration is enabled. You may also wish to scroll down to the Mac requirements and see that for shared-memory systems (i.e. M1/M2) the minimum for full acceleration is doubled to 16GB.
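To make those thresholds concrete, here is a small Python sketch encoding the minimums quoted above (4GB for Image Processing, 8GB for full acceleration, doubled on shared-memory systems). The function name and structure are my own illustration, not anything from Adobe:

```python
# Minimums quoted from Adobe's LrC system-requirements page:
# 4GB VRAM for Image Processing acceleration, 8GB for full
# acceleration; on shared-memory systems the minimum is doubled.
MIN_VRAM_GB = {"image_processing": 4, "full_acceleration": 8}

def supported_modes(vram_gb: float, shared_memory: bool = False) -> list[str]:
    """Return the GPU acceleration modes a card meets the stated minimum for."""
    scale = 2 if shared_memory else 1  # shared-memory minimum is doubled
    return [mode for mode, need in MIN_VRAM_GB.items() if vram_gb >= need * scale]

print(supported_modes(4))         # the OP's RX 5500 with 4GB
print(supported_modes(8))         # a typical 8GB discrete card
print(supported_modes(16, True))  # e.g. a 16GB shared-memory M1/M2
```

So a 4GB card like the RX 5500 sits exactly at the floor for Image Processing mode, with no headroom left for other GPU applications.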

New Here, Jan 01, 2023

It appears, from reading other threads on this topic, that the issue is not so much the amount of VRAM available (in some cases 8GB or more) but that Lightroom seizes the lion's share and doesn't release it when another application calls for it, even when Lightroom is not actually using it. For instance, in Windows performance monitor, you can see VRAM usage climb when Photolab grabs whatever it can while using DeepPrime, then revert to baseline when the operation is finished. Different programs with different coding, but Lightroom's behavior here hardly seems optimal.
