After about 30 minutes of using Lightroom Classic, VRAM usage spikes to nearly all available and the program's performance absolutely chugs. When I first start it, it runs amazingly, very snappy, but this quickly deteriorates to being unusable and I have to restart the program. I've also noticed that other programs on the computer start to suffer, as there is no VRAM/GPU left for anything else, and at least once it has caused a full-on BSOD.
RTX 3080 - 1920x1080, Ryzen 3900X, 64GB DDR4 and on a Samsung 970 Evo+ SSD.
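For anyone wanting to confirm the ramp-up before restarting, logging VRAM usage over time makes the pattern easy to see. A minimal Python sketch, assuming the driver's nvidia-smi tool is on PATH (it normally is once a GeForce driver is installed):

```python
# Minimal sketch: poll nvidia-smi and log VRAM usage over time,
# to confirm the gradual climb described above.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(v) for v in out.split(", "))
    print(f"{time.strftime('%H:%M:%S')}  "
          f"{used} MiB / {total} MiB ({100 * used // total}%)")
    time.sleep(60)  # sample once a minute while working in LrC
```

Run it in a second terminal while working in LrC; a steady climb toward the card's ceiling would match the behavior described here.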
The issue you describe is known to the Lightroom Classic engineering team and is currently under investigation. Typically, the issue is more common on systems with 4GB or more of VRAM, and becomes progressively worse when the GPU is fitted with even larger amounts of VRAM. Unfortunately, there is no workaround at present.
BTW, I note that you've also submitted a report on the Lightroom Classic Feedback forum. This is helpful, as it is much more likely to be picked up by Adobe staff than would be the case in this forum. You 'may' also find your post gets merged into an existing thread already discussing the issue.
Unfortunately, there is no workaround at present.
By @Ian Lyons
I see this too, but I have found a workaround - uncheck "Use GPU for image processing".
This reduces normal VRAM usage from over 90% to around 25%, with no other discernible ill effects.
Now, I should stress that while I have the high VRAM usage, I haven't really had many performance problems. The only one that bothers me a bit is actually seen in Photoshop - image windows tend to hang and "ghost" for a second or two when closing them. That immediately stops when I uncheck as per above.
It is not necessary to do the same in ACR, and it still runs with full GPU use. This seems to be very Lightroom-specific.
I know this thread is almost two years old, but I can't for the life of me disable the second tickbox. Somehow it's greyed out. Lightroom chews through roughly 9 GB of VRAM when I'm in the Develop tab. It has even gotten to the point where I get a BSOD because I ran out of VRAM and my system needs to restart.
I'm on the latest version of LrC and my Nvidia Studio driver is up to date.
I'm using an RTX 3080 and Ryzen 9 7950X btw.
I know this thread is almost two years old, but I can't for the life of me disable the second tickbox. Somehow it's greyed out. Lightroom chews through roughly 9 GB of VRAM when I'm in the Develop tab. It has even gotten to the point where I get a BSOD because I ran out of VRAM and my system needs to restart.
I'm on the latest version of LrC and my Nvidia Studio driver is up to date.
I'm using an RTX 3080 and Ryzen 9 7950X btw.
By @Moritz White
If your Use Graphics Processor setting is set to Automatic, change it to Custom. Any change?
Mind you, such a new GPU should not behave that way.
It's already set to Custom.
Ah, perhaps. From:
"The GPU test carried out at launch sometimes fails (typically after an app or OS update). The result being that acceleration modes that should function don't. The solution is to locate the Camera Raw GPU Config.txt file."
and see Solution 3 in:
Troubleshoot graphics processor (GPU) and graphics driver issues | Lightroom Classic | Adobe
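For anyone who'd rather script Solution 3, here's a minimal sketch. The %APPDATA%\Adobe\CameraRaw\GPU location is an assumption based on the usual Camera Raw config layout; verify against the Adobe help page above before deleting anything:

```python
# Minimal sketch: locate and delete the Camera Raw GPU config file(s)
# so LrC recreates them on the next launch. The folder layout below
# is an assumption; confirm it against the Adobe help page first.
import os
from pathlib import Path

gpu_dir = Path(os.environ["APPDATA"]) / "Adobe" / "CameraRaw" / "GPU"

if gpu_dir.exists():
    # search per-application subfolders (e.g. "Adobe Lightroom Classic")
    for cfg in gpu_dir.rglob("Camera Raw GPU Config.txt"):
        print(f"Deleting {cfg}")
        cfg.unlink()
else:
    print(f"Not found: {gpu_dir}")
```

Close LrC before running it, then relaunch so the config files are regenerated.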
Okay, so I located the config file and deleted it, then relaunched Lightroom, but it's still greyed out.
I could also not verify that it's set to OpenGL, as there was no indicator in my Lightroom.
Okay, so I located the config file and deleted it, then relaunched Lightroom, but it's still greyed out.
I could also not verify that it's set to OpenGL, as there was no indicator in my Lightroom.
By @Moritz White
For your problem, changing to OpenGL is not the goal; having LrC recreate those config files upon restart is the point. And upon restart, LrC should be using DirectX, not OpenGL (on Windows). Solution 3 is normally for when LrC is mistakenly using OpenGL; those config files get deleted so as to be recreated upon LrC restart. In the current problem, OpenGL is probably not the cause - it is probably already DirectX - but those config files may need to be recreated, hence this method of doing so.
Please post your System Information as Lightroom Classic (LrC) reports it. In LrC, click Help, then System Info, then Copy, and paste that information into a reply. Please include everything from the first line down to and including the Plug-in Info; anything after that can be cut, as it is just dead space to us non-techs.
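If trimming the paste by hand is a chore, a tiny Python sketch can cut everything after the plug-in section. The "Config.lua flags" cut-off line is an assumption based on typical LrC dumps; adjust it if your version labels things differently:

```python
# Tiny sketch: trim an LrC System Info paste after the plug-in
# section, keeping only the part requested above.
import sys

for line in sys.stdin:
    print(line.rstrip("\n"))
    if line.startswith("Config.lua flags"):
        break  # everything below is the "dead space" mentioned above
```

Usage: save the System Info to a file and run `python trim_sysinfo.py < sysinfo.txt`.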
Lightroom Classic version: 12.2.1 [ 202303011008-5bfbce17 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Business Edition
Version: 10.0.19044
Application architecture: x64
System architecture: x64
Logical processor count: 32
Processor speed: 4.5GHz
SqLite Version: 3.36.0
CPU Utilisation: 3.0%
Built-in memory: 31886.4 MB
Dedicated GPU memory used by Lightroom: 5709.7MB / 10067.0MB (56%)
Real memory available to Lightroom: 31886.4 MB
Real memory used by Lightroom: 2946.3 MB (9.2%)
Virtual memory used by Lightroom: 9806.9 MB
GDI objects count: 914
USER objects count: 2863
Process handles count: 7803
Memory cache size: 33.2MB
Internal Camera Raw version: 15.2 [ 1381 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 907MB / 15943MB (5%)
Camera Raw real memory: 1176MB / 31886MB (3%)
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Standard Preview Size: 3840 pixels
Displays: 1) 2560x1440, 2) 3840x2160, 3) 3840x2160
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
Graphics Processor Info:
DirectX: NVIDIA GeForce RTX 3080 (31.0.15.2849)
Init State: GPU for Export supported by default
User Preference: GPU for Export enabled
Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic
Library Path: C:\Users\Morit\Pictures\Lightroom\Lightroom Catalog-v12.lrcat
Settings Folder: C:\Users\Morit\AppData\Roaming\Adobe\Lightroom
Installed Plugins:
1) AdobeStock
2) Flickr
3) Nikon Tether Plugin
4) Pixieset
Config.lua flags: None
(Just a quick comment on my own post up there that Moritz replied to: I no longer see this behavior with excessive VRAM usage. It just stopped at some point, I would guess when I replaced the GPU with a newer one.)
Graphics Processor Info:
DirectX: NVIDIA GeForce RTX 3080 (31.0.15.2849)
Init State: GPU for Export supported by default
User Preference: GPU for Export enabled
So it's still using DirectX - good.
Driver version is v528.49; per the NVIDIA website, if that is the Studio driver, then it is current. Hopefully good.
(fyi https://www.nvidia.com/download/index.aspx)
But you are still having issues, including the BSOD?
Have you inspected the computer for excessive dust bunnies? Dust at inlets, outlets, around the GPU, around the CPU? That might cause overheating.
Is this GPU air-cooled or water-cooled? Any issues with either?
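As an aside, the installed driver version can also be read directly, rather than decoding it from the DirectX string above. A small Python sketch, assuming nvidia-smi is on PATH:

```python
# Quick check of the installed NVIDIA driver version from a script.
import subprocess

ver = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"NVIDIA driver version: {ver}")  # e.g. "528.49"
```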
Yes, I recently switched from the Game Ready driver to the Studio driver and updated it a few days ago.
528.49 is a known bad driver version from Nvidia, identified as problematic by Photoshop engineering and discussed a bit in the Photoshop "bugs" forum.
Roll back or update.
528.49 is a known bad driver version from Nvidia, identified as problematic by Photoshop engineering and discussed a bit in the Photoshop "bugs" forum.
Roll back or update.
By @D Fosse
Link?
Sorry, it was 528.02. Not 528.49. My mistake. Still, updating the driver is the first thing to do in any case.
Here's one that came up in a search:
It's been several weeks and those threads are not easy to find now.
One more (Cory Shubert post buried deep):