
Regarding my Lightroom Classic memory utilization surges.

Participant, Jan 10, 2024


I wrote some time ago that after changing my graphics card from a GTX 1660 to a Gigabyte RTX 3060, and at about the same time updating to the latest Lightroom Classic version, 13.1, I started to have enormous problems with memory utilization when using Lightroom. Looking at Task Manager, Lightroom generally reaches around 6 to 8 GB of system RAM after I have worked on a few photos. But then, within a single mouse click, my PC with 32 GB of RAM freezes and lags, and when I finally regain control and can open Task Manager, I see that Lightroom's memory utilization has grown to over 15 GB. Usually this happens while working with masks, but I am not at all certain of the true cause of the problem.

 

When this happens, I need to wait until I can use my PC mouse again, then shut down Lightroom and restart it. On restart, working with the same image I was editing, RAM utilization is initially low, around 2 to 3 GB.

 

Mildly stated, this behavior is annoying. It affects my ability to use Topaz Sharpening and Topaz Studio 2, which I generally launch as a filter after doing an "Edit in Photoshop" from Lightroom. Sometimes I can get the Topaz apps to work using the GPU, but often I need to either set the Topaz preferences to CPU only (far too slow) or shut down Lightroom before continuing to work in Topaz.

 

I did manage to achieve consistently quick Lightroom Denoise performance after switching to the new RTX 3060 graphics card. For this I am happy. But I do not know whether the memory utilization surges are due to the graphics card (yes, its drivers are the Studio versions and fully updated) or to the Lightroom upgrade to 13.1.

 

I am currently working on an older Windows 10 PC equipped with internal SSD drives, 32 GB of RAM, the 12 GB RTX 3060 graphics card, an i5-3570 CPU, and an MSI MS-7752 motherboard. My guess is that the problem is some kind of incompatibility between today's Windows, the graphics card, the Lightroom and Topaz software, and my older motherboard/chipset/CPU. My internal C: drive is 1 TB and only 50% full, so lack of space should not contribute to the problem. My Lightroom Camera Raw Cache is set to 40 GB, and I recently partitioned my working catalog so that it currently contains only about 6000 photos.

 

I am in the process of building a new PC, where I intend to reuse my SSD drives and the RTX 3060 graphics card. Everything else will be new: motherboard, CPU, and 64 GB of DDR5 RAM. I very much hope that these memory surge problems go away once the new PC is in use.

 

I describe this situation in case anyone has been seeing similar memory utilization surges, and in the hope that someone has found a solution.
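In case it helps anyone compare notes: rather than watching Task Manager by hand, the surge can be logged with a small script. Below is a minimal Python sketch, assuming the third-party psutil package and that the process is named "Lightroom.exe" (check Task Manager's Details tab if yours differs).

```python
# Minimal sketch: log Lightroom Classic's system-RAM use once per second.
# Assumes psutil (pip install psutil) and a process named "Lightroom.exe".
import time
import psutil

def lightroom_ram_gb():
    """Return Lightroom's resident memory in GB, or None if it is not running."""
    for proc in psutil.process_iter(["name", "memory_info"]):
        if proc.info["name"] == "Lightroom.exe":
            return proc.info["memory_info"].rss / 1024**3
    return None

while True:
    ram = lightroom_ram_gb()
    if ram is not None:
        print(f"{time.strftime('%H:%M:%S')}  Lightroom RAM: {ram:.1f} GB")
    time.sleep(1)
```

Leaving this running in a console window should timestamp the moment the jump from 6 to 8 GB up to 15+ GB happens.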

Engaged, Jan 10, 2024


Are you driving your display using the GPU?

Are you using a high res display?

Can you switch the display to the motherboard's video output?

For some NVIDIA GPUs this seems to make a difference and even eliminates the slowdown as GPU memory is consumed by Lightroom. Lightroom uses the GPU for several editing functions having to do with selections, and LR seems to build up GPU memory consumption that sometimes interferes with the GPU driving the display. It all seems murky, but moving the display off the GPU seems to be effective for some people, especially if they are using a high-resolution display.

I am running a 3090 with 24 GB and a 2K display connected to the GPU (my MB does not have video outputs), and I never see a problem. Others running a 4090 with 24 GB see the slowdown if they have a 4K or 5K display connected to the GPU, but do not see the problem if they move the display to the MB video output. I have not seen much about the 3060.

You can keep Task Manager running and monitor GPU memory use.

 

Participant, Jan 10, 2024


Hi! Thanks for the input. Some of what you wrote is a bit beyond me, it seems. As for my display, I have a BenQ SW 270C monitor, 2560x1440. Big, but not a 4K monitor.

"Am I driving my display using the GPU?" I am not sure what is meant here. I suppose I am, by default: the display cable is attached directly to the graphics card. I am not aware of any other way to do this.

What I am seeing when Lightroom memory usage surges above 15 GB is Task Manager's "Memory" column, which it describes as "physical memory in use by active processes". I am assuming this refers only to the motherboard RAM, meaning my installed 32 GB, and not the 12 GB attached to the graphics card. But of course I am not wise here about what Task Manager is actually showing. Maybe this column somehow also includes the graphics card RAM usage?

I would not know how to move the display to the MB, or whether that is even possible with my system.

I also want to point out that this high memory utilization (nearing 100% of available RAM) was not observed before I installed the new graphics card (and, I think, upgraded Lightroom to 13.1 at the same time); only afterwards. And I should add that I am not seeing what I would describe as a "slowdown". I am seeing a total freeze of the PC, because no RAM is left: nothing responds, not even the mouse, until things get paged out to the drives and some memory becomes available again. I do not know if this is the same behavior seen by the others you speak of, who have a "slowdown" with their 4K and 5K displays.

 

Engaged, Jan 10, 2024


You say you have your display plugged into the GPU. According to the specs I see, your MB has video outputs (e.g. HDMI). Look at the connectors on the MB; the video output connectors should be in the same area as the USB connectors. I suggest you move the display cable from the GPU to the MB and see if it makes any difference.

 

You are running a 2K display, which on the 3090 does not cause any problems, but I am not sure about the 3060.

 

When you check memory use in Task Manager, click on "Performance", then click the GPU. You will see Dedicated GPU memory, Shared GPU memory, and GPU memory (the sum of the other two). (Note that the "Memory" column on the Processes tab counts system RAM only; GPU memory is tracked separately here.) The one that seems to be the problem is Dedicated GPU memory use. Dedicated GPU memory serves both computational use and the display, and if computational memory use gets too large, slowdowns occur when the display is competing for dedicated GPU memory. At least that's the theory advocated by many, but those who know (Adobe, NVIDIA) are not talking. Regardless, moving the display to the MB is worth a try.
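If you want a running log of dedicated GPU memory rather than watching Task Manager, a minimal sketch that polls nvidia-smi (the command-line tool that installs with the NVIDIA driver) could look like the following; it reports the dedicated VRAM in use, which is the number to watch here.

```python
# Minimal sketch: poll dedicated GPU memory via nvidia-smi every 5 seconds.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout
    used_mib, total_mib = (int(v) for v in out.strip().split(","))
    print(f"{time.strftime('%H:%M:%S')}  dedicated GPU memory: "
          f"{used_mib} / {total_mib} MiB")
    time.sleep(5)
```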

Community Expert, Jan 10, 2024


@DClark064 

That means bypassing the dedicated GPU (the NVIDIA RTX) and running everything off the integrated GPU, if there is one. This mainly applies to laptops. In other words, it's the same as physically removing the dedicated high-performance GPU altogether.

 

Very often, this will only make the problem worse, because integrated GPUs use shared system memory, so that instead of VRAM running short, the whole system gets choked.

 

In these cases the recommended course of action is to disable the integrated GPU so that there is only one GPU in the equation.

Engaged, Jan 10, 2024


I don't believe you need to use the GPU to run the display in order to use the GPU to accelerate selections and the other functions that LrC uses the GPU for. See for example this thread: https://www.fredmiranda.com/forum/topic/1839582

 

 

Community Expert, Jan 10, 2024


There's no difference between "running the display" and using the GPU for all the advanced GPU functions. The system knows which GPU is active.

 

Connecting the display to the motherboard socket means disabling the high performance GPU, and using only the weaker integrated GPU.

 

Again, if there even is one. The CPU may or may not have integrated graphics. If not, the mobo socket is dead.

 

In any case, this is a driver problem. High VRAM use is normal for LrC, and I see it too, but it normally doesn't cause any problems. It's just caching; it gets reallocated as needed. It will normally not choke the system.

Participant, Jan 16, 2024


I highly doubt that my old motherboard, or for that matter my coming new one, has an integrated GPU. It is not the sort of thing I look for in a motherboard, seeing as I am using a dedicated, higher-performing GPU. So this option of connecting my display to something other than my RTX 3060 graphics card does not seem to be possible in my case.

 

I do agree it is most likely a problem with the older MB and BIOS somehow not handling the reallocation or release of memory correctly. Again, I hope to have a completely new machine soon: I am waiting for the DDR5 RAM to arrive, and will order the new motherboard and CPU shortly.
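For what it's worth, one quick way to settle whether an integrated GPU is present is to list the video controllers Windows sees. A minimal sketch using the built-in wmic tool follows; note that an integrated GPU disabled in the BIOS will not appear, and that wmic is deprecated on newer Windows builds (PowerShell's Get-CimInstance Win32_VideoController is the modern equivalent).

```python
# Minimal sketch (Windows-only): list the video controllers Windows sees.
# If only the NVIDIA card appears, no integrated GPU is active and the
# motherboard's video outputs (if any) will be dead.
import subprocess

out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True,
).stdout
print(out)
```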

LEGEND, Jan 16, 2024


quote

I highly doubt that my old motherboard, or for that matter my coming new one, has an integrated GPU. It is not the sort of thing I look for in a motherboard, seeing as I am using a dedicated, higher-performing GPU. So this option of connecting my display to something other than my RTX 3060 graphics card does not seem to be possible in my case.

I do agree it is most likely a problem with the older MB and BIOS somehow not handling the reallocation or release of memory correctly. Again, I hope to have a completely new machine soon: I am waiting for the DDR5 RAM to arrive, and will order the new motherboard and CPU shortly.

By @steveisa054

 

Yes, I do not understand at all the thought of not using a perfectly good GPU in favor of an integrated video controller on the motherboard.

 

And yes, you should not have to disable the integrated video controller in Windows System (devices).

 

And disabling it in the BIOS? Probably not useful. Thoughts?

 

Your new MB will most likely have an integrated video controller on it. Some do not, but those typically run video through the CPU (if the CPU supports it), just in case there is no GPU (or the GPU is kaput).

Community Expert, Jan 16, 2024


I think there is some misunderstanding here.

 

Integrated GPUs are on the CPU die. The integrated GPU is not on the motherboard; it is just connected through the motherboard's video sockets. Some CPU lines have integrated graphics, some do not.

 

Both Photoshop and Lightroom are very sensitive to dual GPU conflicts, and the official advice is to disable the integrated GPU completely. From the official Lightroom troubleshooting guide:

"remove or disable the less powerful cards. For example, assume that you have two different cards using two different drivers—an NVIDIA graphics card and an AMD graphics card. In this case, ensure that Lightroom has been assigned the High Performance graphics card rather than Integrated Graphics or Power Saving graphics card."

 

[Attached screenshot: GPU_LrC.png]

LEGEND, Jan 10, 2024


When you installed the GPU driver, did you select the Custom install type so as to force a clean install? If not, some old GPU driver code might still be lurking in the computer.

 

Participant, Jan 10, 2024


Hi GoldingD, frankly I do not remember. This is a good tip; I can try a new install and force it to be a clean one.

LEGEND, Jan 10, 2024


Also, when you use GeForce Experience to update the driver, make sure GeForce Experience itself is up to date (it should tell you if it is not), and double-check that GeForce Experience is applying settings in the GPU to optimize it when running LrC (that should be the default).

Participant, Jan 10, 2024


Hi, in the meantime I did the custom install of the driver, requesting a clean install, and I have done some Lightroom work since. It did not seem to help: I have already had one more episode of a memory utilization surge to over 15 GB, just as before the clean install.
