
P: (Win) Consumes RAM during import and fails to release afterwards

Explorer, Aug 22, 2024

I've noticed that LR eats all available memory when you start importing images.

When opened, it uses about 1.5-3 GB of RAM. Then import 100 images, and RAM usage shoots up to more than 17 GB.

 

I've been doing some tests with smaller raw files; same thing, usage always climbs to about 17 GB. The folder I imported as a test contains only 5 GB of raws, yet LR uses more than twice that amount to import those images. When you delete the images, the 17 GB of RAM is still in use. I then tested fewer images, around 30, and memory still goes up, but less: roughly 5 GB of extra RAM for images that take only about 1 GB on disk. Importing more images, say 300+, does not push memory above the 17 GB mark.
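To make this kind of test repeatable, the readings can be logged and compared automatically. Below is a rough, hypothetical Python sketch for Windows: the process name, thresholds, and sample values are assumptions, but the `tasklist /fo csv` output format it parses is the standard one.

```python
import csv
import io

def parse_tasklist_csv(text):
    """Parse the CSV output of the Windows command
    `tasklist /fi "imagename eq Lightroom.exe" /fo csv`
    and return working-set sizes in GB (one entry per matching process)."""
    sizes_gb = []
    for row in csv.reader(io.StringIO(text)):
        # The 5th column is "Mem Usage", e.g. "17,234,567 K" (kilobytes).
        if len(row) >= 5 and row[4].endswith(" K"):
            kb = int(row[4][:-2].replace(",", "").replace("\u00a0", ""))
            sizes_gb.append(kb / (1024 * 1024))
    return sizes_gb

def released_after(readings_gb, baseline_gb, tolerance_gb=1.0):
    """True if the last reading has come back down to roughly the baseline."""
    return readings_gb[-1] <= baseline_gb + tolerance_gb

# A made-up line in tasklist's CSV format:
sample = '"Lightroom.exe","12345","Console","1","17,234,567 K"\n'
print(parse_tasklist_csv(sample))  # roughly 16.4 GB

# Readings matching the behaviour described above: still ~17 GB long after
# the import finished, against a ~2 GB startup baseline.
print(released_after([17.2, 17.1, 17.0], baseline_gb=2.0))  # False
```

Running the parser on periodic `tasklist` snapshots before, during, and after an import would show whether the working set ever returns to baseline.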

 

Workaround? Import the images, restart LR to clear the memory, then edit. After some testing, RAM usage stays "normal" at around 6 GB during editing. That's 10 GB less than right after an import. What's up with that?

 

I'm on 13.5, btw.

 

[Moved from ‘Bugs’ to ‘Discussions’ by moderator, according to forum rules.]

Status: Bug Investigating
Topics: Windows
Views: 5.9K

1 Correct answer (Pinned Reply)

Adobe Employee, Sep 06, 2024

I've opened a bug for the team to review. They may contact you directly for more information. 

 

Thanks for your report and the refinements. 

182 Replies

Community Expert, May 23, 2024

@steveisa054 

Mac or Windows has nothing to do with this. It's basic system architecture.

 

There is apparently a bug where the GPU memory consumption increases uncontrolled, or at any rate much more than it should. There are several threads about this. It happens to Mac users as much as Windows users. It's just that with shared system RAM, it grinds the whole machine to a halt much quicker.

 

This happens to some users, but not to most users. I have never seen it on my two Windows desktops.

 

16 GB of RAM on a laptop with shared system memory is not enough. Again, it has nothing to do with platform. It will work for a while, but sooner or later you will hit the ceiling, because there will be competition for resources. By today's standards, any computer with shared system memory should have at least 32 GB of RAM.

Explorer, May 23, 2024

It happens on all my computers with a graphics card (two Ryzen desktops and one Intel laptop).

And everything works perfectly on my MacBook Air M1 with 8 GB...

As soon as I start to use local adjustments or the AI features, switch pictures and paste the previous preset, VRAM starts to saturate and never goes down. The same happens with the new AI duplicate tool.

So the point is, I can use the classic sliders but not the improvements we have had since v11 or v12.

The RAM never goes down either, but with 32 GB it doesn't saturate.

The only option is to not use the graphics card I bought for LrC, and lose time and money.

Or close and reopen LrC every ten minutes.

It's cool to have new features, but if you can't use them, what's the point?

Community Expert, May 23, 2024

quote

…it seems that a MAC user can be happy with only 16 GB ram. Is there really such a great difference between Windows and MAC with regard to this issue with Lightroom memory usage?

By @steveisa054

 

I am a very dedicated Mac user, and yet, I have to say that is an oversimplification that is partially, but certainly not always, true. Personally I recommend 32GB memory for someone getting their next Mac or PC for Creative Cloud work.

 

Things did change a little when Apple switched the entire Mac line from Intel processors to their own new Apple Silicon processor. There are some architectural processor differences that do use memory somewhat more efficiently. For example, there are tasks that everybody agrees are not advisable on Intel with 8GB RAM, but when people tried those tasks on the first Apple Silicon Macs with 8GB memory, those tasks had acceptable, or even good, performance. However, that is not always true, especially when large data sets need to be held in memory (these tend to be programming/data science tasks that Creative Cloud users never do); if you do something that really needs 16GB then the computer needs to have 16GB.

 

As for whether Mac users never see memory problems, that is not true. Below are some relatively recent links on this forum from Mac users puzzled by mysteriously high memory usage with Lightroom Classic, describing it much like the Windows users do.

 

https://community.adobe.com/t5/lightroom-classic-discussions/ram-usage-on-apple-silicon/td-p/1411317...

https://community.adobe.com/t5/lightroom-classic-discussions/ram-amp-swap-usage-100gb-over-30-minute...

https://community.adobe.com/t5/lightroom-classic-discussions/lightroom-crop-tool-eats-up-ram-too-fas...

 

I have not had those kinds of problems with Lightroom Classic, but I recently had a similar problem with Adobe Premiere Pro on my Mac, where the claimed memory usage was several times the actual memory installed. (That is possible because of the way modern memory systems use compressed memory and virtual memory.) Eventually the program ground to a halt.

 

Adobe has been working on Lightroom Classic memory management in recent updates, and I think there may be fewer reports than there used to be. But there are still reports, so the problem might not be completely solved. The point is that it is incorrect to assume that Macs never have these problems. The solution is not to buy a Mac, but for Adobe to fix any remaining memory management bugs.

Community Expert, May 23, 2024

quote

@steveisa054 

And then you may get competition for resources. The operating system may have allocated, say 12 GB for the GPU, and then suddenly LrC needs 10. And it doesn't get it, because the OS won't release it.

By @D Fosse

 

It isn’t that simple. Although I’m not technical enough to explain all the details, one major factor is that Apple Silicon Unified Memory allocations between OS and graphics are said to be far more dynamic and flexible than Intel Integrated Graphics shared memory, so that it is not correct to assume that they operate with the same limitations. 

 

In the scenario you mentioned, most of the time macOS will have options for dynamically reallocating memory, so that the OS in fact can release it for the process that requests it. It should be relatively rare that the OS is actually unable to release enough memory. This is how I can end up with 20 or 30 icons for running apps in the Application Switcher, including Photoshop, Lightroom Classic, and one or two Adobe video apps, while memory usage is reported to be completely OK and things are working fast.

 

And all of that does not apply to this discussion, because in this thread Lightroom Classic or macOS is not handling memory properly. It’s more about memory management bugs that need to be fixed, not about the design of Windows or Mac memory management. And it’s not about having insufficient memory, because many of the systems have 32GB or more.

 

In the end we do agree that for Creative Cloud, 16 GB should be considered the minimum for a Mac or PC, and 32 GB is recommended, especially for someone who wants to accommodate occasional larger jobs or use the computer for more years before upgrading.

Community Expert, May 23, 2024

@Conrad_C 

OK, "the OS won't release it" is obviously not how it actually works, but I was making a point and trying to simplify.

 

The point is that with shared system memory, whether you call it "unified" or not, the GPU will eat up a big chunk of it, leaving that much less for the application.

 

On my desktop/RTX 4060 Ti with 16 GB dedicated onboard VRAM, the GPU routinely uses 10-12 GB. That's normal Lightroom operation. It doesn't go above that plateau, and if e.g. Photoshop needs it while Lightroom is running, it gets reallocated. It never maxes out, I have never seen that (and it's easy to check).
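For anyone who wants to run the same check on an NVIDIA card, the driver's standard `nvidia-smi` tool reports VRAM usage from the command line. Here is a small sketch; the query flags are standard `nvidia-smi` options, while the wrapper functions and the sample values are mine:

```python
import subprocess

def parse_vram_csv(text):
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output into (used_mib, total_mib) tuples,
    one per GPU."""
    pairs = []
    for line in text.strip().splitlines():
        used, total = (int(field.strip()) for field in line.split(","))
        pairs.append((used, total))
    return pairs

def query_vram():
    """Ask the driver directly; needs an NVIDIA GPU and nvidia-smi on PATH."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_csv(out)

# One CSV line per GPU; e.g. ~11 GiB used of 16 GiB:
print(parse_vram_csv("11264, 16384\n"))  # [(11264, 16384)]
```

Polling `query_vram()` every few seconds during a Lightroom session would make the "plateau" behaviour described above (or the runaway growth others report) easy to see in a log.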

 

I understand, based on reports here in the forum, that some people are hit by a bug where the GPU doesn't cap memory usage. I don't see it, but some do. I don't know what triggers it.

 

So we're not discussing Mac or Windows. We're discussing one "normal" situation and one platform-independent "bug" situation. These two seem to be mixed up in this thread.

 

The main point I was trying to make, to the "normal" situation, is that with 16 GB total installed RAM, and the GPU using perhaps half of it or more, you have a situation with limited resources. With 32 GB, that limitation is not so urgent.

 

With a dedicated GPU with onboard VRAM that doesn't eat into system RAM, 16 GB would work better.

Community Beginner, Jun 02, 2024

Throwing my hat on the pile too, as I have the same issues. I think I have narrowed it down to the masking as well. I'm running 32 GB, an RTX 3060 with 12 GB, and a 12th-gen i7. The fact that the memory just keeps going up instead of being released after you're done reminds me of the memory-leak issues back in Windows 98. Hopefully there's an update in the future that corrects this, but I've been seeing it for some time. I bought another 32 GB a few weeks ago (haven't installed it yet) because of this. Glad to see some people with 64 GB reporting the issues going away.

Explorer, Jun 11, 2024

I have the beginnings of a solution. It's ridiculous but given the time I waste editing with the latest version it's better than nothing.

 

I'm a wedding photographer so I make a new catalog for every wedding.
I use v12 to sort and retouch my images (general adjustment and automatic mask)
When I'm finished I download the latest version and do the noise reduction on it.
And I go back to v12 as soon as a new catalog is processed.

 

I also have problems with RAM and VRAM leaking on v12, but I can keep working without restarting the software every 5 photos; it's much smoother.

 

I hope that helps some of you while waiting for Adobe to correct this problem.
Good luck to all.

Community Beginner, Jun 13, 2024

Hi Everyone,

I'm another frustrated LrC user. I have the same problem that you guys have described. I even uninstalled the Adobe apps and did a fresh install, and it didn't help. My PC has a configuration that is more than enough to run LrC pretty smoothly (Windows 11, i9-10900KF, 64 GB RAM, 2 TB M.2 and RTX 3070). I confirm what others have said about RAM usage, as mine is always above 60%. Hope Adobe fixes this ASAP.

Community Beginner, Jul 15, 2024

Just developed this issue as well. No rhyme or reason, but LR almost completely unusable now. And, not sure if others are having this as well, but without doing anything, not only does LR system usage skyrocket, but it's frantically writing/reading from my Pegasus RAID system, nonstop. No idea what is going on, as I'm not doing anything - just in library view.

 

I, too, hope Adobe will do something about this, but if history and experience are any guide, they won't, or at least not anytime soon.

New Here, Jul 21, 2024

I've had the same issue: i9-1100K, 32 GB of DDR4 3200 MHz and a 4070 Ti, but as soon as I start masking, my RAM usage goes through the roof and my whole computer becomes a laggy mess. As a temporary fix I've disabled my GPU in Lightroom; that seems to do the trick, but it's obviously not ideal.

Community Beginner, Jul 21, 2024

FWIW I moved to a mac studio ultra, and I haven't had the problem since - not sure why!

Explorer, Jul 23, 2024

I have a possibly related problem: Lightroom Classic gets slower through a session. This is particularly troublesome, and noticeable, with AI denoising. At the start of a session a file whizzes through; after an hour or so of busy editing with a lot more denoising, the SAME file, with the same edits and the same denoise settings, takes 4-5 times as long. This slowdown also affects the Photoshop AI brush. Aside from LrC and PS I have nothing else running. Agreed, a bug! Adobe, please investigate and resolve.

Community Expert, Jul 23, 2024

This is GPU-related. Denoise runs exclusively on the GPU; it does not touch the CPU or the rest of the system.

 

What GPU do you have? Is this a dual graphics laptop?

Community Beginner, Jul 23, 2024

Are you saying Denoise doesn't affect/use ram? 

Community Expert, Jul 23, 2024

Actually Denoise doesn't use much memory at all. It seems to clear memory while in progress, both system RAM and VRAM. The arrows are Denoise start/finish:

 

[Attached graphs: RAM.png and VRAM.png]

Community Beginner, Jul 23, 2024

Interesting. 

Community Expert, Jul 23, 2024

I forgot where I read this but apparently, AI Denoise does need memory, but only enough to do the images at hand. Once it finds enough memory for that job, adding another 16GB or 32GB or 128GB of memory is not going to speed it up much further. The main thing AI Denoise wants is GPU power, and lots of it.

Community Expert, Jul 23, 2024

Yes, and it's possible the dips in the graphs above are to make room for batch Denoising. Haven't tested that.

New Here, Jul 23, 2024

I've added a second NVMe M.2 SSD to my workstation, as my C: drive was getting full, and then moved the LR catalog to the new 2 TB NVMe drive. My workstation runs 64 GB of RAM and an NVIDIA GPU with 8 GB of memory. The changes have helped a little, but memory still backs up and LrC slows to a crawl after running several dozen photos through, including Denoise. This is without doing ANY masking functions.

 

Restarting LRC helps until the next several photos backs up the memory again.

 

I can only assume there is a major memory leak issue that has not been fully addressed yet.

 

My computer is as follows:

Win10
ASUS Pro WS X570-ACE ATX workstation motherboard
AMD Ryzen 7 3700X, 8 cores / 16 threads
G.SKILL Ripjaws V Series 64GB (2 x 32GB) 288-pin DDR4 3200 (PC4 25600) desktop memory, model F4-3200C16D-64GVK
Samsung 1TB 970 EVO Plus NVMe M.2 and Samsung 2TB 970 EVO Plus NVMe M.2
EVGA RTX 2070 XC Black graphics card with 8GB memory

 

I've tried both Auto and Manual modes in the Performance section, with no joy.

New Here, Aug 02, 2024

I have the same problem, exactly as you described.

 

It's a bug. Early this year it was bad. Then LRC got updated and it was good. Now this current version that I'm using today, it's back to bad. It's a bug. Adobe, please fix this before I decide enough is enough and cancel my sub. You are killing the joy I get from my photography.

Community Expert, Aug 02, 2024

@Hasrin, I am assuming you are referring to @JoeFish2's post above; however, JoeFish2 does not indicate the LrC version number he is using, so it follows that I do not know the actual version number of LrC you are using either. Please indicate the exact version number of LrC you are using.

Regards, Denis: iMac 27” mid-2015, macOS 11.7.10 Big Sur; 2TB SSD, 24 GB RAM, GPU 2 GB; LrC 12.5; Lr 6.5; PS 24.7; ACR 15.5; (also laptop: Win 11, ver 23H2, LrC 14.0.1); Camera: Oly OM-D E-M1.

New Here, Aug 02, 2024

No, I have the same problem as @steveisa054 

 

I'm using LRC 13.4 Release and Camera Raw 16.4

Build [ 202406181129-60d181b7 ]

New Here, Aug 09, 2024

Hello, I'm having the same issue with this configuration on my computer:

32 GB DDR4 RAM

RTX 4060 16 GB

SSD 1 TB

LrC v13.4

I can't figure out why I'm having this problem; I edit 4K videos in DaVinci and don't have these issues. <profanity removed>

Community Beginner, Aug 10, 2024

I've got the same problem.

i5-8600K, RTX 3060TI, 2x16GB DDR4 3200MHz RAM, 1TB SSD M.2

Got 15-18GB usage of RAM.

 

It's funny when @D Fosse talks about a "good working LR" on laptops that have half (or less) of the power our PCs have, and doesn't see that this IS a problem with the program itself, not with our PCs. Otherwise, how the hell can a laptop with an integrated GPU and 8-16 GB of RAM do better in LR than a PC with a pretty good dedicated GPU and 32 GB of RAM? It's simply impossible in a normal way. It's the same hardware performance that is used in games: I can have 500 FPS where that laptop would have 120 FPS at most, and that's the difference between our specs. Don't you dare say our PCs are too weak, because that's not the problem if slower laptops work better in the same programs...

Community Expert, Aug 10, 2024

@XeoN_PL 

You're missing the point. Dual graphics tend to conflict. That's why it's not an optimal configuration.

 

Photoshop/ACR/Lightroom all use the GPU for actual data processing. It's not the simple one-way downstream flow it used to be, and still is in simpler applications.

 

You can't send data to one GPU and get it back from the other. So there can only be one GPU in this equation.
