
Make Lightroom use more RAM

New Here, Mar 31, 2025

I'm trying to use AI Denoise on quite a few images, and I noticed Lightroom was only using 4GB of RAM. I changed the Camera Raw maximum cache setting to 64GB (I have 128GB available) and restarted Denoise, and it still only uses 4GB of RAM.

 

Is there any way to make it use the maximum amount I'm allowing, to speed up the process?

TOPICS
macOS
LEGEND, Mar 31, 2025

AI Denoise uses the GPU. All denoise calculations are done there, so the amount of system RAM is largely irrelevant to this step.

Community Expert, Mar 31, 2025

Making Lightroom use the maximum amount of Unified Memory probably won’t make much difference. The amount of memory Lightroom Classic currently uses is what it decided it needs, and adding more won’t help. As an analogy, once you have a house large enough for your stuff, adding 20 more rooms doesn’t improve your life at all and is wasted money. What I have seen make Lightroom Classic use more Unified Memory is processing an image with a higher megapixel count, or merging many images into a very large panorama.

 

In addition to Adobe’s guidance about Denoise, it has been performance-tested on a wide range of PC and Mac systems in various web forums by a lot of people, and what that testing shows is that the most important performance variable for Denoise is, by far, the power of the GPU. On Macs, Denoise performance tracks closely with the number of GPU cores. Based on this, you can expect the 80 GPU cores in a top-spec M3 Ultra to run Denoise roughly 10x faster than the 8 GPU cores in a base M-series chip. That ratio is not significantly changed by any other spec, such as installed memory or CPU cores, because those don’t help much with these types of calculations.
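
If it helps to see the arithmetic, here is a rough sketch of that scaling estimate. The linear-scaling assumption and the 60-second per-image baseline are illustrative placeholders, not measured Adobe figures; substitute a time you have measured on your own machine.

```python
# Back-of-envelope Denoise batch-time estimate, assuming throughput scales
# roughly linearly with Apple Silicon GPU core count (the pattern reported
# in community testing). The 60-second baseline is a placeholder.
def estimated_batch_minutes(images: int, gpu_cores: int,
                            secs_per_image_on_8_cores: float = 60.0) -> float:
    speedup = gpu_cores / 8                      # 40 cores -> ~5x, 80 -> ~10x
    return images * secs_per_image_on_8_cores / speedup / 60

print(estimated_batch_minutes(200, 8))    # base 8-core GPU
print(estimated_batch_minutes(200, 40))   # e.g. an M4 Max with 40 GPU cores
print(estimated_batch_minutes(200, 80))   # top-spec M3 Ultra
```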

 

So if you are on a Mac and you want faster Denoise, the only solution that makes a real difference is to get a Mac with significantly more Apple Silicon GPU cores than your current Mac has. If you have an Intel Mac, its GPU might by now be old enough that it doesn’t work as well for Denoise as the more modern, AI-optimized GPUs in Apple Silicon Macs.

 

Does your Mac have an Apple Silicon processor, and if so, how many GPU cores does it have?
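
If you would rather check from Terminal than from About This Mac, a small sketch like this reads the GPU description macOS reports; on Apple Silicon the output includes a "Total Number of Cores" line for the GPU.

```python
import subprocess

# Ask macOS for the GPU description; on Apple Silicon Macs the
# system_profiler output includes the chip name and the GPU core count.
info = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True,
).stdout

for line in info.splitlines():
    if "Chipset Model" in line or "Total Number of Cores" in line:
        print(line.strip())
```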

Quote from @Nordin: "I changed the camera raw max cache settings to 64GB (I have 128GB available) and restarted the denoise and it still only uses 4GB of RAM."

 

That’s expected. The Camera Raw cache has nothing to do with RAM whatsoever. It’s a cache of temporary files on your Mac’s storage, caching Develop module edit data (so it is not the same as the previews cache in the catalog folder). It’s located at the path indicated next to the setting.

 

So when you increased the Camera Raw cache limit to 64GB, you allowed it to take up to 64GB of SSD space, not Unified Memory.
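
If you want to confirm where that space actually goes, a small sketch like this totals the cache files on disk. The default path below is an assumption; use the path shown next to the Camera Raw cache setting in Lightroom Classic's preferences.

```python
from pathlib import Path

# Assumed default Camera Raw cache location on macOS -- replace with the path
# shown next to the cache setting in Lightroom Classic preferences.
cache_dir = Path.home() / "Library" / "Caches" / "Adobe Camera Raw 2"

total = sum(f.stat().st_size for f in cache_dir.rglob("*") if f.is_file())
print(f"Camera Raw cache currently occupies {total / 1024**3:.1f} GB of SSD space")
```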

New Here, Mar 31, 2025

Thanks! That makes sense. I've got an M4 Max chip with a 16‑core CPU, 40‑core GPU, and 16‑core Neural Engine.

 

So would increasing the Camera Raw cache also make no difference when exporting photos from Lightroom?

LEGEND, Mar 31, 2025

@Conrad_C: "the most important performance variable for Denoise is, by far, the power of the GPU."

 

Elaborating on this: the Denoise computations are done entirely on the GPU, using the GPU's memory, so the amount of main memory is mostly irrelevant. It's possible there is some benefit to having more GPU memory, though I haven't seen any published data on that -- it may be that as long as there is enough memory to hold the entire photo, there's not much benefit from having more.
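
One way to see this for yourself is to watch GPU activity rather than RAM while a Denoise batch runs. The sketch below samples macOS's powermetrics tool (it needs to be launched with sudo, and the exact field names can vary by macOS version); Activity Monitor's GPU History window shows the same thing graphically.

```python
import subprocess

# Sample GPU activity for ~10 seconds while Denoise is running.
# powermetrics requires root, so run this script with sudo.
out = subprocess.run(
    ["powermetrics", "--samplers", "gpu_power", "-i", "1000", "-n", "10"],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    # Lines mentioning GPU residency or power show how busy the GPU is;
    # near-100% residency during Denoise suggests the GPU, not RAM, is the
    # limiting resource.
    if "GPU" in line and ("residency" in line or "Power" in line):
        print(line.strip())
```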
