Known Participant
June 18, 2025
Answered

Question: how much disk space is consumed by AI Denoising an image in Lightroom Classic 14.4?


Now that running AI denoise in Lightroom Classic no longer creates a new DNG, I assume it has to be storing a large amount of data somewhere.  Before I go crazy with denoise (the results are often amazing!) and consume many GB of storage in my Lightroom library files, I wanted to get an idea how much space this takes.


I have no way of measuring the library file size increase after a single operation, so I tried the same operation in Adobe Camera Raw (17.4) and found that denoising one 30.8MB Nikon D500 raw file created a new 6.1MB .acr sidecar file. So it appears that, at least in this case, denoise created an additional file that is about 20% the size of the original file.

 

Does anyone know if Lightroom Classic does something similar?  If it does, that means that running denoise on 5GB of raw files (only about 162 D500 files) will result in growing the library files by 1GB.  This is better than a whole new DNG, but it isn't insignificant, and would be part of library storage, not raw file storage!
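For anyone who wants to check this estimate on their own files, the raw-to-sidecar ratio can be measured by summing file sizes in a folder. A minimal sketch (the .nef/.acr extensions and the function name are illustrative assumptions, not anything Lightroom provides):

```python
import pathlib

def sidecar_overhead(folder, raw_ext=".nef", sidecar_ext=".acr"):
    """Sum raw and sidecar file sizes in a folder.

    Returns (raw_bytes, sidecar_bytes, ratio) as a rough estimate of
    how much extra space Denoise sidecars add relative to the raws.
    """
    p = pathlib.Path(folder)
    raw = sum(f.stat().st_size for f in p.glob(f"*{raw_ext}"))
    side = sum(f.stat().st_size for f in p.glob(f"*{sidecar_ext}"))
    return raw, side, (side / raw if raw else 0.0)
```

On the numbers in the post above (30.8 MB raw, 6.1 MB sidecar), this would report a ratio of roughly 0.2.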

 

As an additional question, will subsequently turning off (unclicking) denoise on an image in LRC eventually free up that additional storage requirement?

 

I use a Mac, but the question would apply the same on Windows.

 

Thanks in advance for any insight!

Don

 


5 replies

gg100
Participating Frequently
August 28, 2025

Same problem here: my catalog for 133,000 images was 2.1 GB.

I added about 500 images, all of which I used the Denoise feature on.

My catalog went from 2.1 GB to 8.4 GB.

While storing an 8.4 GB catalog is not a big issue for me right now, backups, loading, etc. all slow down with a very large catalog.

I was able to "fix" the issue by:

1. Exporting all 500 images
2. Creating a new folder in LrC
3. Importing the exported images from step 1 into the new folder
4. Deleting and removing all the original images
5. Optimizing the catalog
6. Restarting Lightroom, after which the catalog went back to 2.1 GB

 

Adobe needs to address this, as no other feature causes catalog bloat to this degree

DdeGannes
Community Expert
August 28, 2025

@gg100, when this thread was started LrC was at v14.4; since then it has been upgraded to v14.5.1.

What version of LrC do you have installed on your computer (indicate the actual version number)?

 

Regards, Denis: iMac 27" mid-2015, macOS 11.7.10 Big Sur; 2TB SSD, 24 GB RAM, GPU 2 GB; LrC 12.5; Lr 6.5; PS 24.7; ACR 15.5; (also laptop: Win 11, ver 24H2, LrC 15.0.1, PS 27.0). Camera: Oly OM-D E-M1.
gg100
Participating Frequently
August 29, 2025

Yes, this thread started at 14.4, as I believe that was when the new Denoise feature was added.

I am currently running 14.5.1 and have duplicated the problem with this version.

Inspiring
July 1, 2025

Are the AI-enhanced images stored in the catalog forever, or are they like previews that get cleaned up after a while?
I guess there would be no harm in deleting them after a while, other than the cost of recomputing them; since the source data still exists, Lightroom should be able to recompute them if needed.

 

 

I don't see why they should end up in the XMP; there should be an option to prevent that, or it should just never happen at all.

C.Cella
Inspiring
July 1, 2025

@thomas5CED The AI Masks, Generative Remove, and Enhance data are NOT stored in the catalog but in the .lrcat-data, as .blob files.

 

The LrC catalog only has a reference to them so they can be loaded; no need to recompute them.

 

If you delete AI Masks, Generative Removes, or Enhance data, then I believe they will be deleted from the .lrcat-data upon quitting LrC, as part of a background optimisation of the .lrcat-data database.

 

If you still keep history, then I believe they will not be deleted, as the history steps will have references to them.

I need to verify this, but I believe that any .blob is kept as long as there's a reference to it in the catalog.

 

You can of course delete them, but then you will not only have photos that appear different or incomplete, you will also have to spend time recomputing them.

 

Frankly, worry not about the size of the .lrcat-data.

If you have to worry about something, it is the XMP size that will be problematic, as storing Enhance data in XMP will make saving keywords, titles, and captions to the sidecar even 10x slower.

 

Participating Frequently
July 1, 2025

I don't agree that you can "frankly not worry" about the size of the lrcat-data. When you set a backup destination for your catalog, it creates a zip file that, when expanded, contains the .lrcat and the .lrcat-data. Hence, you can't ask Backblaze or Dropbox to selectively back up one or the other. In the two weeks since I first started using AI Denoise and adaptive color profiles in earnest, my zip file has gone from about 2 GB to 20 times that size, and I have to wait for Dropbox to shift about 40 GB of data each time I back up. That is with one week of work. Imagine what this will be like in a year, particularly if you are using medium-format files. It is a disaster in terms of data management. Have a look yourself in your catalog settings.
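One way to confirm where the bloat lives, without expanding the backup, is to list the largest members of the zip. A small sketch (the path and function name here are hypothetical):

```python
import zipfile

def largest_entries(zip_path, top=5):
    """List the largest members of a catalog backup zip, biggest first.

    Returns (filename, uncompressed_size_in_bytes) pairs, so you can
    see whether the .lrcat-data member dominates the backup.
    """
    with zipfile.ZipFile(zip_path) as z:
        infos = sorted(z.infolist(), key=lambda i: i.file_size, reverse=True)
        return [(i.filename, i.file_size) for i in infos[:top]]
```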

ouiouiphoto
Known Participant
June 28, 2025

Dear all

 

Test with R7 raws 

 

In the XMP we are supposed to find metadata, including develop metadata. If you do a simple develop, your XMP is around 10 KB. If you add masks, you have to save the mask drawings in the XMP, and that takes space, but even with a lot of masks you are still under 1 MB. If you use AI Denoise, though, your XMP goes to 5 MB or more. Why save the denoise result in the XMP? Why not instead save just that AI Denoise was used, along with the slider value, like it is done for exposure? In the XMP you don't save the exposure result, just that it was used and the value of the slider. Why is Denoise not the same? If anyone has an idea, I'd just like to know why, because personally I don't use XMP at all.

 

Kind regards

.Sheepdog trying to help Lightroom and Photoshop beginners
Community Expert
June 28, 2025

Normally the denoise result is put in the catalog only. It only gets put as a copy in the XMP when you have automatic XMP writing turned on. This data has to reside somewhere. In previous versions of Lightroom, a separate DNG file would get created with the denoise data. In the new version, the denoise data that allows you to change the amount of denoising after the fact is stored with the metadata in the catalog. Denoising this way unavoidably adds some megabytes that have to live somewhere.

C.Cella
Inspiring
June 29, 2025

@Jao vdL I understand it for AI masks, or masks in general, to save the shape, but less so for the Denoise result. Anyway, for sure, if you want to avoid AI recalculation via the XMP, you have to put it all inside.

 


See this thread : https://community.adobe.com/t5/lightroom-classic-discussions/question-how-much-disk-space-is-consumed-by-ai-denoising-an-image-in-lightroom-classic-14-4/m-p/15380401#M406995

 

 

 

Known Participant
June 19, 2025

Oh no! 😮😣

Please, Adobe, store all of this data only with the RAW files!

 

My Lightroom catalog is already about 1 TB in size, mostly due to previews (mostly standard-sized, not 1:1), but also several GB for the data/db files of the catalog. The catalog is stored on a high-end 2 TB SSD (just upgraded from 1 TB a few weeks ago) to make culling etc. fast enough.

 

I really DO NOT want to fill up this pricey storage with data that I or LR won't need for the next few months/years after first processing. This kind of data belongs in archive/cold storage!

I mean, I could manually move that .lrcat-data folder to my HDD and place a symbolic link or junction so that LR doesn't notice. But this would presumably slow down the AI processing heavily while editing.
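The move-and-symlink idea can be sketched in a few lines (the function name and paths are illustrative; quit Lightroom first, and test on a copy of the folder before trusting it with a real catalog):

```python
import os
import shutil

def relocate_lrcat_data(data_dir, hdd_dir):
    """Move a .lrcat-data folder to slower storage and leave a symlink
    behind so Lightroom Classic still finds it at the original path.

    data_dir: current path of the .lrcat-data folder (LrC must be closed)
    hdd_dir:  existing destination directory on the HDD
    """
    target = os.path.join(hdd_dir, os.path.basename(data_dir))
    shutil.move(data_dir, target)   # copies across volumes if needed
    os.symlink(target, data_dir)    # LrC follows the link transparently
    return target
```

On Windows a directory junction (`mklink /J`) would play the same role, since unprivileged accounts often cannot create true symlinks.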

 

@Adobe Simply store all such data next to the RAW files.

 

My current projects (RAW + XMP files) are stored on another SSD for speedy edits and imports. The AI Denoise data would be there too. When I'm done, they get moved to external HDDs (currently holding about 14 TB), no longer blocking scarce SSD space.

 

I was so happy to hear about non-destructive AI Denoise without monstrous DNG files anymore. But now they have broken the batch workflows (see other community threads) and cluttered storage space... I will skip the 14.4 update. 😑

C.Cella
Inspiring
June 19, 2025

@robert36972564 For many years, all AI mask and image data has been stored in the .lrcat-data.
That's where LrC keeps the AI "images".

The "Enhance images" are lighter than storing the actual image in a sidecar, as the article that @Rob_Cullen provided explains.


Your catalog is on a 2 TB SSD; your .lrcat-data will not be getting massively bigger anytime soon.
On the other hand, you will have to pay attention to storage when storing SuperRes into the sidecars.


I suggest using a virtual copy (VC) to do SuperRes and keep any Enhance non-destructive.
Edit a VC and save it as a snapshot ONLY when it is needed for editing, perhaps in ACR or Lr Desktop local mode.
This way the SuperRes will not end up in the XMP until the moment it is needed.

Skipping LrC 14.4 is not necessary; with a few precautions you can manage.


Known Participant
June 19, 2025

Assuming 5 MB of storage per AI-denoised image, the remaining 900 GB will be enough for about 184,320 images. BUT not really, as the 900 GB will also slowly fill with further database and preview data.
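The arithmetic checks out (using 1 GB = 1024 MB and the 5 MB-per-image figure quoted earlier in this thread):

```python
# Back-of-the-envelope check: how many 5 MB Denoise records fit in 900 GB?
free_mb = 900 * 1024           # 900 GB of headroom = 921,600 MB
per_image_mb = 5               # approximate Denoise data per image
images = free_mb // per_image_mb
print(images)                  # 184320
```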

(And take into account that if the catalog storage/SSD is full, you really cannot work with LR at all anymore until you fix that issue: no more edits, imports, VCs, or preview generation/viewing of old images, plus cloud sync issues.)

 

It won't be an immediate issue, nor one for the next year (if the assumption stays valid). But Adobe is wasting their users' money with this implementation, and that's not a good thing.

 

If it is possible to store this data within 5 MB (or whatever) on the catalog drive, there can be no technical reason why it wouldn't be possible to store the exact same data, with similar compression, encoding and size, as an additional RAW sidecar file or within XMP sidecars.

 

The enormous XMP sizes it generates right now, which you and the Lightroom Queen reported, are of course not a good example. I mean, I also work with 24 MP RAWs. A 27 MB RAW previously became a 55 MB DNG. Now instead we have a 5 MB catalog increase plus potentially a 160 MB XMP increase. So I also support your request: make this more efficient, Adobe, whether with .acr or other files.

 

A sidenote:

"With a few precautions you can manage" – probably. But that's not acceptable. Much more than Photoshop (or ACR), Lightroom is a tool for fast and easy processing of fairly high volumes. I want to avoid every change that brings impaired workflows, as they are annoying and costly. But this is more about the batch-editing issues of other threads than about this topic here.

Also, with software whose core has been out there for decades, whose price was premium all along and was recently doubled, a user should not need many precautions (preferably almost none at all).

 

Sorry for sounding so pissed, because I appreciate that you want to help.

This is just not an issue that needs any help from the community, this is an issue that needs a fix by Adobe. (And I'm perfectly fine if this fix comes in 14.5 or .6. They brought something new, they have to gather feedback and learn and improve. Few things can be perfect from the beginning.)

Rob_Cullen
Community Expert
Correct answer
June 18, 2025

This quote is taken from the link to the Lightroom Queen's blog-

https://www.lightroomqueen.com/whats-new-in-lightroom-2025-06/

"when we tested them on a 24 MB raw file, applying Super Resolution created approximately 48 MB of extra data, while applying only Raw Details generated around 18 MB of extra data. Denoise had the lowest impact, creating only about 5 MB of extra data. In Lightroom Classic, the new pixel data is stored in the .lrcat-data file alongside the catalog. In Lightroom Desktop Cloud mode, it’s saved to the cloud database."

 

 

Regards. My System: Windows-11, Lightroom-Classic 15.0, Photoshop 27.0, ACR 18.0, Lightroom 9.0, Lr-iOS 10.4.0, Bridge 16.0 .
DonH (Author)
Known Participant
June 19, 2025

Thanks Rob, that's exactly what I was trying to figure out. I suspect the ratio of denoise data to raw file size will vary by raw file type, degree and type of raw compression, etc.

Don

C.Cella
Inspiring
June 19, 2025

A SuperRes image is way more than 2x the size of the RAW from which it originates.

RAW file: 36.2 MB
XMP, no edits: 44 KB
XMP with SuperRes: 162.2 MB

ACR saves Enhance data into a more efficient format, the .acr:

.acr with SuperRes: 129.7 MB

The big problem is that LrC is inefficient at writing into XMP, even on an SSD.
This is how long it took to save the SuperRes data into the XMP, from the moment of clicking "Save" to the moment the XMP was finally written.

 


LrC should adopt the .acr format to store this data; it is both faster to read and write (XMP requires all data to be encoded as text) and lighter than XMP.
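That text-encoding overhead is easy to demonstrate: XMP is XML, so binary payloads are typically serialized as base64 text, which inflates them by about a third before any other XMP bookkeeping. A quick illustration:

```python
import base64

# 1 MB of arbitrary binary data, standing in for an Enhance blob
blob = bytes(range(256)) * 4096        # 1,048,576 bytes
encoded = base64.b64encode(blob)       # text-safe representation for XML

overhead = len(encoded) / len(blob)
print(f"base64 inflates the payload to {overhead:.2%} of the original")
```

A binary container like .acr can store the bytes as-is, avoiding this ~33% inflation entirely.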

For DNGs, the SuperRes data is saved inside the file, in the inner sidecar, so DNGs become massive.

Still not as big as with baked-in SuperRes, though.

Looking at the .lrcat-data alone doesn't fully reveal the impact of this big data on our storage capacity and workflows.