Now that running AI Denoise in Lightroom Classic no longer creates a new DNG, I assume it has to be storing a large amount of data somewhere. Before I go crazy with Denoise (the results are often amazing!) and consume many GB of storage in my Lightroom library files, I wanted to get an idea of how much space this takes.
I have no way of measuring the library file size increase after a single operation, so I tried the same operation in Adobe Camera Raw (17.4) and found that denoising one 30.8MB Nikon D500 raw file created a new 6.1MB .acr sidecar file. So it appears that, at least in this case, Denoise created an additional file that is about 20% the size of the original file.
Does anyone know if Lightroom Classic does something similar? If it does, that means that running denoise on 5GB of raw files (only about 162 D500 files) will result in growing the library files by 1GB. This is better than a whole new DNG, but it isn't insignificant, and would be part of library storage, not raw file storage!
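The back-of-envelope estimate above can be sketched out explicitly. All figures here are just the numbers quoted in this post (one D500 raw and its .acr sidecar), not anything measured from Lightroom Classic itself:

```python
# Rough estimate of Denoise storage overhead, using only the figures
# quoted in this post (not from any Adobe documentation).

raw_size_mb = 30.8      # one Nikon D500 raw file
sidecar_size_mb = 6.1   # .acr sidecar created by ACR 17.4 after Denoise

overhead_ratio = sidecar_size_mb / raw_size_mb
print(f"per-file overhead: {overhead_ratio:.1%}")   # ~19.8%

# Projected growth for a 5 GB batch of raws at the same ratio
batch_gb = 5.0
projected_growth_gb = batch_gb * overhead_ratio
print(f"projected growth: {projected_growth_gb:.2f} GB")   # ~0.99 GB
```

Whether Lightroom Classic's per-image overhead matches ACR's sidecar size is exactly the open question in this thread; the ratio is only a starting assumption.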
As an additional question, will subsequently turning off (unclicking) denoise on an image in LRC eventually free up that additional storage requirement?
I use a Mac, but the question applies equally on Windows.
Thanks in advance for any insight!
Don
This quote is taken from the Lightroom Queen's blog:
https://www.lightroomqueen.com/whats-new-in-lightroom-2025-06/
"when we tested them on a 24 MB raw file, applying Super Resolution created approximately 48 MB of extra data, while applying only Raw Details generated around 18 MB of extra data. Denoise had the lowest impact, creating only about 5 MB of extra data. In Lightroom Classic, the new pixel data is stored in the .lrcat-data file alongside the catalog. In Lightroom Desktop C…"
That thread only partly covers the issue. Let me illustrate with my own experience. I have a catalog containing more than 400,000 images. When I back up the catalog, once per week, the zip file produced contains two files when expanded: 1) the catalog file, and 2) the catalog data file. Even with 15 years of work and 400k images, the catalog data file was only ever about 85MB in size. Having just processed a batch of files from a Fuji GFX 100SII, and batch AI denoised about 400 files (shot at ISO > 1200), that data file has now blown out to 36GB. It makes cloud backup much slower. The problem is that I now can't trim that file down at all. You can batch erase previews, but I can't see a way to batch erase the AI denoise data that is now clearly stored in that file. If this can't be resolved, I am stuck with a very large catalog data file that I can't fix. I would welcome any suggestions.
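For anyone trying to confirm which of the two files inside the backup zip is responsible for the growth, a quick check is to list the member sizes without extracting anything. This is just a generic sketch using Python's standard `zipfile` module; the path is a placeholder:

```python
import zipfile

def backup_member_sizes(zip_file):
    """Return (filename, uncompressed size in bytes) for each member of a
    Lightroom backup zip, largest first, without extracting anything."""
    with zipfile.ZipFile(zip_file) as zf:
        return sorted(((i.filename, i.file_size) for i in zf.infolist()),
                      key=lambda pair: pair[1], reverse=True)

# Example (path is a placeholder):
# for name, size in backup_member_sizes("/path/to/Backup.zip"):
#     print(f"{size / 1024**3:8.2f} GB  {name}")
```

If the lrcat-data member dominates the listing, that confirms it is the AI blob store, not the catalog itself, driving the backup size.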
@Paul McMurrick When saving into XMP, a copy of the XMP data is stored in the catalog.
Usually this is very lightweight, but for Enhance that is not the case.
• Optimise the catalog and the XMP data in the catalog will be trashed, returning it to a more moderate size.
As I said, I recommend not saving any Enhance data into the XMP; just keep it in the VC until the very last moment, when it will need to be made available in the XMP.
Thanks. Tried this; it made no difference. The lrcat-data file is still 35GB.
@Paul McMurrick Sorry, I thought you meant your catalog, not the lrcat-data.
The lrcat-data is where the actual "image" of the Denoise is stored.
Check the first part of the thread, where this is all explained.
No, I was referring to the lrcat-data. I ended up deleting that data file and starting again. It contains the data for ANY AI feature in the catalog, so not just Denoise but also the data for adaptive color. It very quickly assumes an enormous size. This is an extremely clumsy implementation by Adobe. They have to sort this out quickly.
This is an extremely clumsy implementation by Adobe. They have to sort this out quickly
By @Paul McMurrick
To make all the "Enhance Images" and AI masks available, they must be stored somewhere... otherwise we would need to generate all of them anew, all the time.
@Jao vdL explained it already.
So ACR stores them in the .acr sidecar, since it has no database.
LrC has a database and stores them in the lrcat-data.
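If you want to see for yourself where the space goes, and the lrcat-data on your system is a single SQLite file (reports in this thread differ on whether it appears as a file or a folder), it can be opened read-only with standard tooling. This sketch assumes nothing about Adobe's internal schema; it just lists whatever tables exist and computes the database size from its page count:

```python
import sqlite3

def inspect_sqlite_db(db_path):
    """Open a SQLite database read-only; return its table names and the
    total size in bytes implied by page_count * page_size."""
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        page_count = conn.execute("PRAGMA page_count").fetchone()[0]
        page_size = conn.execute("PRAGMA page_size").fetchone()[0]
        return tables, page_count * page_size
    finally:
        conn.close()

# Example (path is a placeholder; never open it while LrC is running):
# tables, size_bytes = inspect_sqlite_db("/path/to/Catalog.lrcat-data")
```

Read-only mode (`mode=ro`) is deliberate: poking at a live Lightroom database with write access risks corruption.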
Yes, but the problem is that it includes this file in the automated backup, which many of us back up to cloud services. Data files of that size rapidly become problematic. Surely it would be better not to include this in the automated backup, but to be able to restore it from the catalog if required; otherwise a fresh copy of this enormous file gets uploaded every week.
Be assured, I understand how it works. I just hate the implementation.
@Paul McMurrick What do you mean, "it includes this file in automated backup"?
I use Backblaze and I can choose to exclude specific files or folders from the backup.
I filed a feature request to store sidecar data more efficiently to address this issue:
Are the AI enhanced images stored in the catalog forever, or are they like previews that get cleaned up after a while?
I guess there would be no harm deleting them after a while, other than the cost of recomputing them; since the source data still exists, Lightroom should be able to re-compute them if needed.
I don't see why they should end up in the XMP; there should be an option to prevent that, or it should just never happen at all.
@thomas5CED the AI Masks, the Generative Remove and the Enhance data are NOT stored in the catalog but in the lrcat-data, as .blobs.
The LrC catalog only has a reference to them so they can be loaded; there is no need to recompute them.
If you delete AI Masks, Generative Removes, or Enhances, then they will be deleted from the lrcat-data, I believe upon quitting LrC, as part of a background optimisation of the lrcat-data database.
If you still keep history, then I believe they will not be deleted, as the history steps will have references to them.
This I need to verify, but I believe that any .blob is kept as long as there's a reference to it in the catalog.
You can of course delete them, but then you will not only have photos that appear different or incomplete, you will also have to spend time recomputing them.
Frankly, worry not about the size of the lrcat-data.
If you have to worry about something, it is XMP size that will be problematic, as storing Enhance data in XMP will make saving keywords, titles and captions in the sidecar even 10x slower.
I don't agree that you can "frankly not worry" about the size of the lrcat-data. When you set a backup destination for your catalog, it creates a zip file that, when expanded, contains the catalog and the lrcat-data. Hence, you can't ask Backblaze or Dropbox to selectively back up one or the other. So, in the two weeks since I first started using AI Denoise and adaptive color profiles in earnest, my zip file has gone from about 2GB to 20 times that size, and I have to wait for Dropbox to shift about 40GB of data each time I back up. That is with one week of work. Imagine what this will be like in a year, particularly if you are using medium format files. It is a disaster in terms of data management. Have a look yourself in your catalog settings.
Has the .lrcat file significantly increased in size or is it just the contents of the .lrcat-data folder?
If you save to xmp with the new Enhance, the .lrcat file quickly gets massively bloated because the xmp data is also saved inefficiently (and I believe unnecessarily) in the catalog database.
Just the data file. The .lrcat file is fine.
@Paul McMurrick you will have to copy only the catalog to Dropbox, and not the lrcat-data.
The LrC backup process is bugged and inefficient when either the catalog OR the lrcat-data becomes big.
So from now on, even if it is perhaps annoying, it is best to just copy and paste manually to Dropbox.
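The manual copy described above can be scripted. This is only a sketch of the workaround, not an Adobe-supported mechanism: it copies just the .lrcat file into a dated subfolder and deliberately skips the lrcat-data (all paths in the example are placeholders, and the trade-off from earlier in the thread still applies — the AI blobs are not backed up this way):

```python
import shutil
from datetime import date
from pathlib import Path

def backup_catalog_only(catalog_path, backup_root):
    """Copy only the .lrcat file into a dated subfolder of backup_root,
    deliberately skipping the (potentially huge) lrcat-data."""
    catalog = Path(catalog_path)
    dest_dir = Path(backup_root) / date.today().isoformat()
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / catalog.name
    shutil.copy2(catalog, dest)   # copy2 preserves timestamps
    return dest

# Example (paths are placeholders):
# backup_catalog_only("/Pictures/Lightroom/Catalog.lrcat", "/Dropbox/LrBackups")
```

Only run this while Lightroom Classic is closed, so the catalog file is not mid-write when copied.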
@Paul McMurrick for years we have asked the LrC team to let us decide what to include in the backup process.
For instance, many would also want the Settings folder to be included, as it keeps all our valuable presets.
So upon backup we could choose to include:
lrcat-data
Settings (folder)
I personally stopped using the LrC backup process.
@Celia TBH I think that you are blindly supporting Adobe here, when clearly they have screwed this up. The automated backup system has worked well for more than a decade. You explained above how you have used Backblaze to back up off-site. It is now a train wreck. Call it for what it is.
Just dragging the catalog file across is no good. Try deleting the data file, then try to reconstruct it and see how you go. All AI edits are lost. It needs to be backed up, but not as a bloated mess.
@Paul McMurrick if the team had actually found a way to store the Enhance data in a compact way, we would have it already.
We would not be here having this conversation.
But for now the data is big, which is a common problem/reality in the digital world.
There is no "screw up" here as far as the lrcat-data goes.
The .blobs in the lrcat-data database are as compact as they can get.
The Enhance data in the XMP and .lrcat is instead even bigger.
That is a problem, but I fear it won't be addressed anytime soon.
Disagree. They need to urgently develop a data management system for AI files the way they have for previews, i.e. batch produce, batch delete, auto-delete after producing deliverable JPEGs, etc. Otherwise, the catalog files will grow exponentially, almost as fast as the library.
> @thomas5CED the AI Masks, the Generative Remove and the Enhance data are NOT stored in the catalog but in the lrcat-data, as .blobs
It's still consuming space 🙂 What makes this data different from previews, which are stored up to a size limit and then cleaned up? My point is that this generated data can be re-generated, so why keep it around indefinitely?
> You can of course delete them, but then you will not only have photos that appear different or incomplete, you will also have to spend time recomputing them.
Is that actually true? So LR will not just regenerate missing data automatically?
Yes, it appears true. There is no way I can identify to purge it through the software. You can just delete the lrcat-data file, but when I did that it didn't regenerate the file. You have to go back to each image and manually re-add the AI steps.
It is a disaster. For anyone with large files who uses AI, for any reason, this file grows exponentially. Adobe needs to develop management methods as they have with preview files, i.e. batch creation, batch delete, etc.
Not tested yet, but shouldn't it work to:
I do agree that I would prefer the same kind of treatment as we have with previews. I.e. it would be nice if the AI data got its own folder and/or subfolder in the lrcat-data folder, and in the UI we got a simple "clean all AI info" or "remove AI information from selected photos".
I discovered that, unfortunately, removing any trace of Enhance will not make LrC trash the blob in the lrcat-data.
This was not expected.
I filed a feature request to clean up and re-generate AI blobs: https://community.adobe.com/t5/lightroom-classic-ideas/ai-blob-storage-size-should-be-limited-and-au...