johnrellis
Legend
June 28, 2025

P: Automatically Write Changes To XMP can't be used with Denoise on large numbers of photos

  • June 28, 2025
  • 27 replies
  • 3644 views

The new implementation of Denoise in LR 14.4 has several bugs with Automatically Write Changes To XMP. As a result, it is not practical to use that option with large numbers of denoised photos.

 

- The Denoise data gets copied into the main .lrcat database file, making it two orders of magnitude larger and defeating the purpose of separating computed AI data into the .lrcat-data file. Surely this wasn't intended by the original architects of the .lrcat-data file.

 

- Whenever LR is saving XMP for a large batch of denoised photos, e.g. after the initial application of Denoise or after applying a batch change to other Develop sliders, it isn't possible to switch photos in Develop until the saving completes, which can be minutes for large batches.

 

- After saving XMP has completed, spurious "Saving XMP for n photos" messages appear in the upper-left corner, even though the .xmp sidecars aren't actually being modified. The incorrect message itself is innocuous, but it likely is a symptom of more serious internal bugs.

 

To reproduce on LR 14.4 / Mac OS 15.5:

 

1. Download and open this LR catalog containing 100 raws to which Denoise has already been applied (beware, it's 2.8 GB):

https://www.dropbox.com/scl/fi/61bsi0xh7xdgpewnjt20u/denoise-xmp.2025-06-28.zip?rlkey=ipwzy7azjnhzoj6ctooz3r1r5&dl=0

 

Observe the sizes of the various files/subfolders:

 

denoise-xmp.lrcat: 4.1 MB

denoise-xmp.lrcat-data: 728 MB

pics (containing the raws): 2.1 GB
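(For anyone verifying these numbers, a few lines of Python can total a file's or folder's size. `path_size` is just an illustrative helper, not part of any Lightroom tooling.)

```python
import os

def path_size(path):
    """Total size in bytes: the file itself, or every file under a folder."""
    if os.path.isfile(path):
        return os.path.getsize(path)
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

# Example: path_size("denoise-xmp.lrcat"), path_size("pics")
```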

 

2. Open the first photo in Develop and make the filmstrip visible.

 

3. Set the option Catalog Settings > Metadata > Automatically Write Changes To XMP. Observe in the upper-left corner the message "Saving XMP for n photos".

 

4. Click on random photos in the filmstrip and observe that they won't appear in Develop Loupe view until the saving of XMP finishes (incorrect). (See the attached screen recording "saving.mp4".)

 

5. After the metadata is fully saved to all the photos, observe these sizes:

 

denoise-xmp.lrcat: 744 MB (incorrect)

denoise-xmp.lrcat-data: 728 MB (correct)

pics (containing the raws): 3.0 GB (correct)

 

Saving metadata has copied the Denoise data into the .xmp sidecars in the "pics" folder (correct), but it has also copied it into the main .lrcat catalog database (incorrect).
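Since .lrcat files are SQLite databases, one way to see which table ballooned is to total the column payload per table. This is a rough sketch that assumes nothing about Lightroom's schema (quit Lightroom and work on a copy of the catalog first); the exact-size `dbstat` virtual table isn't always compiled into SQLite builds, so this approximates instead.

```python
import sqlite3

def table_payload_bytes(db_path):
    """Rough per-table payload: sum of column byte lengths over all rows."""
    con = sqlite3.connect(db_path)
    sizes = {}
    for (table,) in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"):
        cols = [row[1] for row in con.execute(f'PRAGMA table_info("{table}")')]
        expr = " + ".join(f'ifnull(length("{c}"), 0)' for c in cols) or "0"
        sizes[table] = con.execute(
            f'SELECT ifnull(sum({expr}), 0) FROM "{table}"').fetchone()[0]
    con.close()
    return sizes

# Example: sorted(table_payload_bytes("denoise-xmp.lrcat").items(),
#                 key=lambda kv: -kv[1])[:5]
```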

 

6. Open Finder on the "pics" subfolder and observe there are 100 .xmp sidecars, all larger than 6 MB, indicating that the metadata has been saved for all the photos (correct).
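Step 6 can also be scripted. A minimal sketch (the 6 MB threshold and the expectation of 100 sidecars come from the steps above; the function name is illustrative):

```python
import os

def large_sidecars(folder, min_bytes=6 * 1024 * 1024):
    """Return the .xmp sidecar filenames in `folder` of at least `min_bytes`."""
    return sorted(
        name for name in os.listdir(folder)
        if name.lower().endswith(".xmp")
        and os.path.getsize(os.path.join(folder, name)) >= min_bytes
    )

# Expect 100 entries for the "pics" folder in this catalog.
```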

 

7. Sort the Finder window by Date Modified.

 

8. In LR Develop, select the first photo. Then use the right-arrow key to move to the next photo, about once per second. Observe that "Saving XMP for n photos" appears in the upper-left corner, with n increasing as you proceed through the filmstrip.

 

9. Stop using the arrow key and wait until "Saving XMP" disappears. Observe in the Finder window that no .xmp sidecars have been modified. (See the attached screen recording "navigating.mp4".)

 

10. Select the first photo in the filmstrip and set Exposure = -3. With all the photos selected, do Sync Settings and sync just Exposure.

 

11. Click random photos in the filmstrip and observe the same symptoms as in step 4 -- after the saving of XMP gets started, the photos won't appear in Develop Loupe until the saving finishes. (See the attached screen recording "exposure.mp4".)

 

 

 

27 replies

Participating Frequently
November 7, 2025

My question to the author of this thread regarding the issue got a bit lost in a nice off-topic discussion... sorry 🙂

DdeGannes
Community Expert
November 4, 2025

@_wrzl_, Johnrellis, the author of this bug thread about "Automatically Write Changes To XMP", indicated above that the issue raised has been rectified and stated:

" It's now safe again to enable Catalog Settings > Metadata > Automatically Write Changes to XMP. 
(We can conduct the religious arguments about whether to do that in another thread.)"

Thanks for the info John.

 

 

 

Regards, Denis: iMac 27” mid-2015, macOS 11.7.10 Big Sur; 2TB SSD, 24 GB RAM, GPU 2 GB; LrC 12.5; Lr 6.5; PS 24.7; ACR 15.5; (also Laptop Win 11, ver 24H2, LrC 15.0.1, PS 27.0) Camera Oly OM-D E-M1.
Participating Frequently
November 4, 2025

Interestingly, the "needs AI processing" state or "missing AI cases" you're describing already exist. I was thinking of adopting denoising generally and having all images denoised in advance, so I could later quickly decide whether I need any denoising, and if so at which strength, without having to wait.

So I denoised a whole bunch of images by copying the AI denoise settings to them all. After some waiting I then decided to do manual editing first instead and therefore cancelled the batch AI processing, yet the markers for the individual pictures had already been set to denoising by the copy-settings process. Now there was no way (that I could find, at least) to show in the Library which of the images still actually needed processing with the AI pipeline and which were already done.

If you open a single image in the editor, a little flag / circle arrow at the top right lists all the AI processing applied to the picture and shows whether it still needs to be processed (or reprocessed), but you have to open each image to see whether that is the case, which is far from ideal. You can filter the images in the catalog by AI denoising being applied to them, but you can't see whether they are already done processing or just marked for it.

If it were easy to filter for "these images still need their AI processing applied", you could simply mark them for denoising at the desired level and do the processing later, without taking up any storage space (for denoising) in the meantime. This would also fit with my idea of removing the processed denoise data and going back to the "needs AI processing" state, which you can already create today, e.g. by cancelling the active denoise processing; only this currently leads to rather poorly manageable intermediate states you cannot filter for and simply process later if necessary.

C.Cella
Legend
November 2, 2025

@_wrzl_

 

Deleting AI images in lrcat-data while keeping the Denoise filter in the catalog essentially means creating missing-AI case(s).

 

As of now, not only will users not know which files have Denoise data missing, since LrC doesn't allow finding them (I can find them with my own code), but users will also have to waste time regenerating the data.

 

Any user can already manage lrcat-data by removing Denoise from the photo.

Any user can take the denoised files > remove the Denoise > put them in a "To Denoise again" collection.

This doesn't leave any missing AI data; this is safe.

 

• Now Adobe recommends following an order of edits.

The Enhance filter is the 2nd edit we are supposedly meant to do.

 

I don't follow that rule and apply Enhance last.

The main advantage of this is that I have no (or fewer) performance problems when doing masking or any other edit, AND my lrcat-data stays light until the very last moment Enhance is needed.

 

Enhance data for me exists only during the time I do the final touches and export.

 

So IMO the simplest thing is to adapt the workflow and, contrary to what Adobe suggests, do Enhance last.

This ensures best performance, a compact lrcat-data, and even smaller .acr sidecars.

Participating Frequently
November 2, 2025

I agree that not all AI data in lrcat-data should be trashed (because the current implementation is lacking). That's why I am talking specifically about the AI denoise data, since it is pretty much the biggest factor in space requirements, and it can be restored 1:1 by storing a simple number from 1 to 100. John explained this beautifully in his post.

C.Cella
Legend
November 2, 2025

@_wrzl_ 

lrcat-data does not only store data that can be regenerated identically every time.

 

Generative Remove results, once trashed, are lost; LrC can't regenerate them identically.

 

So it would be a very bad idea to have a mechanism that clears the lrcat-data using "older than X days" logic like the one in place for trashing previews.

 

 

 

Participating Frequently
November 2, 2025

Currently the processed AI data is massively increasing the size of the catalog backup over time, and thereby the processing time for every backup.

Yes, you usually don't want to reprocess the denoise data, but since the full denoise data can be recreated 1:1 without storing the processed result, there could be an option to decide whether you even want to store it inside the catalog, especially if you're writing the data into XMP files, which you could then store with the original files instead of within the catalog. So when a file is opened in LR, LR would check whether there is AI data in a .xmp file; if not, whether it is in the lrcat-data; and only if neither is present would it ask to recreate the AI data.
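The lookup order being proposed here can be sketched in a few lines. All four callable names below are hypothetical stand-ins for the suggested decision logic, not an actual Lightroom API:

```python
def find_denoise_data(photo_id, read_xmp, read_lrcat_data, ask_to_recompute):
    """Proposed precedence: .xmp sidecar first, then the lrcat-data cache,
    and only if both are missing, prompt to recreate the AI data.
    (All parameter names are illustrative, not Lightroom internals.)"""
    data = read_xmp(photo_id)
    if data is not None:
        return data
    data = read_lrcat_data(photo_id)
    if data is not None:
        return data
    return ask_to_recompute(photo_id)
```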

For previews you can decide how long (or whether) something is stored, but the AI denoise data is stored forever; the only real difference is how long a preview takes to create compared with denoise data.

C.Cella
Legend
November 2, 2025

@_wrzl_

Why would you want an option to NOT store the data in lrcat-data ?

 

In what case is it desirable to always have to recompute the AI at every session?

Participating Frequently
November 2, 2025

As far as I understood, this only concerns the actual .lrcat file, which might have been bloated due to the bug when using XMP storage? There is still no way to decide not to store the AI denoise data that currently goes into the lrcat-data folder, which in a sense is just a cache to avoid redoing the processing?

Stephan Preining
Participant
October 29, 2025

Works perfectly now! Thanks a lot!