
P: Automatically Write Changes To XMP can't be used with Denoise on large numbers of photos

LEGEND, Jun 28, 2025

The new implementation of Denoise in LR 14.4 has several bugs with Automatically Write Changes To XMP. As a result, it is not practical to use that option with large numbers of denoised photos.

 

- The Denoise data gets copied into the main .lrcat database file, making it two orders of magnitude larger and defeating the purpose of separating out computed AI data into the .lrcat-data file. Surely not intended by the original architects of the .lrcat-data file.

 

- Whenever LR is saving XMP for a large batch of denoised photos, e.g. after the initial application of Denoise or after applying a batch change to other Develop sliders, it isn't possible to switch photos in Develop until the saving completes, which can be minutes for large batches.

 

- After saving XMP has completed, spurious "Saving XMP for n photos" messages appear in the upper-left corner, even though the .xmp sidecars aren't actually being modified. The incorrect message itself is innocuous, but it likely is a symptom of more serious internal bugs.

 

To reproduce on LR 14.4 / Mac OS 15.5:

 

1. Download and open this LR catalog containing 100 raws to which Denoise has already been applied (beware, it's 2.8 GB):

https://www.dropbox.com/scl/fi/61bsi0xh7xdgpewnjt20u/denoise-xmp.2025-06-28.zip?rlkey=ipwzy7azjnhzoj...

 

Observe the sizes of the various files/subfolders:

 

denoise-xmp.lrcat: 4.1 MB

denoise-xmp.lrcat-data: 728 MB

pics (containing the raws): 2.1 GB
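To double-check these sizes without relying on Finder's rounded figures, something like the following sketch works (the helper names are mine, not part of LR or the catalog format):

```python
import os

def dir_size(path):
    """Total size in bytes of every file under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

def fmt_size(n):
    """Human-readable size, e.g. 763363328 -> '728.0 MB'."""
    for unit in ("B", "KB", "MB", "GB"):
        if n < 1024 or unit == "GB":
            return f"{n} B" if unit == "B" else f"{n:.1f} {unit}"
        n /= 1024

def report_catalog_sizes(catalog_dir):
    """Map each entry in the unzipped catalog folder to its size."""
    sizes = {}
    for entry in sorted(os.listdir(catalog_dir)):
        full = os.path.join(catalog_dir, entry)
        sizes[entry] = dir_size(full) if os.path.isdir(full) else os.path.getsize(full)
    return sizes
```

Running report_catalog_sizes on the unzipped folder before and after step 3 makes the .lrcat growth unmistakable.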

 

2. Open the first photo in Develop and make the filmstrip visible.

 

3. Set the option Catalog Settings > Metadata > Automatically Write Changes To XMP. Observe in the upper-left corner the message "Saving XMP for n photos".

 

4. Click on random photos in the filmstrip and observe that they won't appear in Develop Loupe view until the saving of XMP finishes (incorrect). (See the attached screen recording "saving.mp4".)

 

5. After the metadata is fully saved to all the photos, observe these sizes:

 

denoise-xmp.lrcat: 744 MB (incorrect)

denoise-xmp.lrcat-data: 728 MB (correct)

pics (containing the raws): 3.0 GB (correct)

 

Saving metadata has copied the Denoise data into the .xmp sidecars in the "pics" folder (correct), but it has also copied it into the main .lrcat catalog database (incorrect).

 

6. Open Finder on the "pics" subfolder and observe there are 100 .xmp sidecars, all larger than 6 MB, indicating that the metadata has been saved for all the photos (correct).

 

7. Sort the Finder window by Date Modified.

 

8. In LR Develop, select the first photo. Then use the right-arrow key to move to the next photo, about once per second. Observe that "Saving XMP for n photos" appears in the upper-left, with n increasing as you proceed through the filmstrip.

 

9. Stop using the arrow key and wait until "Saving XMP" disappears. Observe in the Finder window that no .xmp sidecars have been modified. (See the attached screen recording "navigating.mp4".)

 

10. Select the first photo in the filmstrip and set Exposure = -3. With all the photos selected, do Sync Settings and sync just Exposure.

 

11. Click random photos in the filmstrip and observe the same symptoms as in step 4 -- after the saving of XMP gets started, the photos won't appear in Develop Loupe until the saving finishes. (See the attached screen recording "exposure.mp4".)

 

 

 

Bug Unresolved
TOPICS: macOS, Windows
10 Comments
Engaged, Jun 28, 2025

John, thanks for your very thorough testing and explanation. I have two questions:

1.  It seems to me the type of problem you describe should have been caught by any reasonable product testing program.  Do you have any idea why Adobe misses stuff like this?

2. I do not use automatic writing to XMP, but I see lots of reports (most seem ill-informed) of major issues with the new denoise method. I have used the DNG-based denoise a lot, and have done a few with the new process. Do you have any suggestions on how the new process should be used and integrated into our workflow? Should we revert to the prior version to avoid issues?

Thanks again for your work on LrC.  

 

Community Expert, Jun 28, 2025

Are you sure the xmp files don't get actually updated when you are stepping through? Finder can be extraordinarily sluggish with updating the metadata columns. It certainly seems like Lightroom is saving something seeing how long it takes to count down the number of saves left. Often you have to close and reopen a folder to see any updates. 

Anyway, these are clearly bugs with the automatic xmp writing system. That has always been very buggy unfortunately.  

LEGEND, Jun 28, 2025

@Jao vdL: "Are you sure the xmp files don't get actually updated when you are stepping through? Finder can be extraordinarily sluggish with updating the metadata columns. It certainly seems like Lightroom is saving something seeing how long it takes to count down the number of saves left. Often you have to close and reopen a folder to see any updates."

 

During my testing I originally thought LR was actually writing the .xmp sidecars, so I quintuple-checked:

 

1. I waited for several minutes to see if Finder would update.

 

2. I switched the Finder window to another folder, waited, then back.

 

3. I quit Finder and restarted it.

 

4. When I selected a single photo and did Metadata > Save Metadata To File, Finder sorted the changed .xmp to the top of the list immediately.

 

5. I captured copies of the initial .xmp files and then diffed them with the current .xmp files, and there were no differences.
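The fifth check above can be automated so that it doesn't depend on Finder at all: snapshot a digest of every sidecar before the test, then again afterwards, and compare. A minimal sketch (function names are mine):

```python
import hashlib
import os

def sidecar_digests(folder):
    """Map each .xmp sidecar in the folder to its SHA-256 digest."""
    digests = {}
    for name in os.listdir(folder):
        if name.lower().endswith(".xmp"):
            with open(os.path.join(folder, name), "rb") as f:
                digests[name] = hashlib.sha256(f.read()).hexdigest()
    return digests

def changed_sidecars(before, after):
    """Sidecars whose digest differs, or that appeared/disappeared."""
    names = set(before) | set(after)
    return sorted(n for n in names if before.get(n) != after.get(n))
```

Snapshot the "pics" folder before step 8 and again after "Saving XMP" disappears; if the message is indeed spurious, changed_sidecars returns an empty list.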

LEGEND, Jun 28, 2025

"Do you have any suggestions on how the new process should be used and integrated into our work flow?"

 

Turn off Automatically Write Changes Into XMP. (I've done that, though I've always had the option on.)

 

If you're working with large batches of noisy photos (e.g. a photo shoot of a stage production or a sporting event, with high ISOs), rearrange your schedule to the degree possible to apply Denoise in batch immediately after import and find something else to do while it's running.   That's a minor inconvenience for me (e.g. when I shoot our scout troop's court of honor), but for high-volume professional photographers, that could still be costly in terms of their time, especially if they're on deadline.

 

"It seems to me the type of problem you describe should have been caught by any reasonable product testing program.  Do you have any idea why Adobe misses stuff like this?"

 

Adobe has long had a reputation for reducing their development costs to the absolute minimum needed to maintain their market dominance. For products like LR Classic that don't have serious competition (at least for the asset-management features), users don't have anywhere else to go, which means Adobe can further reduce development costs. Wall Street has long loved Adobe for its quarterly performance.

 

On the demand side, with many software categories, users show a preference for buying the cheaper product, even if it's lower quality. I'd be willing to pay twice as much for a higher-quality LR, but many users wouldn't.

Advocate, Jul 01, 2025

As I reported in another thread, the best workflow is to use a VC (virtual copy) and not save into XMP until the last moment.

E.g.:

.lrcat = 1.9 MB

Super Resolution applied to a VC of the ONLY image in the catalog (a GFX 100 raw):

.lrcat = 1.9 MB

Save the VC as a named "Super Res" snapshot... the Enhance data now has to be saved into XMP.

Save into XMP (this can take many, many seconds):

.lrcat = 203.4 MB

Now get rid of the snapshot and save into XMP again:

.lrcat = 203.4 MB

Optimise Catalog:

.lrcat = 1.9 MB

So use a VC and save XMP only when you desperately need the edits in the sidecar.


Enthusiast, Aug 25, 2025

In my opinion, it's just a total mess how Adobe has handled sidecars since the introduction of AI features.

When editing in ACR, it saves metadata to *.xmp and binary data to *.acr files.

Then when importing, LrC ignores *.acr files completely and AI masks need to be regenerated (I'm not on the latest version so can't test it there, but I doubt anything has changed) - why is that?

In LrC, saving metadata to sidecars saves everything including binary data into *.xmp - why is that, why not *.acr?

And in my opinion, putting this in order and making behaviour consistent across applications (I would say ACR's behaviour is preferable, with binary data kept separate) would solve a lot of related issues, probably including this one.

Advocate, Aug 25, 2025

@FSt0p

 

Upon import LrC reads the .acr sidecar.

 

 

E.g.:

1. In ACR, apply Denoise and an AI mask, AI Remove, or Distraction Removal.

2. Import into LrC.

 

Actual result: no AI update is needed.

 

If, following these steps, you need to update AI upon import, then you have a bug and should report it.

 


Enthusiast, Aug 25, 2025

As I've said - I'm on LrC 12, so maybe this has been changed.

Anyway, there is nothing to report, as Adobe only accepts reports for the latest version.

 

But the point that LrC does not create *.acr files for binary data when saving sidecar files still stands.

Why is the behaviour different from ACR's?

 

This leads to the issue described in this thread.

Because reading *.xmp files back then brings all that binary data (base64-encoded) from the .xmp sidecar into the main catalog, and not into .lrcat-data as it should.
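A rough way to see how much of a sidecar is embedded binary rather than ordinary parametric metadata is to look for long base64 runs. A sketch (the 1 KB threshold and the regex are my own assumptions, not anything Adobe documents):

```python
import re

# Runs of base64-alphabet characters at least 1 KB long -- a crude
# proxy for embedded binary payloads (masks, patches, denoise data).
BLOB_RE = re.compile(r"[A-Za-z0-9+/=\n]{1024,}")

def base64_bulk(xmp_text):
    """Approximate number of characters in large base64 runs."""
    return sum(len(m.group(0)) for m in BLOB_RE.finditer(xmp_text))

def blob_fraction(xmp_text):
    """Fraction of the sidecar occupied by those runs."""
    return base64_bulk(xmp_text) / len(xmp_text) if xmp_text else 0.0
```

On the 6 MB sidecars from the repro catalog, one would expect the blob fraction to be close to 1, consistent with the binary Denoise data dominating the file.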

LEGEND, 6 hours ago

The road to software hell is paved with little expediencies that seemed harmless to the developers at the time.  For the current mess, the expedient paving stones are:

 

- Using the .lrcat-data file as both a store for the develop parameters and as a discardable cache of develop results for enhancing performance.

 

- Storing both develop parameters and cached results in .xmp sidecars, rather than separating them into .xmp and .acr sidecars as Camera Raw does.

 

- When LR saves metadata to disk, it first copies settings and cached results from .lrcat-data into the .lrcat file.

 

- LR stores no random seed for the random algorithms of Generative AI Remove, so its generated replacement patches must be considered as develop settings rather than as discardable computed results that are cached for performance.

 

I think all of these expedient paving stones have resulted in unnecessary architectural complications, making it harder for the developers to think clearly about the proper design and implementation of Denoise and other expensive AI commands.

 

DETAILED HISTORY

 

  1. Originally, all develop settings were stored only in the SQL database (.lrcat file). The stored settings were all "parametric", describing the function to be applied to the image (e.g. slider settings or brush strokes), rather than containing the resulting pixels of the function.

 

  2. LR 11.0 introduced the computed AI masks Subject and Sky.  It also introduced the .lrcat-data file for storing the computed pixels of these masks as well as LUTs (color lookup tables) from imported enhanced profiles.

 

The pre-version-11 code for saving metadata to disk found all the develop settings in the .lrcat file. But version 11 stored the subject/sky mask pixels as well as profile LUTs in the .lrcat-data file, with the express purpose of making the .lrcat file smaller and making it faster for the Camera Raw engine to access these relatively large binary objects ("blobs").

 

Rather than changing the save-metadata-to-disk code to read those settings and results from the .lrcat-data file, the developers found it more expedient to copy them from the .lrcat-data file into the .lrcat file before invoking the existing save-metadata-to-disk code. This expediency didn't create huge issues, though, since the pixel masks and LUTs compress well and are relatively small.

 

Another architectural confusion introduced in LR 11.0: The .lrcat-data file and .xmp sidecars are used both as a cache for storing the computed results of the sky and subject masks and as a store of the develop parameters of enhanced-profile LUTs. 

 

The pixels of the sky and subject masks are not develop parameters -- rather, they're computed from the parameters (the presence of the masks in the Masking panel). You can throw away the computed pixel masks and LR will recompute the masks, producing the same exact pixels.  This is exactly how all the previously introduced Develop settings work.

 

Thus, you can throw away the .lrcat-data file and .xmp sidecars, and the sky and subject masks will produce the same exact results. For these settings, the .lrcat-data file is simply a cache for accelerating performance.

 

But for the LUTs of enhanced profiles, the .lrcat-data is the store of the "truth", in the same way the .lrcat file stores the truth of all the other develop settings.  If you throw away the .lrcat-data file and the enhanced profiles' .xmp files, LR isn't able to recompute the effects of the enhanced profiles because the LUTs are missing (all the other parameters of the enhanced profiles are stored in the .lrcat file). This is unlike develop presets, where you can throw away the presets' .xmp files and LR will still be able to render cataloged images, because the presets' settings are stored in the .lrcat file.

 

  3. LR 14.0 introduced Generative AI Remove. One might think that the parameters of AI Remove are simply the brush strokes applied by the user and the index number of the chosen variant, both of which are small and can be stored in the .lrcat file without huge impact (as the strokes of the brush mask are stored).

 

However, Generative AI Remove is not mathematically idempotent, unlike all other develop settings. That is, if you reapply the exact same brush strokes to an image, the Adobe Firefly server will return different results each time. The Firefly AI algorithms have randomness built into them, as many image-processing algorithms do. In a better implementation, that randomness would be controlled by a "seed" number provided by clients like LR, so they could recompute the exact same results later; but that's not how Firefly is implemented, or if it is, LR isn't using the seeds, another design flaw.
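The seed point is easy to illustrate with a toy stand-in (this is not how Firefly works internally, just the general principle of client-supplied seeds):

```python
import random

def generate_patch(width, height, seed):
    """Toy stand-in for a generative fill: pseudo-random pixel values.

    Because the caller supplies the seed, the same (strokes, seed)
    pair can be replayed later to reproduce identical output, which
    would let the patch be treated as a discardable cached result.
    """
    rng = random.Random(seed)  # deterministic per-seed stream
    return [[rng.randrange(256) for _ in range(width)] for _ in range(height)]
```

With a stored seed, recomputation yields bit-identical patches; without one, every recomputation is a different image, which is exactly why LR must treat the patches as parameters.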

 

Thus, the computed replacement patches of AI Remove must be considered as parameters necessary to render the image the same way each time, not as cached results. And since those computed patches are stored in the .lrcat-data file and .xmp sidecars, the files serve as the only store for Remove parameters, rather than as a performance cache that can be discarded.

 

Users of the Imagen AI service discovered this the hard way when Imagen AI corrupted their .lrcat-data files. The only fix was to discard the corrupted .lrcat-data files, resulting in different Generative AI Remove results for existing cataloged photos. (This was primarily Imagen AI's fault for writing their results directly into the LR catalog rather than using the SDK APIs -- shame on them.)

 

  4. And then LR 14.4 introduced the new implementation of Denoise, in which the computed whole-image results are stored in the .lrcat-data file. These results are huge, typically 5 to 50 MB per image, depending on the degree of detail in the image, two to three orders of magnitude larger than all the other develop settings and computed results.

 

But the Denoise results are not develop parameters -- they can be discarded and Denoise will recompute the same exact results. The only parameters are the checkbox and the slider. But because it takes so long to compute Denoise, it's crucial that its results get cached for performance.
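The distinction described above can be sketched minimally: results keyed by the image content and the slider value, so the cache can be thrown away at any time without changing what renders. Names and structure here are mine, purely illustrative:

```python
import hashlib

class DenoiseCache:
    """Discardable result cache keyed by (image digest, amount).

    Toy model: because `compute` is deterministic, dropping the
    cache only costs recomputation time; it never changes the
    rendered result.
    """
    def __init__(self, compute):
        self._compute = compute  # deterministic (image_bytes, amount) -> result
        self._store = {}

    def get(self, image_bytes, amount):
        key = (hashlib.sha256(image_bytes).hexdigest(), amount)
        if key not in self._store:
            self._store[key] = self._compute(image_bytes, amount)
        return self._store[key]

    def discard(self):
        self._store.clear()  # safe: results are recomputable
```

This is the property the sky/subject masks already have; the Denoise results would have it too if they were kept out of the parameter store.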

 

Up until Denoise, it didn't matter much that saving metadata to disk copied develop settings from .lrcat-data into .lrcat before writing them to the .xmp sidecars. The needlessly copied settings were relatively small. But the huge size of Denoise results, combined with saving metadata to disk, bloats the .lrcat file, negating the original purpose of the .lrcat-data file.

 

And if LR reserved .xmp sidecars for parameters and .acr sidecars for computed results, as Camera Raw does, the LR developers might have thought more clearly about the architectural principles sooner and implemented saving metadata to disk properly.

Advocate, 5 hours ago

And just to note, the backend problems are made worse by both poor UI choices and by outright bugs in both ACR and Lightroom.
