Participating Frequently
May 9, 2023
Question

Lightroom Denoise adding random blocks and artifacts to images

  • 18 replies
  • 19785 views

Can anyone help me or shed some light on why Lightroom Classic is adding these strange blocks and artifacts to my pictures once run through Denoise? No edits were applied prior to Denoise. The artifacts appear on other parts of the photo as well.

Lightroom is up to date, along with all drivers. My system shouldn't be the issue, but here are the details:
DELL XPS 17
NVIDIA GeForce RTX 3060 Laptop GPU
Driver 531.79
11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz
32 GB RAM

Plenty of storage space

 

I only have very basic knowledge of Lightroom so it might be an obvious setting somewhere, but if someone can help I'd be very grateful.

Images are .arw files

This topic has been closed for replies.

18 replies

New Participant
December 30, 2024

I had the same issue and went through a massive ordeal with Asus; they wouldn't fix it properly. Over months I tried: downloading new GPU drivers, rolling back to old GPU drivers, reinstalling every driver from the ASUS website, applying new thermal paste (three times in total), recording step-by-step videos of real-time artifacting from Blender renders and Lightroom/Photoshop Camera Raw crashes, adjusting Lightroom settings, resetting software preferences, performing a clean Windows 10 Pro install, adjusting color profiles, uploading numerous photos as proof to ASUS support, and sending the laptop to the repair center (which returned a false diagnosis, no help). Nothing worked. The GPU or CPU overheated while rendering 3D files for an animation. The laptop works, but I cannot reliably produce graphics content without worrying about crashes or destroyed images. The laptop is advertised as high-heat tolerant. Nope!

 

Asus W73G5T XH99

Nvidia Quadro RTX 5000 16GB

Intel Xeon E-2276M @ 2.80 GHz (up to 4.7 GHz)

64 GB RAM

4 TB SSD

nealethesnapper
New Participant
November 13, 2024

Not a fix as such, but it may help: I have found that the problem is less troublesome if I denoise in batches.

I may have a 5% failure rate on a batch of, say, 200 images (I shoot sports in low light).

Applying Denoise to single images has, to date, a 100% failure rate.

This has been the only fix that has somewhat worked for me.

Thus far I have:

Upgraded the OS (Sonoma 14.6.1).

Turned off the graphics card for LR.

Stood on one leg with my arm in the air.

Hopped on one foot and howled at the moon.

Burnt sage.

I don't think anything works, to be honest, and Adobe doesn't seem to care.

My new iMac arrives next week with an M4 chip, so hopefully the issue will have been addressed.

johnrellis
Brainiac
November 13, 2024

@nealethesnapper: "Upgraded OS [Sonoma 14.6.1. Turned off the graphics card for LR"

 

Many others have reported that versions of Mac OS from 14.4.1 through the last version of 14 didn't change the misbehavior. But there have been a fair number of reports that Mac OS 15 solves the problems with the AMD 5000-series GPUs on older Intel Macs (and a couple of reports that it doesn't).

 

Setting Preferences > Performance > Use Graphics Processor to Off worked around the display and editing artifacts many people saw on those GPUs with Mac OS 14. But Adobe, in its wisdom, refuses to have the AI commands (masking, Denoise, Lens Blur) obey the Use Graphics Processor setting, so that won't help you.

Participating Frequently
November 12, 2024

I am having the same artifact issue on an iMac. It is not consistent - sometimes it occurs, and sometimes it doesn't. Based on completely subjective observations, it appears to generate more artifacts as the noise reduction slider is increased. Restarting LR or the computer does not resolve the problem.

3.8 GHz 8-Core Intel Core i7

AMD Radeon Pro 5500 XT 8 GB

32 GB 2667 MHz DDR4

macOS 14.6.1 (23G93)

Any help in resolving this would be greatly appreciated.

 

johnrellis
Brainiac
November 12, 2024

"3.8 GHz 8-Core Intel Core i7

AMD Radeon Pro 5500 XT 8 GB

macOS 14.6.1 (23G93)

 

Restarting LR or the computer does not resolve the problem."

 

Apple released a buggy graphics driver for the AMD 5000-series GPUs in Mac OS 14.4.1. This causes display artifacts in normal editing and export, and for some users like you, problems with the AI commands.

 

Unfortunately, setting Preferences > Performance > Use Graphics Processor to Off helps with the display and editing artifacts but not with the AI commands, since Adobe in its wisdom refuses to provide an option to disable their use of the GPU.

 

You could try updating to Mac OS 15. Some have reported this fixes issues with the 5000-series GPUs, while a couple report it doesn't.

 

New Participant
May 28, 2024

I’m having an unusual problem with Denoise in Lightroom Classic 13.3.  It worked fine for me in earlier versions.  I’m using an Apple iMac with the latest Apple operating system.  The problem is that Denoise eliminates the noise in my photos, but it adds large square artifacts, some of solid colors, some filled with gibberish.  This happens pretty consistently and makes it almost impossible for me to use Denoise.  I would be happy to provide samples of the problem if that would help.  I welcome suggestions.

Brainiac
May 28, 2024

If you have an Intel iMac with AMD graphics, this is a known issue. There are also problems on Windows.

https://community.adobe.com/t5/lightroom-classic-discussions/lightroom-denoise-adding-random-blocks-and-artifacts-to-images/td-p/13780661

New Participant
May 19, 2024

Okay, this is a problem.  Why isn't Adobe putting out information as to how to resolve it?  Like everyone else, I used denoise and got white blocks!  So irritating that there's no explanation for us.

johnrellis
Brainiac
May 20, 2024

@sandyd40372345, please do the LR menu command Help > System Info and copy/paste the entire contents here so we can see exactly which versions of hardware and software LR thinks you're running and important LR options that are set.

New Participant
May 28, 2024

I am getting pink blocks! The following is my System info.

Lightroom Classic version: 13.3 [ 202405092057-40441e28 ]
License: Creative Cloud
Language setting: en-US
Operating system: Mac OS 14
Version: 14.5.0 [23F79]
Application architecture: x64
Logical processor count: 16
Processor speed: 3.8GHz
SqLite Version: 3.36.0
Built-in memory: 8,192.0 MB
Dedicated GPU memory used by Lightroom: 5,212.9MB / 8,176.0MB (63%)
Real memory available to Lightroom: 8,192.0 MB
Real memory used by Lightroom: 692.7 MB (8.4%)
Virtual memory used by Lightroom: 44,524.7 MB
Memory cache size: 311.8MB
Internal Camera Raw version: 16.3 [ 1863 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1388MB / 4095MB (33%)
Camera Raw real memory: 1418MB / 8192MB (17%)

Cache1:
Final1- RAM:352.0MB, VRAM:0.0MB, _I0A0753.CR3
Final2- RAM:355.0MB, VRAM:48.0MB, _I0A0753-Enhanced-NR-3.dng
Preview3- RAM:22.0MB, VRAM:0.0MB, _I0A0753-Enhanced-NR.dng
Final4- RAM:352.0MB, VRAM:0.0MB, _I0A0753-Enhanced-NR-2.dng
Preview5- RAM:22.0MB, VRAM:0.0MB, _I0A0751.CR3
Preview6- RAM:22.0MB, VRAM:0.0MB, _I0A0752.CR3
NT- RAM:1,125.0MB, VRAM:48.0MB, Combined:1,173.0MB

Cache2:
m:311.8MB, n:1,123.9MB

U-main: 76.0MB

Standard Preview Size: 5120 pixels
Displays: 1) 5120x2880

Graphics Processor Info:
Metal: AMD Radeon Pro 5500 XT
Init State: GPU for Export supported by default
User Preference: Auto

Application folder: /Applications/Adobe Lightroom Classic
Library Path: /Users/sharonfazio/Pictures/Lightroom/Lightroom Catalog-v13-3.lrcat
Settings Folder: /Users/sharonfazio/Library/Application Support/Adobe/Lightroom

Installed Plugins:
1) AdobeStock
2) Aperture/iPhoto Importer Plug-in
3) DxO PureRAW 3
4) DxO PureRAW 3 Importer
5) Flickr
6) Luminar Neo
7) Nikon Tether Plugin
8) Topaz Photo AI

Config.lua flags:

 

 

robcatphoto
New Participant
May 3, 2024

My turn... I'll need help if it's an AMD Radeon 5500 firmware update issue. Any and all help appreciated.

 

Report info here: seeing these blocks today...

 

Lightroom Classic version: 13.2 [ 202402141005-bf1aeb84 ]
License: Creative Cloud
Language setting: en-US
Operating system: Mac OS 14
Version: 14.4.1 [23E224]
Application architecture: x64
Logical processor count: 16
Processor speed: 3.8GHz
SqLite Version: 3.36.0
Built-in memory: 40,960.0 MB
Dedicated GPU memory used by Lightroom: 7,461.2MB / 8,176.0MB (91%)
Real memory available to Lightroom: 40,960.0 MB
Real memory used by Lightroom: 8,438.3 MB (20.6%)
Virtual memory used by Lightroom: 56,301.4 MB
Memory cache size: 291.3MB
Internal Camera Raw version: 16.2 [ 1763 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1528MB / 20479MB (7%)
Camera Raw real memory: 1541MB / 40960MB (3%)

Cache1:
Preview1- RAM:22.0MB, VRAM:0.0MB, _ATR4800.CR2
Preview2- RAM:22.0MB, VRAM:0.0MB, _ATR4801.CR2
Preview3- RAM:22.0MB, VRAM:0.0MB, _ATR4797-Enhanced-NR.dng
NT- RAM:66.0MB, VRAM:0.0MB, Combined:66.0MB

Cache2:
final1- RAM:248.0MB, VRAM:0.0MB, _ATR4801.CR2
final2- RAM:257.0MB, VRAM:270.0MB, _ATR4800.CR2
final3- RAM:248.0MB, VRAM:0.0MB, _ATR4797-Enhanced-NR.dng
T- RAM:753.0MB, VRAM:270.0MB, Combined:1,023.0MB

Cache3:
m:291.3MB, n:112.8MB

U-main: 100.0MB

Standard Preview Size: 2048 pixels
Displays: 1) 5120x2880

Graphics Processor Info:
Metal: AMD Radeon Pro 5500 XT
Init State: GPU for Export supported by default
User Preference: Auto

Application folder: /Applications/Adobe Lightroom Classic
Library Path: /Users/robertcatania/Pictures/Lightroom/Lightroom Catalog-v13.lrcat
Settings Folder: /Users/robertcatania/Library/Application Support/Adobe/Lightroom

Installed Plugins:
1) AdobeStock
2) Aperture/iPhoto Importer Plug-in
3) Flickr
4) Nikon Tether Plugin

Config.lua flags: None

 

 

johnrellis
Brainiac
May 3, 2024

@Carliewheeler

OS Version: macOS 14.4.1 (23E224)
Graphics Processor Info: Metal: AMD Radeon Pro 5500 XT - 8 GB

 

@robcatphoto:

Operating system: Mac OS 14
Version: 14.4.1 [23E224]
Metal: AMD Radeon Pro 5500 XT

It appears that Mac OS 14.4.1 includes a buggy graphics driver for the AMD 5300 and 5500 series graphics processors. Many people have reported other symptoms with that combination:

https://community.adobe.com/t5/lightroom-classic-bugs/p-gpu-develop-edit-view-amp-export-artifacts-after-14-4-1-update-amd-only-affects-cr-lrc-amp-lrd/idi-p/14563606

 

But the same bad driver might be causing your symptoms as well. 

 

One workaround for those other users is restarting LR. Does that (at least temporarily) cause Denoise to run correctly?

 

Another workaround for those users is to set Preferences > Performance > Use Graphics Processor to Off. Unfortunately, that won't help you, because Adobe inexplicably refuses to allow users to disable LR's use of the GPU for AI commands like Denoise.

New Participant
April 8, 2024

Mine has just started doing the same thing today.

 

Did you end up being able to fix yours?

johnrellis
Brainiac
April 8, 2024

@Carliewheeler, there are a number of different possible causes, most of which are fixable. To troubleshoot:

 

1. Post a full-resolution screenshot (not a phone pic) of what the entire photo with artifacts looks like in Develop.

 

2. What's the precise camera model?

 

3. Do the LR menu command Help > System Info and copy/paste the entire contents here so we can see exactly which versions of hardware and software LR thinks you're running and important LR options that are set.

New Participant
April 8, 2024

Thank you

It's been working perfectly until today, and I haven't changed anything other than updates (I think my Creative Cloud updated overnight).

I am using a Canon EOS R6.

 

Lightroom version: 7.2 x64 [ 20240214-0910-27200e9 ] (Feb 14 2024)
NGL Version: 1.36.0.9
WF Version: 6.2 3d4e7aa
VF Version: 1.0.135.7
HIL Version: 40409
CAI Version: adobe_c2pa/0.7.6 c2pa-rs/0.25.2

Operating system: macOS
OS Version: macOS 14.4.1 (23E224)
Application architecture: x64
Computer model: iMac20,1 / Intel(R) Core(TM) i7-10700K CPU @ 3.80GHz
Logical processor count: 16
Processor speed: 3.8 GHz
Built-in memory: 8,192.0 MB
Real memory available to Lightroom: 8,192.0 MB
Real memory used by Lightroom: 724.0 MB (8.8%)
Peak memory used by Lightroom: 1,613.0 MB
Memory cache size: 942.7 MB

Internal Camera Raw version: 16.2 [ 1763 ]
Maximum thread count used by Camera Raw: 9
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 639MB / 4095MB (15%)

Display: 5120x2880

Graphics Processor Info: Metal: AMD Radeon Pro 5500 XT - 8 GB
Graphics Processor Detail: loaded: Yes, supported: Yes, compute: Yes, init: I4_GPU4, hard: success, soft: success, al: Yes, dl: No
OS Media Capability: true

Application Folder: /Applications/Adobe Lightroom CC
Settings Folder: /Users/carliewheeler/Library/Application Support/Adobe/Lightroom CC
Library Folder: /Users/carliewheeler/Pictures/Lightroom Library.lrlibrary

 

 

New Participant
April 1, 2024

Don't know whether you ever resolved this issue, but for posterity's sake: I had this same issue (in addition, if I tried to denoise large batches of images, it would only make it through a dozen or so before I got an error message related to the GPU). What ended up working for me was changing a setting in Preferences. On a PC (not a Mac), I went to Edit > Preferences, chose the Performance tab, and where it says "Use Graphics Processor:" I switched it from "Auto" to "Custom" and made sure that all of the boxes were checked. This seems (so far) to have solved the issue for me. I just ran a batch of 45 images, and then another batch of 98 images to test it out; not only did I not have any of the issues I previously had, it also completed the task somewhat faster than before.

My hypothesis is that when the setting is on Auto, part of the workflow is a per-image decision about whether to use the CPU or GPU, and sometimes making that decision while the image is being processed leads to these errors. Take the decision out of the equation by forcing the GPU, and the issue goes away. I don't know if that is correct, but so far it seems to have worked.

New Participant
April 1, 2024

Also, to clarify, my machine is a Lenovo Legion 5i with an Nvidia GeForce 4070. The issue with the image artifacts was that large squares or very small white spots appeared in the finished image. In addition, as stated, if I ran a large batch, the process would terminate partway through with a GPU error.

Participating Frequently
July 14, 2023

I'm still interested to know if only Dell XPS 17 9710 and Nvidia GeForce RTX 3060 laptop GPUs are affected by the problem or if other systems/graphics cards are affected as well. Therefore, it would be nice if someone with other computers and other graphics cards having the same problems would report here.

Casga_Author
Participating Frequently
July 15, 2023

I'm guessing this is going to be just a Dell 9710 / RTX 3060 issue.

I've searched for quite a while and can't find anything like this anywhere else.

I just had a BSOD while running Denoise, and the error was Video TDR Failure. I can't remember seeing this error before. Maybe I have, but it's been a while, and I've seen a lot of crashes and lockups, so I might have forgotten.

 

But does this error mean that the GPU is being overworked or overheating? Surely running Denoise shouldn't be an issue for this system.
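[Editor's aside, not from the thread: "Video TDR Failure" refers to Windows' Timeout Detection and Recovery mechanism, which resets the display driver when a GPU operation exceeds a timeout (2 seconds by default). A commonly cited, unofficial workaround for long-running GPU compute tasks such as Denoise is to lengthen that timeout via Microsoft's documented `TdrDelay` registry value. The 60-second value below is an illustrative choice, not a recommendation; back up the registry before applying, and note that a longer timeout only masks the symptom if the real cause is a driver bug or overheating. A reboot is required for the change to take effect.]

```
Windows Registry Editor Version 5.00

; Raise the GPU timeout from the 2-second default to 60 seconds (0x3c).
; Key: HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers
; Value: TdrDelay (REG_DWORD, seconds)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000003c
```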

Ian Lyons
Adobe Expert
July 15, 2023

In the past, companies such as Dell have shipped graphics cards that use vendor-specific drivers. Not having used Windows in more than two decades, I've no idea if that's still the case, but it's still worth checking the Dell support site.

New Participant
July 14, 2023

This image was created on a new-build AMD Ryzen 7 5800X (8 cores/16 threads) with a Radeon RX 6600, using the non-gaming version of the graphics drivers. It was originally an HEIC file when sent to me. The problem started around the time of this thread, and no upgrades or changes were made to the PC's drivers before the problem started; the system was only weeks old at the time. Given that this is an AMD CPU and GPU, which are arguably current models if not the newest, I think that would rule out many of the driver and/or GPU concerns. I am also quite aware that hardware does fail even when new, but there are no other signs of failure in any other aspect of usage.

This is also a very conservatively built machine with no overclocking outside of XMP. Memory is 32 GB (2x16 GB DDR4-3200). There are few additions, and the software was all installed fresh with the build, no carry-over from another machine. The conventional air cooling is almost excessive, there are no unusual draws, and the power supply is more than adequate for the machine. Please see my last reply for some additional detail and questions; I forgot to include the example image previously.

I am an experienced computer technician. No one is infallible, and I don't pretend to be a guru, but I've been around long enough (30 years of professional hardware/software troubleshooting) to see that this has a 90% chance of being a Lightroom Classic bug. It could be due to its interaction with some other software, or limited to some specific image-processing steps; we need to get into the nitty-gritty details to work this out. We can now rule out Intel or Nvidia specificity to the error, and the age of the hardware and/or drivers appears not to be the issue either. Now how about photo-processing steps in these cases? What edits did you make prior to the issue in Lightroom Classic (if any)? Original resolution of the images? File type of the originals? Camera? Lens?
I am not the photographer in this case, or a photographer at all, but I will get this information from my client and post it ASAP as well.

GoldingD
Brainiac
July 14, 2023
quote

This was image was created on a new build AMD Ryzen 7 5800x (8 cores/16 threads)with Radeon RX6600 using the non-gaming version of the graphics drivers. It was originally a HEIC file when sent to me.


By @TVarkus

 

Fairly sure AI Denoise will not work on an HEIC file; it only works on full Bayer and Fuji X-Trans raw files (not compressed, nothing other than normal raw), and not on a DNG converted from a non-standard raw. (Not a bug, but something Adobe is probably working on; AI Denoise is very new, barely out of beta.)

 

Also, absolutely sure LrC's handling of non-iOS HEIC sucks. (Not a bug, just something Adobe has not addressed yet.)

Participating Frequently
July 14, 2023

I've been using Denoise on the Fuji losslessly compressed raw files from several bodies. I had the pink-line problem with the files from the X-T2, but they are gone with the update to LrC 12.4. I agree that Denoise wouldn't be expected to be an option for HEIC files; I certainly can't use it on my HEIC files from the iPhone.