Can anyone help me or shed some light on why Lightroom Classic is adding these weird blocks and artifacts to my pictures after running Denoise? No edits were applied prior to Denoise, and the blocks appear on other parts of the photo as well.
Lightroom is up to date, along with all drivers. My system shouldn't be the issue, but here are the details:
DELL XPS 17
NVIDIA GeForce RTX 3060 Laptop GPU
Driver 531.79
11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz
32 GB RAM
Plenty of storage space
I only have very basic knowledge of Lightroom so it might be an obvious setting somewhere, but if someone can help I'd be very grateful.
Images are .arw files
My turn... I'll need help if it's an AMD Radeon 5500 firmware update that's needed. Any and all help appreciated.
Report info here: seeing these blocks today...
Lightroom Classic version: 13.2 [ 202402141005-bf1aeb84 ]
License: Creative Cloud
Language setting: en-US
Operating system: Mac OS 14
Version: 14.4.1 [23E224]
Application architecture: x64
Logical processor count: 16
Processor speed: 3.8GHz
SqLite Version: 3.36.0
Built-in memory: 40,960.0 MB
Dedicated GPU memory used by Lightroom: 7,461.2MB / 8,176.0MB (91%)
Real memory available to Lightroom: 40,960.0 MB
Real memory used by Lightroom: 8,438.3 MB (20.6%)
Virtual memory used by Lightroom: 56,301.4 MB
Memory cache size: 291.3MB
Internal Camera Raw version: 16.2 [ 1763 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1528MB / 20479MB (7%)
Camera Raw real memory: 1541MB / 40960MB (3%)
Cache1:
Preview1- RAM:22.0MB, VRAM:0.0MB, _ATR4800.CR2
Preview2- RAM:22.0MB, VRAM:0.0MB, _ATR4801.CR2
Preview3- RAM:22.0MB, VRAM:0.0MB, _ATR4797-Enhanced-NR.dng
NT- RAM:66.0MB, VRAM:0.0MB, Combined:66.0MB
Cache2:
final1- RAM:248.0MB, VRAM:0.0MB, _ATR4801.CR2
final2- RAM:257.0MB, VRAM:270.0MB, _ATR4800.CR2
final3- RAM:248.0MB, VRAM:0.0MB, _ATR4797-Enhanced-NR.dng
T- RAM:753.0MB, VRAM:270.0MB, Combined:1,023.0MB
Cache3:
m:291.3MB, n:112.8MB
U-main: 100.0MB
Standard Preview Size: 2048 pixels
Displays: 1) 5120x2880
Graphics Processor Info:
Metal: AMD Radeon Pro 5500 XT
Init State: GPU for Export supported by default
User Preference: Auto
Application folder: /Applications/Adobe Lightroom Classic
Library Path: /Users/robertcatania/Pictures/Lightroom/Lightroom Catalog-v13.lrcat
Settings Folder: /Users/robertcatania/Library/Application Support/Adobe/Lightroom
Installed Plugins:
1) AdobeStock
2) Aperture/iPhoto Importer Plug-in
3) Flickr
4) Nikon Tether Plugin
Config.lua flags: None
OS Version: macOS 14.4.1 (23E224)
Graphics Processor Info: Metal: AMD Radeon Pro 5500 XT - 8 GB
Operating system: Mac OS 14
Version: 14.4.1 [23E224]
Metal: AMD Radeon Pro 5500 XT
It appears that macOS 14.4.1 includes a buggy graphics driver for the AMD 5300- and 5500-series graphics processors. Many people have reported other symptoms with that combination, and the same bad driver might be causing your symptoms as well.
One workaround for those other users is restarting LR. Does that (at least temporarily) cause Denoise to run correctly?
Another workaround for those users is to set Preferences > Performance > Use Graphics Processor to Off. Unfortunately, that won't help you, because Adobe inexplicably refuses to allow users to disable LR's use of the GPU for AI commands like Denoise.
@sandyd40372345, please do the LR menu command Help > System Info and copy/paste the entire contents here so we can see exactly which versions of hardware and software LR thinks you're running and important LR options that are set.
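Not a substitute for LR's Help > System Info, but if it's quicker, the macOS version and GPU model can also be confirmed from Terminal. This is a sketch using standard macOS commands (`sw_vers`, `system_profiler`), guarded so it degrades gracefully on other systems:

```shell
# Quick check of the two details that matter for this bug:
# the macOS version and the graphics processor.
if command -v sw_vers >/dev/null 2>&1; then
  sw_vers                                   # macOS version, e.g. ProductVersion: 14.4.1
  system_profiler SPDisplaysDataType \
    | grep -E "Chipset Model|VRAM"          # GPU model and video memory
else
  echo "sw_vers not found; run this on macOS"
fi
```

On an affected machine this should show a 14.4.x ProductVersion together with an AMD Radeon Pro 5300/5500-series chipset.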
I am getting pink blocks! The following is my System info.
Lightroom Classic version: 13.3 [ 202405092057-40441e28 ]
License: Creative Cloud
Language setting: en-US
Operating system: Mac OS 14
Version: 14.5.0 [23F79]
Application architecture: x64
Logical processor count: 16
Processor speed: 3.8GHz
SqLite Version: 3.36.0
Built-in memory: 8,192.0 MB
Dedicated GPU memory used by Lightroom: 5,212.9MB / 8,176.0MB (63%)
Real memory available to Lightroom: 8,192.0 MB
Real memory used by Lightroom: 692.7 MB (8.4%)
Virtual memory used by Lightroom: 44,524.7 MB
Memory cache size: 311.8MB
Internal Camera Raw version: 16.3 [ 1863 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1388MB / 4095MB (33%)
Camera Raw real memory: 1418MB / 8192MB (17%)
Cache1:
Final1- RAM:352.0MB, VRAM:0.0MB, _I0A0753.CR3
Final2- RAM:355.0MB, VRAM:48.0MB, _I0A0753-Enhanced-NR-3.dng
Preview3- RAM:22.0MB, VRAM:0.0MB, _I0A0753-Enhanced-NR.dng
Final4- RAM:352.0MB, VRAM:0.0MB, _I0A0753-Enhanced-NR-2.dng
Preview5- RAM:22.0MB, VRAM:0.0MB, _I0A0751.CR3
Preview6- RAM:22.0MB, VRAM:0.0MB, _I0A0752.CR3
NT- RAM:1,125.0MB, VRAM:48.0MB, Combined:1,173.0MB
Cache2:
m:311.8MB, n:1,123.9MB
U-main: 76.0MB
Standard Preview Size: 5120 pixels
Displays: 1) 5120x2880
Graphics Processor Info:
Metal: AMD Radeon Pro 5500 XT
Init State: GPU for Export supported by default
User Preference: Auto
Application folder: /Applications/Adobe Lightroom Classic
Library Path: /Users/sharonfazio/Pictures/Lightroom/Lightroom Catalog-v13-3.lrcat
Settings Folder: /Users/sharonfazio/Library/Application Support/Adobe/Lightroom
Installed Plugins:
1) AdobeStock
2) Aperture/iPhoto Importer Plug-in
3) DxO PureRAW 3
4) DxO PureRAW 3 Importer
5) Flickr
6) Luminar Neo
7) Nikon Tether Plugin
8) Topaz Photo AI
Config.lua flags:
"Operating system: Mac OS 14
Version: 14.5.0 [23F79]
Metal: AMD Radeon Pro 5500 XT"
macOS 14.4.1 included buggy graphics drivers for the AMD Radeon 5000-series graphics processors. Adobe says they're working with Apple and AMD on the issue. Most people report images filled with "static", though there have also been reports of problems with the AI commands, including Denoise, which haven't been merged into that bug report.
Many say that restarting their computer makes the problem go away at least for a while. The other workaround, setting Preferences > Performance > Use Graphics Processor to Off, won't help you, since Adobe inexplicably refuses to have the AI commands obey that setting -- the commands always use the GPU regardless.
I’m having an unusual problem with Denoise in Lightroom Classic 13.3. It worked fine for me in earlier versions. I’m using an Apple iMac with the latest Apple operating system. The problem is that Denoise eliminates the noise in my photos, but it adds large square artifacts, some of solid colors, some filled with gibberish. This happens pretty consistently and makes it almost impossible for me to use Denoise. I would be happy to provide samples of the problem if that would help. I welcome suggestions.
If you have an Intel iMac with AMD graphics, this is a known issue. There are also problems on Windows.
@gcmacken, please do the LR menu command Help > System Info and copy/paste the entire contents here so we can see exactly which versions of hardware and software LR thinks you're running and important LR options that are set.
I am having the same artifact issue on an iMac. It is not consistent - sometimes it occurs, and sometimes it doesn't. Based on completely subjective observations, it appears to generate more artifacts as the noise reduction slider is increased. Restarting LR or the computer does not resolve the problem.
3.8 GHz 8-Core Intel Core i7
AMD Radeon Pro 5500 XT 8 GB
32 GB 2667 MHz DDR4
macOS 14.6.1 (23G93)
Any help in resolving this would be greatly appreciated.
"3.8 GHz 8-Core Intel Core i7
AMD Radeon Pro 5500 XT 8 GB
macOS 14.6.1 (23G93)
Restarting LR or the computer does not resolve the problem."
Apple released a buggy graphics driver for the AMD 5000-series GPUs in Mac OS 14.4.1. This causes display artifacts in normal editing and export and for some like you, problems with the AI commands.
Unfortunately, setting Preferences > Performance > Use Graphics Processor to Off helps with the display and editing artifacts but not with the AI commands, since Adobe in its wisdom refuses to provide an option to disable their use of the GPU.
You could try updating to macOS 15. Some have reported that this fixes issues with the 5000-series GPUs, while there are a couple of reports that it doesn't.
Not a fix as such, but it may help: I have found that the problem is less troublesome if I denoise in batches.
I may have a 5% failure rate on a batch of, say, 200 images (I shoot sports in low light).
Applying Denoise to single images has, to date, a 100% failure rate.
This has been the only fix that has somewhat worked for me.
Thus far I have:
Upgraded the OS (Sonoma 14.6.1).
Turned off the graphics card for LR.
Stood on one leg with my arm in the air.
Hopped on one foot and howled at the moon.
Burnt sage.
I don't think anything works, to be honest, and Adobe doesn't seem to care.
My new iMac arrives next week with an M4 chip, so hopefully the issue will have been addressed.
@nealethesnapper: "Upgraded the OS (Sonoma 14.6.1). Turned off the graphics card for LR."
Many others have reported that versions of Mac OS from 14.4.1 through the last version of 14 didn't change the misbehavior. But there have been a fair number of reports that Mac OS 15 solves the problems with the AMD 5000-series GPUs on older Intel Macs (and a couple of reports that it doesn't).
Setting Preferences > Performance > Use Graphics Processor to Off worked around the display and editing artifacts many people saw on those GPUs with Mac OS 14. But Adobe in its wisdom refuses to have the AI commands (masking, denoise, lens blur) obey the settings of Use Graphics Processor. So that won't help you.
> My new iMac arrives next week with an M4 chip, so hopefully the issue will have been addressed
This was really only an issue with AMD Radeon GPUs in Intel Macs. It is not an issue on any of the Apple Silicon machines.
I had the same issue. I went through a massive ordeal with Asus; they wouldn't fix it properly. Over months I tried downloading new GPU drivers, rolling back to old GPU drivers, reinstalling every driver from the ASUS website, applying new thermal paste twice, taking step-by-step videos of real-time artifacting from Blender renders, Lightroom crashes, and Photoshop Camera Raw crashes, adjusting Lightroom settings, resetting software preferences, performing a clean Windows 10 Pro install, adjusting color profiles, uploading numerous photos as proof to ASUS support, sending the laptop to the repair center (which returned a false diagnosis, no help), and applying new thermal paste a third time... nothing worked. The GPU or CPU overheated while rendering 3D files for an animation. The laptop works, but I cannot reliably produce graphics content without worrying about crashes or destroyed images. The laptop is advertised as highly heat tolerant. Nope!
Asus W73G5T XH99
NVIDIA Quadro RTX 5000 16 GB
Intel Xeon E-2276M @ 2.80 GHz (up to 4.7 GHz)
64 GB RAM
4 TB SSD
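For anyone debugging a similar NVIDIA-on-Windows setup, it can help to separate overheating from driver-level artifact bugs before repasting or reinstalling anything. A hedged sketch: `nvidia-smi` ships with the NVIDIA driver and reports the driver version and live GPU temperature (the guard below assumes a POSIX-style shell such as Git Bash or WSL on Windows; it also works on Linux):

```shell
# Report GPU name, installed driver version, and current temperature.
# If the temperature is already high at idle, suspect cooling;
# if artifacts appear at normal temperatures, suspect the driver.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,driver_version,temperature.gpu \
             --format=csv,noheader
else
  echo "nvidia-smi not found; NVIDIA driver not installed"
fi
```

Running it again under load (e.g. during a Blender render or a Denoise pass) shows whether artifacting correlates with the temperature climbing.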