Known Participant
February 12, 2019

P: Enhance Details causes cut out colored squares

  • February 12, 2019
  • 92 replies
  • 1670 views

I was trying out the new Enhance Details function in Lightroom 8.2. It works well on some images, but on others it creates the new file and then totally messes it up: it cuts a square out of one part of the image, and green, blue and magenta squares of the same size are placed randomly in other parts of the image.

This doesn't happen on all files, and it isn't consistent as to which files: of two shots taken minutes apart with the same camera and lens, one won't process correctly but the other will. If it messes up on a file, it always messes up in the same way if you try again. It doesn't matter whether GPU processing is enabled or not.


Example showing the effect: http://www.jordansteele.com/2019/enhanced.jpg

System Info:


Lightroom Classic version: 8.2 [ 1204643 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Business Edition
Version: 10.0.17763
Application architecture: x64
System architecture: x64
Logical processor count: 8
Processor speed: 4.0 GHz
Built-in memory: 32726.0 MB
Real memory available to Lightroom: 32726.0 MB
Real memory used by Lightroom: 824.8 MB (2.5%)
Virtual memory used by Lightroom: 1044.2 MB
GDI objects count: 765
USER objects count: 2833
Process handles count: 1906
Memory cache size: 117.5MB
Internal Camera Raw version: 11.2 [ 134 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 158MB / 16363MB (0%)
Camera Raw real memory: 158MB / 32726MB (0%)
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Displays: 1) 2560x1440, 2) 1920x1200
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No

Graphics Processor Info:
DirectX: NVIDIA GeForce RTX 2070 (25.21.14.1881)

This topic has been closed for replies.

92 replies

Participating Frequently
April 12, 2019
I did update to 10.14.4.  The conversion time dropped to 24 seconds.  Now I just need to be convinced it is actually better; so far it's way too subtle for me.  Curious about when to apply it - before sharpening, or after?
Adobe Employee
April 8, 2019
You would need to update to the latest macOS 10.14.4 to give it a try. 
Participating Frequently
April 7, 2019
Interesting.  The bug is in Mojave, but I'm running High Sierra.  Two of the three machines I have access to - all running High Sierra - apparently have the bug, but the third, an iMac, doesn't.  Entering the Twilight Zone here. 
johnrellis
Legend
April 7, 2019
"Which seems to say they 'see' it, just refuse to use it."

This could be a Mac OS bug, not a LR bug.  See this discussion: 
https://forums.adobe.com/message/11012837#11012837
Participating Frequently
April 7, 2019
Yes - that confirms the GPU is NOT being used. Here's what I'm seeing about the GPU under Help > System Info:
glgpu[0].GLVersion="2.1"
glgpu[0].IsIntegratedGLGPU=0
glgpu[0].GLMemoryMB=8192
glgpu[0].GLName="AMD Radeon RX 580 OpenGL Engine"
glgpu[0].GLVendor="ATI Technologies Inc."
glgpu[0].GLVendorID=4098
glgpu[0].GLRectTextureSize=16384
glgpu[0].GLRenderer="AMD Radeon RX 580 OpenGL Engine"
glgpu[0].GLRendererID=16915464

Which seems to say they 'see' it, just refuse to use it. 
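[Editor's note: the glgpu[0].* lines pasted above are plain key=value text, so they can be pulled apart with a short script when comparing machines. This is a minimal sketch; `parse_glgpu` is a hypothetical helper for reading the log excerpt, not part of Lightroom or any Adobe API.]

```python
import re

# Excerpt of the System Info text quoted in the post above.
SYSTEM_INFO = """\
glgpu[0].GLVersion="2.1"
glgpu[0].IsIntegratedGLGPU=0
glgpu[0].GLMemoryMB=8192
glgpu[0].GLName="AMD Radeon RX 580 OpenGL Engine"
"""

def parse_glgpu(text):
    """Collect glgpu[0].* key/value pairs; strip quotes, convert integers."""
    info = {}
    for key, value in re.findall(r'glgpu\[0\]\.(\w+)=("[^"]*"|\S+)', text):
        value = value.strip('"')
        info[key] = int(value) if value.lstrip('-').isdigit() else value
    return info

gpu = parse_glgpu(SYSTEM_INFO)
print(gpu["GLName"])     # AMD Radeon RX 580 OpenGL Engine
print(gpu["GLVersion"])  # 2.1
```

With the fields in a dict, it's easy to spot that the card is detected (GLName, 8 GB of VRAM) while the reported GL version is only 2.1 - consistent with "they 'see' it, just refuse to use it."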
johnrellis
Legend
April 7, 2019
"So, I conclude Adobe doesn't use my GPU at all. "

Sounds like it.  You can easily verify that by starting Activity Monitor and doing the menu command Window > GPU History.
Participating Frequently
April 7, 2019
An update: I disabled Use GPU in ACR and ran Enhance Details - it took 5:36.  Then I enabled the GPU and ran Enhance Details on the same image - it also took 5:36.  So I conclude Adobe doesn't use my GPU at all, no matter what the preferences are.  Major bummer. 
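[Editor's note: the reasoning in this post - identical wall-clock times with the GPU on and off imply the setting has no effect - can be sketched in a few lines. The "M:SS" timings are from the post itself; the `to_seconds` helper is hypothetical, for illustration only.]

```python
def to_seconds(mmss):
    """Convert an 'M:SS' timing string (as reported in the post) to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

gpu_off = to_seconds("5:36")  # Enhance Details with Use GPU disabled
gpu_on = to_seconds("5:36")   # same image, Use GPU enabled

# Identical times (well within run-to-run noise) suggest the GPU is idle.
print(gpu_off, gpu_on, gpu_off == gpu_on)  # 336 336 True
```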
Participating Frequently
April 6, 2019
When I used a Mac mini (2014) with an i5 processor and 16GB of RAM, it took 4 minutes to process an Enhance Details DNG file.
That same file, processed on a MacBook Pro (2018) with a Radeon 560 GPU and 16GB of RAM, took only 8 seconds!!
I am convinced that a fast GPU makes all the difference.  The folks at Lightroom recommend a dedicated GPU for this process.
george
Participating Frequently
April 6, 2019

I updated my Mac Pro with a Sapphire Pulse Radeon RX 580 GPU. System Info reports:  Metal: Supported, feature set macOS GPUFamily1 v3. The Geekbench performance is significantly better (~10x) than with the original card.  I'm running 10.13.6.  


Enhance Details still fails with colored squares all over the image.  And it takes a long time - on the order of 3 or 4 minutes.  


I have access to a 2017 iMac with a "Radeon Pro 580" graphics card.  It's running 10.13.6 also.   Its Geekbench score is now slightly lower than my Mac Pro's.  Here's the deal, however: the iMac runs Enhance Details in 30 seconds or so with no squares.  (We can talk later about whether I can see a significant difference.)


One idea is that CPU speed matters a lot.  My Mac Pro is a 2.8 GHz quad-core Intel Xeon while the iMac is a 3.8 GHz Intel Core i5.  Whatever the problem, I can't see that Lightroom or ACR is using the GPU at all.  I have 'Use GPU' checked, and Adobe "System Info" reports:

Graphics Processor Info: 

Metal: AMD Radeon RX 580


Any ideas how to fix this?  As for Mojave - I saw something that said it requires "Metal 2".  True? 
Adobe Employee
March 27, 2019
Please follow https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html Solution #4 to do a CLEAN driver install. If you have put any workarounds in place previously, undo them (no forcing to OpenGL).