Inspiring
October 20, 2023
Question

'Use Graphics Processor' is set to 'Off', but the 'Enhance' feature ignores this setting!

  • October 20, 2023
  • 4 replies
  • 3207 views

Hello,

In Lightroom Classic 13.0.1 (on Windows 10 V22H2, 64-bit):

  1. Go to 'Preferences -> Performance' and set the 'Use Graphics Processor' option to 'Off'.
  2. Now select a RAW picture in the LrC catalog and press 'Ctrl + Alt + I'.
  3. The 'Enhance Preview' window appears; now select 'Denoise' and press the 'Enhance' button.
  4. Watch the Windows Task Manager: the discrete GPU 1 is used (100 % load on the '3D' graph). Expected: the CPU's built-in GPU 0 is used.

The result: My laptop ends up with a BSOD (blue screen of death) because GPU 1 is defective.

 

Addendum: I reported this as a 'Bug' because it is an obvious bug; now it is listed under 'Discussions'?!

This topic has been closed for replies.

4 replies

.Michael. (Author)
Inspiring
December 23, 2023

There was a dGPU fix; I haven't tried whether it also works for this same bug.

.Michael. (Author)
Inspiring
October 20, 2023

My current workaround: Assign the CPU-internal GPU to LrC via the Windows 10 system settings; see picture (in German). Result: LrC uses the CPU and ignores the iGPU ... (as intended, the dGPU is also no longer in use).
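
As a side note, the per-app GPU choice this workaround makes in the Windows 10 settings UI is stored under the documented registry key HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so it can also be applied as a .reg file. This is only a sketch: the Lightroom install path below is an assumption (verify it on your machine), and GpuPreference=1 selects the power-saving GPU (usually the iGPU) while 2 selects the high-performance dGPU:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
; Value name = full path to the application's exe (assumed default install path).
"C:\\Program Files\\Adobe\\Adobe Lightroom Classic\\Lightroom.exe"="GpuPreference=1;"
```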

johnrellis
Legend
October 22, 2023

LR 13 finally allows you to disable its use of the graphics processor for the various AI commands (masking, enhance, lens blur):

https://helpx.adobe.com/camera-raw/kb/acr-gpu-faq.html#lens-blur

 

In my testing on my Mac, however, it only disables the use of the GPU for masking, not enhance or lens blur.  It will be good to see if it works for you.

.Michael. (Author)
Inspiring
October 22, 2023

Thank you, johnrellis. Copied from your link, this chapter:

Troubleshoot GPU driver issues while using AI-powered features like Lens Blur

describes the required steps. The file 'DisableGPUInference.txt' has to be copied to a specific location, and then all dGPU use should be off. I do not see a benefit over my workaround above, i.e. I keep my "solution".
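
For reference, the flag-file step can be scripted. This is a sketch only: the destination folder below is an assumption, and the linked Adobe FAQ gives the authoritative location (and the file itself) for your OS and version:

```shell
# Sketch only -- the destination folder is an assumption; the Adobe FAQ
# linked above gives the authoritative location for your OS and version.
CR_SETTINGS="$HOME/AppData/Roaming/Adobe/CameraRaw"
mkdir -p "$CR_SETTINGS"
# Place the flag file here; if in doubt, use the file provided by the FAQ
# instead of creating an empty one.
touch "$CR_SETTINGS/DisableGPUInference.txt"
```

Restart LrC afterwards so the setting is picked up.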

GoldingD
Legend
October 20, 2023

In LrC, click Help, then System Information, copy that, and paste it into a reply. Mind you, I am interested in just two parts of the info; could you trim the reply down so as not to create a lot of scroll space?

 

Interested in the info as follows, and mostly interested in that shown in bold red:

 

 

Lightroom Classic version: 13.0.1

License: Creative Cloud

Language setting: en

Operating system: Windows 11 - Home Premium Edition

Version: 11.0.22621

Application architecture: x64

System architecture: x64

Logical processor count: 32

Processor speed: 2.9GHz

SqLite Version: 3.36.0

CPU Utilisation: 0.0%

Built-in memory: 65243.7 MB

Dedicated GPU memory used by Lightroom: 109.2MB / 8032.0MB (1%)

Real memory available to Lightroom: 65243.7 MB

Real memory used by Lightroom: 4623.3 MB (7.0%)

Virtual memory used by Lightroom: 5404.9 MB

GDI objects count: 860

USER objects count: 3146

Process handles count: 2538

Memory cache size: 0.0MB

Internal Camera Raw version: 16.0 [ 1677 ]

Maximum thread count used by Camera Raw: 5

Camera Raw SIMD optimization: SSE2,AVX,AVX2

Camera Raw virtual memory: 706MB / 32621MB (2%)

Camera Raw real memory: 710MB / 65243MB (1%)

System DPI setting: 144 DPI (high DPI mode)

Desktop composition enabled: Yes

Standard Preview Size: 1440 pixels

Displays: 1) 3840x2160

Input types: Multitouch: No, Integrated touch: No, Integrated pen: Yes, External touch: No, External pen: No, Keyboard: No

 

Graphics Processor Info:

DirectX: NVIDIA GeForce RTX 3070 (31.0.15.3758)

Init State: GPU for Export supported by default

User Preference: GPU for Export enabled

 

.Michael. (Author)
Inspiring
October 20, 2023

@GoldingD I'm 99 % sure that this has nothing to do with my report, but here you go:

Lightroom Classic version: 13.0.1 [ 202310121438-d2af310c ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Business Edition
Version: 10.0.19045

...

Dedicated GPU memory used by Lightroom: 0,0MB / 0,0MB (0%)

...

Graphics Processor Info:

Init State: GPU Unsupported
User Preference: Auto

Application folder: 

Legend
October 20, 2023

It is my understanding that AI Denoise in Lightroom Classic uses the GPU regardless of the settings in Preferences. So if my understanding is correct, it's not a bug; it's working as it was designed, and so there is nothing to fix.

.Michael. (Author)
Inspiring
October 20, 2023

If there is an application-global setting (Use Graphics Processor = Off) which is partly ignored by a sub-feature of the application, then this is a software-design bug in the sub-feature.

GoldingD
Legend
October 20, 2023
quote

If there is an application-global setting (Use Graphics Processor = Off) which is partly ignored by a sub-feature of the application, then this is a software-design bug in the sub-feature.


By @.Michael.

 

I am sure that if an Adobe Tech replies, they will state this is as designed, and as such, not a bug. Anything they design poorly is not a bug in their minds.

 

Now, I assume (though I have not gone looking for it yet) that an Idea has been posted to correct that, the Idea being to turn GPU use on/off for the Enhance features. Perhaps another member will reply.