'Use Graphics Processor' is set to 'Off', but the 'Enhance' feature ignores this software setting!
Hello,
in Lightroom Classic 13.0.1 (on Windows 10 version 22H2, 64-bit):
- Go to 'Preferences -> Performance' and set the 'Use Graphics Processor' option to 'Off'.
- Now select a raw photo in the LrC catalog and press 'Ctrl + Alt + I'.
- The 'Enhance Preview' window appears; select 'Denoise' and press the 'Enhance' button.
- Watch the Windows Task Manager: the discrete GPU 1 is used (100% load on the '3D' graph). Expected: the CPU's built-in GPU 0 is used. (A way to check this without Task Manager is sketched below.)
The result: my laptop ends up with a BSOD (blue screen of death) because GPU 1 is defective.
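For anyone who wants to double-check which adapter is doing the work without staring at Task Manager, here is a rough, untested Python sketch. It is only my assumption that the stock Windows typeperf tool and the 'GPU Engine' 3D utilization counters are available on your machine; the counter instance names encode the owning process id and the adapter LUID, so a loaded instance shows which physical GPU is busy while 'Enhance' runs:

# Sketch only: sample the Windows "GPU Engine" 3D utilization counters via typeperf
# while Enhance is running. Assumes a stock Windows 10+ install; counter and instance
# names can differ per driver, so treat this as a rough check, not a diagnostic tool.
import csv
import io
import subprocess

COUNTER = r"\GPU Engine(*engtype_3D)\Utilization Percentage"

def sample_gpu_3d_load(samples: int = 5) -> None:
    """Print 3D-engine utilization per GPU engine instance for a few samples."""
    out = subprocess.run(
        ["typeperf", COUNTER, "-sc", str(samples)],
        capture_output=True, text=True, check=True,
    ).stdout
    # typeperf emits CSV: a header row of counter paths, then one row per sample.
    rows = [r for r in csv.reader(io.StringIO(out)) if len(r) > 1]
    header, data = rows[0], rows[1:]
    for row in data:
        # Column 0 is the timestamp; the remaining columns line up with the header's
        # instance names, which encode the owning process id and the adapter LUID.
        for i, value in enumerate(row[1:], start=1):
            if not value.strip():
                continue
            load = float(value.replace(",", "."))  # tolerate German decimal commas
            if load > 1.0:
                print(f"{load:5.1f}%  {header[i]}")

if __name__ == "__main__":
    sample_gpu_3d_load()

Run it in a second console while the Enhance progress bar is visible; the instances that spike should belong to the adapter actually doing the inference.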
Addendum: I reported this as a 'Bug' because it is an obvious bug, yet now it is listed under 'Discussions'?!
It is my understanding that AI Denoise in Lightroom Classic uses the GPU regardless of the settings in Preferences. So if my understanding is correct, it's not a bug; it's working as it was designed, and so there is nothing to fix.
If there is an application-global setting (Use Graphics Processor = Off) which is partly ignored by a sub-feature of the application, then this is a software design bug of that sub-feature.
If there is an application-global setting (Use Graphics Processor = Off) which is partly ignored by a sub-feature of the application, then this is a software design bug of that sub-feature.
By .Michael.
I am sure that if an Adobe Tech replies, they will state this is as designed, and as such, not a bug. Anything they design poorly is not a bug in their minds.
Now, I assume (though I have not gone looking for it yet) that an Idea has been posted to correct that, the Idea being to turn GPU use on/off for the enhanced features. Perhaps another member will reply.
I am sure that if an Adobe Tech replies, they will state this is as designed, and as such, not a bug. Anything they design poorly is not a bug in their minds.
...
By GoldingD
Yes, I know this "It's not a bug, it's a feature" statement well.
If there is an application-global setting (Use Graphics Processor = Off) which is partly ignored by a sub-feature of the application, then this is a software design bug of that sub-feature.
By .Michael.
Design bugs (errors in the design) are not the same as bugs (errors in the program execution).
I don't know if you have tried this, but if you set the GPU acceleration option to Custom, some sub-options appear for choosing which specific kinds of processing to employ the GPU for, or not.
Perhaps leaving the GPU active but turning all of those options off may turn out OK. Stranger and even more unintuitive outcomes have been known!
I do agree that whenever GPU acceleration is apparently fully turned off, it's a reasonable user conclusion that it should be unused in all respects.
But the interrelation between the OS and the GPU is necessarily an intimate and complex one, and a defective GPU or an inappropriate driver for it may cause all sorts of bizarre mischief, which can often seem unrelated to graphics acceleration per se.
Your proposal
I don't know if you have tried this, but if you set the GPU acceleration option to Custom, some sub-options appear for choosing which specific kinds of processing to employ the GPU for, or not.
By richardplondon
is a nice idea! The problem: at the first start of LrC 13.0.1 I got an info message box saying that GPU acceleration is disabled and that I should enable it. I declined. Now, on the 'Performance' tab, the drop-down list of the 'Use Graphics Processor' option is grayed out, i.e. I have no way to try it.
In LrC, click on Help, then System Information, copy that, and paste it into a reply. Mind you, I am interested in just two parts of the info, so could you trim the reply down so as not to create a lot of scroll space.
Interested in the info as follows, and mostly interested in that shown in bold red:
Lightroom Classic version: 13.0.1
License: Creative Cloud
Language setting: en
Operating system: Windows 11 - Home Premium Edition
Version: 11.0.22621
Application architecture: x64
System architecture: x64
Logical processor count: 32
Processor speed: 2.9GHz
SqLite Version: 3.36.0
CPU Utilisation: 0.0%
Built-in memory: 65243.7 MB
Dedicated GPU memory used by Lightroom: 109.2MB / 8032.0MB (1%)
Real memory available to Lightroom: 65243.7 MB
Real memory used by Lightroom: 4623.3 MB (7.0%)
Virtual memory used by Lightroom: 5404.9 MB
GDI objects count: 860
USER objects count: 3146
Process handles count: 2538
Memory cache size: 0.0MB
Internal Camera Raw version: 16.0 [ 1677 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 706MB / 32621MB (2%)
Camera Raw real memory: 710MB / 65243MB (1%)
System DPI setting: 144 DPI (high DPI mode)
Desktop composition enabled: Yes
Standard Preview Size: 1440 pixels
Displays: 1) 3840x2160
Input types: Multitouch: No, Integrated touch: No, Integrated pen: Yes, External touch: No, External pen: No, Keyboard: No
Graphics Processor Info:
DirectX: NVIDIA GeForce RTX 3070 (31.0.15.3758)
Init State: GPU for Export supported by default
User Preference: GPU for Export enabled
@GoldingD I'm 99% sure that this has nothing to do with my report, but here you go:
Lightroom Classic version: 13.0.1 [ 202310121438-d2af310c ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Business Edition
Version: 10.0.19045
...
Dedicated GPU memory used by Lightroom: 0,0MB / 0,0MB (0%)
...
Graphics Processor Info:
Init State: GPU Unsupported
User Preference: Auto
Application folder:
My current workaround: assign the CPU's internal GPU to LrC via the Windows 10 graphics settings, see picture (in German). Result: LrC uses the CPU and ignores the iGPU ... (as planned, the dGPU is also no longer in use).
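For completeness, the same Windows 10 per-app setting can also be scripted. The untested sketch below writes the registry value that the graphics settings dialog shown in the picture maintains; the Lightroom.exe path is only my assumption for a default installation, so adjust it (and use at your own risk) before running, then restart LrC:

# Sketch of the same workaround done via the registry instead of the Settings UI.
# Windows stores the per-app GPU preference under
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences, keyed by the executable path,
# with "GpuPreference=1;" meaning power saving (usually the iGPU) and "2" meaning
# high performance (usually the dGPU).
import winreg

LRC_EXE = r"C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe"  # assumed default install path
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

def prefer_power_saving_gpu(exe_path: str = LRC_EXE) -> None:
    """Tell Windows to run the given executable on the power-saving (integrated) GPU."""
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=1;")

if __name__ == "__main__":
    prefer_power_saving_gpu()
    print("Set GpuPreference=1 (power saving GPU) for", LRC_EXE)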
LR 13 finally allows you to disable its use of the graphics processor for the various AI commands (masking, enhance, lens blur):
https://helpx.adobe.com/camera-raw/kb/acr-gpu-faq.html#lens-blur
In my testing on my Mac, however, it only disables the use of the GPU for masking, not enhance or lens blur. It will be good to see if it works for you.
Thank you johnrellis. Copied from your link, this chapter:
Troubleshoot GPU driver issues while using AI-powered features like Lens Blur
describes the required steps. The file 'DisableGPUInference.txt' has to be copied to a certain location, and then all dGPU use should be off. I do not see a benefit over my workaround from above, i.e. I keep my "solution".
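For other readers who do want to try Adobe's route instead: as far as I understand the linked article, the workaround amounts to placing a marker file named 'DisableGPUInference.txt' in a specific folder (its presence seems to be what matters). The untested sketch below leaves the destination folder as a placeholder; take the exact path for your OS from the article before running it:

# Sketch only: create the DisableGPUInference.txt marker file described in the
# Adobe help article linked above. The destination folder is a placeholder, not a
# real path -- look it up in the article for your OS.
from pathlib import Path

TARGET_DIR = Path(r"<folder from the Adobe help article>")  # placeholder, replace before running

def create_marker() -> Path:
    """Create the marker file (empty; as far as I can tell only its presence matters)."""
    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    marker = TARGET_DIR / "DisableGPUInference.txt"
    marker.touch(exist_ok=True)
    return marker

if __name__ == "__main__":
    print("Created", create_marker())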
Thanks, that's similar to my testing -- the DisableGPUInference.txt file disables the use of the GPU for AI masking but not Enhance, despite what the help article says.
That's a mistake: I did not try the Adobe workaround, because mine works 100% while the Adobe workaround may not work, or may only partly work. I.e. I have no reason to test it.
OK, thanks for the clarification. The advantage of Adobe's solution in general is that it allows LR to use the GPU for everything other than the AI commands: displaying images on larger displays (like yours), rendering develop settings faster as you edit, and exporting images faster.
Fine, it would help other users whose dGPU works fine for all features except the AI features. My laptop has a defective dGPU (see the first post), and I never mentioned my display properties.
There was a dGPU fix; I haven't tried whether it also works for this same bug.

