jm345ps
Known Participant
January 1, 2024
Question

Lens Blur not working in LrC and Disables option to use Graphics Processor

  • January 1, 2024
  • 1 reply
  • 3531 views

Initially, the option at Preferences > Performance > Use Graphics Processor is set to Auto.

 

Working on images in the Develop Module is OK, with the exception of Lens Blur.

 

When I try using Lens Blur, I get a "Something went wrong" error message.

 

LrC then disables the option at Preferences > Performance > Use Graphics Processor, which slows all other LrC processing, especially in the Develop Module.

 

My System Info BEFORE using Lens Blur is as follows:

Lightroom Classic version: 13.1 [ 202312111226-41a494e8 ]

License: Creative Cloud

Language setting: en

Operating system: Windows 11 - Home Premium Edition

Version: 11.0.22631

Application architecture: x64

System architecture: x64

Logical processor count: 8

Processor speed: 2.3GHz

SqLite Version: 3.36.0

CPU Utilisation: 2.0%

Built-in memory: 16133.8 MB

Dedicated GPU memory used by Lightroom: 217.4MB / 1982.4MB (10%)

Real memory available to Lightroom: 16133.8 MB

Real memory used by Lightroom: 942.9 MB (5.8%)

Virtual memory used by Lightroom: 1262.1 MB

GDI objects count: 852

USER objects count: 2701

Process handles count: 2423

Memory cache size: 3.1MB

Internal Camera Raw version: 16.1 [ 1728 ]

Maximum thread count used by Camera Raw: 5

Camera Raw SIMD optimization: SSE2,AVX,AVX2

Camera Raw virtual memory: 62MB / 8066MB (0%)

Camera Raw real memory: 68MB / 16133MB (0%)

System DPI setting: 120 DPI

Desktop composition enabled: Yes

Standard Preview Size: 1440 pixels

Displays: 1) 1920x1080

Input types: Multitouch: Yes, Integrated touch: Yes, Integrated pen: No, External touch: No, External pen: No, Keyboard: Yes

Graphics Processor Info:

DirectX: NVIDIA GeForce MX250 (31.0.15.4633)

Init State: GPU for Image Processing supported by default

User Preference: Auto

 

This is my System Information showing my Graphics Processor Info AFTER trying to use Lens Blur:

Lightroom Classic version: 13.1 [ 202312111226-41a494e8 ]

License: Creative Cloud

Language setting: en

Operating system: Windows 11 - Home Premium Edition

Version: 11.0.22631

Application architecture: x64

System architecture: x64

Logical processor count: 8

Processor speed: 2.3GHz

SqLite Version: 3.36.0

CPU Utilisation: 6.0%

Built-in memory: 16133.8 MB

Dedicated GPU memory used by Lightroom: 2395.9MB / 1982.4MB (120%)

Real memory available to Lightroom: 16133.8 MB

Real memory used by Lightroom: 3512.7 MB (21.7%)

Virtual memory used by Lightroom: 3976.7 MB

GDI objects count: 956

USER objects count: 2692

Process handles count: 2499

Memory cache size: 3.1MB

Internal Camera Raw version: 16.1 [ 1728 ]

Maximum thread count used by Camera Raw: 5

Camera Raw SIMD optimization: SSE2,AVX,AVX2

Camera Raw virtual memory: 72MB / 8066MB (0%)

Camera Raw real memory: 83MB / 16133MB (0%)

System DPI setting: 120 DPI

Desktop composition enabled: Yes

Standard Preview Size: 1440 pixels

Displays: 1) 1920x1080

Input types: Multitouch: Yes, Integrated touch: Yes, Integrated pen: No, External touch: No, External pen: No, Keyboard: Yes

Graphics Processor Info:

DirectX: NVIDIA GeForce MX250 (31.0.15.4633)

Init State: GPU disconnected

User Preference: Auto

 

My catalog is on an SSD with 31% free space.

 

My Camera Raw cache is set to 20 GB; it is on drive C.

As you can see, my GPU driver is v546.33, and per NVIDIA that is the latest as of 1 January 2024. I have only 2 GB of VRAM.
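
For what it's worth, here is a quick Python sketch, purely illustrative and using only the numbers from the two System Info dumps above, showing how far the Lens Blur attempt pushed Lightroom past the MX250's dedicated VRAM:

```python
# Illustrative only: the GPU memory figures are copied from the two
# System Info dumps above (before and after trying Lens Blur).
DEDICATED_VRAM_MB = 1982.4   # MX250 dedicated GPU memory as reported by LrC

# "Dedicated GPU memory used by Lightroom" readings
readings_mb = {"before": 217.4, "after": 2395.9}

for label, used in readings_mb.items():
    pct = int(used / DEDICATED_VRAM_MB * 100)    # rounded down, as in the dumps
    over = max(0.0, used - DEDICATED_VRAM_MB)    # amount beyond dedicated VRAM
    print(f"{label}: {used:.1f} MB of {DEDICATED_VRAM_MB} MB ({pct}%), over by {over:.1f} MB")
```

The jump from about 10% to 120% of the card's dedicated memory (roughly 413 MB over) is the part I find suspicious.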

My camera is a Sony A7R5, which produces 60 MB RAW files.

 

Inquiry: Why is Lens Blur not working, and why does LrC disable the use of the Graphics Processor when I try to use Lens Blur? Is there another discussion, or perhaps a bug posting, on this? Is there an Adobe document on this? Help.

 

1 reply

johnrellis
Legend
January 2, 2024

The "Something went wrong" error occurs when LR gets an unexpected result from the GPU and usually indicates it's tripping over a bug in the graphics driver. LR disables the use of the GPU to prevent the error from recurring.

 

"Dedicated GPU memory used by Lightroom: 217.4MB / 1982.4MB (10%)"

"DirectX: NVIDIA GeForce MX250 (31.0.15.4633)"

 

You've got the minimum amount of graphics memory specified in the LR system requirements:

https://helpx.adobe.com/lightroom-classic/system-requirements.html 

 

It may be that Lens Blur needs more than that.

 

Alternatively, there may indeed be a bug in the 546.33 Studio driver (just because it's "the latest" doesn't mean it's bug-free on all GPUs).  Infrequently, people have reported that installing older Studio drivers (e.g. 3, 6, or 9 months old) avoids such problems:

https://www.nvidia.com/Download/Find.aspx?lang=en-us 

jm345ps (Author)
Known Participant
January 2, 2024

Thanks for the reply. Out of curiosity, are there other LrC functions that will not work with the minimum GPU requirements?

Since Lens Blur is an early-release feature, maybe Adobe can tweak it to work with the minimum GPU requirements, if that is the issue.

Unfortunately, I do not see a Studio driver for the GeForce MX250, only Game Ready drivers. I suppose I could try rolling back to the previous Game Ready driver to see if that makes a difference.

It would be great if Adobe could identify the cause of the problem and let us know if there is a likely solution. Thanks.

johnrellis
Legend
January 2, 2024

People masking was well known to have problems with less than the minimum amount of graphics memory on Intel GPUs in older Mac laptops, which was the original spur for Adobe to publish that requirement. Based on years of reports here, I think the requirement is "soft" in that whether a problem occurs depends heavily on the GPU, the driver, the version of LR / Camera Raw, and the image.

 

Regarding drivers, I looked into the difference between Nvidia Studio and Game Ready drivers last year. It appears that there's a single sequence of driver versions. Game Ready versions are released frequently, several times a month on average. Once a month Nvidia claims to do extra testing with the latest version on some GPUs with some creative apps, and that version gets released as both a Studio and a Game Ready driver. So you could look at the older Studio drivers for a more recent GPU, write down the version numbers of a few, and then verify that you can download that same driver for the MX250 (which will show as a "Game Ready" driver).