I'm using Lightroom Classic 12.5 and have an NVIDIA GTX1070 graphics card with the latest driver (537.42) on Windows 11 64-Bit Version 22H2 (Build 22621.2283). The NVIDIA card is the only graphics card in the system.
With GPU acceleration turned on in Lightroom Classic, zooming and panning the image causes huge delays (up to 10 seconds for a simple pan operation). Sometimes images are displayed horizontally stretched. Sometimes, after switching from Library to Develop or from normal Develop into Crop mode, the viewer just stays gray and no image is displayed at all.
Turning off GPU acceleration fixes all of these issues, but obviously isn't very fast either (still better than with GPU acceleration on right now, at least 🙂 )
I followed the KB articles for a clean driver install, but so far nothing has helped.
Any ideas would be greatly appreciated. Thank you!
In LrC, in the System Info, what is shown for the GPU?
An example (possibly outdated):
Graphics Processor Info:
DirectX: NVIDIA GeForce GTX 1070 Ti (31.0.15.1748)
Init State: GPU for Export supported by default
User Preference: Auto
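As an aside, the DirectX driver number LrC reports (e.g. 31.0.15.1748 above) encodes the familiar NVIDIA version: take the last five digits and split them 3-and-2, so 31.0.15.1748 corresponds to driver 517.48. A minimal sketch of that conversion, assuming the usual four-part Windows driver string (the function name is just for illustration):

```python
def nvidia_version_from_windows(win_ver: str) -> str:
    """Convert a Windows driver version string (as shown in LrC's
    System Info, e.g. '31.0.15.1748') to NVIDIA's own version
    numbering (e.g. '517.48').

    Rule of thumb: concatenate the last two dotted fields, keep the
    last five digits, and split them as XXX.YY.
    """
    parts = win_ver.split(".")
    digits = (parts[-2] + parts[-1])[-5:]  # last five digits
    return f"{digits[:3]}.{digits[3:]}"

print(nvidia_version_from_windows("31.0.15.1748"))  # 517.48
print(nvidia_version_from_windows("31.0.15.3742"))  # 537.42
```

Handy for checking whether the driver LrC actually loaded matches the one you think you installed (537.42 shows up as 31.0.15.3742).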
"NVIDIA GTX1070 graphics card with the latest driver (537.42)"
That Studio driver is only three days old -- LR may be having problems with it. Try backing up to a Studio driver that's a couple months old:
https://www.nvidia.com/Download/Find.aspx?lang=en-us
To help others, let us know whether that helps.
I'll run some tests with older Studio and non-Studio drivers and get back on that. It's entirely possible there are issues there: we spent the last 3 months moving to a different country, and after all that I updated both my NVIDIA driver and LrC to whatever the latest versions were, and then the issues started ^.^
Ok, tried it with NVIDIA driver 536.40 today and it works fine... so it seems NVIDIA changed something in the last few releases that somehow doesn't work well with LrC.
Good to know. This has happened a few times with both Nvidia and AMD drivers. LR's use of the GPU for AI computations is relatively new for desktop apps and exercises the GPU very differently than the traditional graphics workloads of games and video editing, and LR repeatedly trips over bugs in the drivers for specific chipsets.