Lightroom Classic auto-updated to 13.5 and since the update, my AMD Radeon RX6600XT no longer works properly.
The application opens fine, it works for a few images, then some glitching starts to appear and gets progressively worse as I go through the filmstrip, and renders the application unusable as it ends up looking like modern posterized glitch art. The images are unrecognizable.
Anyone else having the same issue? It goes away after I turn the GPU off but obviously that's a temporary solution as it's a whole lot more sluggish that way.
The rollback to 13.4 seems to have solved the issue but it's only a matter of time until I have to update again.
Attaching 2 files to demonstrate the issue.
Undoubtedly related to the problem Mac users with AMD Radeon cards have experienced since the last version: all kinds of visual glitches, with no fix except disabling the GPU for everything but image display.
Yes, it seems that keeping it on "display" doesn't cause the glitches, but I have had complete freezes since posting here, where everything just stops and the application becomes unresponsive. It doesn't say that it's unresponsive; it just behaves as if you are clicking into empty space. Had to force-quit through Task Manager, 3 times. Weirdly, after it happened the last time, I have been working successfully for hours. I kept the feature on just to see if it would persist.
"The rollback to 13.4 seems to have solved the issue but it's only a matter of time until I have to update again."
It's good you were able to avoid the issue by rolling back the Lightroom Classic version. It's good practice to turn off Auto Updates. You don't necessarily have to update Lightroom Classic until you are sure the problem has been fixed, and you can find that out by checking this forum.
I don't use auto-update for Lightroom; however, I recently reinstalled my entire OS, and upon installing Creative Cloud the application reset that setting, so my guess is that auto-update = ON is the default option. This didn't use to be the case, at least not that I remember, so it didn't occur to me to manually uncheck it.
"By visiting this forum you will know when you won't have issues with new versions."
Not that you are wrong for saying this, but if I have to preemptively visit forums to look for potential technical issues, that's just a terrible user experience.
I would rarely ever update at all if not for some of the recent features that were worth updating for, namely AI Denoise and AI Remove. Other than that, I would be fine with versions from years ago.
Not updating is one thing for myself; however, I work as an editor, and some updates have brought new catalog versions with them as well. If my client updates and that creates a new catalog version, I can't simply opt out of updating.
Ensure you have the most recent graphics driver by going directly to the manufacturer's web site:
https://helpx.adobe.com/lightroom-classic/kb/troubleshoot-gpu.html#solution-4
Then check the AMD Adrenalin utility. It has a nasty habit of assigning an "optimized" profile to LR, which can create havoc -- disable any assigned profile:
If that doesn't help, please copy/paste here the entire contents of the LR menu command Help > System Info -- that will let us see exactly which versions of hardware and software LR thinks you are running.