My Lightroom cannot turn on full GPU graphics acceleration. Currently only basic graphics acceleration is available, and "Use GPU to process images" cannot be checked. How can I turn it on?
Attach system information:
Lightroom Classic version: 11.2 [ 202201281441-a5b5f472 ]
Licensing: Creative Cloud
Language setting: zh_tw
Operating System: Windows 10 - Business Edition
Version: 10.0.19044
Application Architecture: x64
System Architecture: x64
Number of logical processors: 16
Processor Speed: 3.5GHz
SqLite version: 3.36.0
Built-in memory: 65479.8 MB
Physical memory available for Lightroom: 65479.8 MB
Physical memory used by Lightroom: 2828.8 MB (4.3%)
Virtual memory used by Lightroom: 3648.6 MB
GDI Object Count: 854
USER Object Count: 1988
Program Process Count: 3668
Memory Cache Size: 147.2MB
Internal Camera Raw version: 14.2 [ 1028 ]
Maximum number of threads used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1291MB / 32739MB (3%)
Camera Raw Physical Memory: 1458MB / 65479MB (2%)
System DPI setting: 96 DPI
Desktop composition enabled: yes
Display: 1) 2560x1440, 2) 2560x1440
Input Type: Multi-Touch: No, Integrated Touch: No, Integrated Stylus: No, External Touch: No, External Stylus: No, Keyboard: No
Graphics processor information:
DirectX: AMD Radeon RX 6600 XT (30.0.15002.1004)
Try updating to the latest Adrenalin Edition graphics driver 22.3.1
Thank you for your reply. I just confirmed that the AMD driver version is 22.3.1 (released 2022/03/09), but I still cannot enable full graphics acceleration.
Try performing a clean install of the Adrenalin Edition graphics driver 22.3.1 as outlined in the article below.
Thanks for your reply!
I did the following according to the website you provided:
1. Used DDU in safe mode to remove the previous NVIDIA graphics card driver (1050 Ti) and the AMD RX 6600 XT driver
2. Reinstalled the RX 6600 XT driver (version 22.3.1) using "AMD Software: Adrenalin Edition"
3. Lightroom still could not turn on full acceleration, so I decided to uninstall Lightroom and reinstall it
4. Opened Lightroom again, but I still cannot enable GPU acceleration for image processing
Thank you.
Yes. After disabling the GPU, Lightroom works without issues. But my AMD (RX) GPU is just eating the power and money on my table.
From my personal and our team's experience, AMD GPUs are not well supported in Lightroom, Photoshop, or video editing.
"NVIDIA graphics card driver (1050Ti)"
I did not see this GPU listed in the System Info you posted. This may be causing the issue. LrC can use two GPUs, but it does not increase performance. Which GPU are the two displays connected to? See the article below.
https://helpx.adobe.com/lightroom-classic/kb/lightroom-gpu-faq.html
Two things to try:
1) Use only the AMD RX 6600 XT, with both displays plugged into it and none into the Nvidia 1050 Ti.
2) If that works, do a clean install of the latest Nvidia 1050 Ti Studio driver, 511.65.
Sorry for the misunderstanding.
Before installing the AMD RX 6600 XT, I was using an NVIDIA 1050 Ti graphics card.
The description above only meant that I used DDU to clear all display drivers previously installed on this computer, and then followed the instructions on the website to install the latest version of the AMD graphics driver.
The computer has only one graphics card, the AMD RX 6600 XT.
Sorry again, and thank you.
What display models and cabling are you using?
The monitors I use: DELL UP2716D ×2.
Both monitors are connected to the DisplayPort outputs of the RX 6600 XT with DP cables.
Hi, sorry to just jump in like this, but I might have the same problem. The difference is, I'm using a 6900 XT. Checking the Camera Raw GPU logs, I found that my GPU seems to fail the accelerated render quality test.
This is the end of the log output:
Begin cr_gpu_test::RunSanity (quick = False, doing init = True).
Bounds of image A are: 1000 and 1000
Bounds of image B are: 1000 and 1000
*** Error: GPU failed accelerated render quality test ***
*** Error: RunSanity failed ***
*** GPU Error: GPU self tests failed ***
Windows cr_gpu_view backing scale: h= 1.2500, v= 1.2500
first frame index is: 0
GPU Init Status (part 2): I2_GPU2
GPU3 Hard Status Result (part 2): fail_direct_sanity_test
GPU3 Soft Status Result (part 2): success
TrimFootprint: 1455.2 MB purged from pools.
Terminate GPU system
Dx Device Shutdown: AMD Radeon RX 6900 XT
Resource count and estimated mem use:
cr_thread: 0 ( 0.000 MB)
cr_context: 0 ( 0.000 MB)
cr_negative: 0 ( 0.000 MB)
cr_image: 0 ( 0.000 MB)
cr_image(c): 0 ( 0.000 MB)
buffers: 1 ( 0.003 MB)
vm: 0 / 0 MB
cr_sdk_terminate
Does this problem still persist for you? If so, mind checking the logs in
C:\Users\*your pc username here*\AppData\Roaming\Adobe\CameraRaw\Logs
to see if you have the same error?
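If there are several log files in that folder, a small script can scan them all for the failure markers at once. A minimal Python sketch, assuming the logs are plain text files in the folder above (the filename pattern is my guess, not something Adobe documents):

```python
# Sketch: scan the Camera Raw log folder for the GPU self-test failure lines
# quoted above. The folder path is from the post; treating the logs as plain
# UTF-8 text files is an assumption.
import glob
import os

ERROR_MARKERS = (
    "GPU failed accelerated render quality test",
    "RunSanity failed",
    "GPU self tests failed",
)

def find_gpu_errors(text):
    """Return the log lines that contain any of the known failure markers."""
    return [line for line in text.splitlines()
            if any(marker in line for marker in ERROR_MARKERS)]

if __name__ == "__main__":
    log_dir = os.path.expandvars(r"%APPDATA%\Adobe\CameraRaw\Logs")
    for path in glob.glob(os.path.join(log_dir, "*.t*xt")):  # .txt and .text
        with open(path, encoding="utf-8", errors="replace") as f:
            hits = find_gpu_errors(f.read())
        if hits:
            print(path)
            for line in hits:
                print("  ", line.strip())
```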
Make sure to also check the Lightroom Classic Log Latest v1.text as shown below for my GPU.
I'm pretty sure it's also in the part that I posted. I should have clarified that the attached log is indeed from the "Lightroom Classic Log Latest v1.txt", unless you are talking about something else.
After cr_sdk_terminate, the file ends.
Hi. Have you found a solution yet? I'm encountering this issue with my RX 6600 too.
I changed the self-test value from false to true in the config file, and somehow it just worked. I guess it's just a weird AMD compatibility issue.
Hi,
Can you please share which file you edited, and how?
C:\Users\<yourUserName>\AppData\Roaming\Adobe\CameraRaw\GPU\Adobe Photoshop Lightroom Classic
I set crs:gpu_compute_quick_self_test_passed from "False" to "True", saved the file, and I can now select full acceleration.
I still need to test whether it really utilizes the GPU, though.
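For anyone else trying this, here is a rough sketch of the same edit done with a backup copy first. The key name is the one from above; the assumption that the file stores it as an attribute-style key="value" pair is mine, so inspect your own file before running anything like this:

```python
# Sketch: flip crs:gpu_compute_quick_self_test_passed from "False" to "True",
# keeping a backup of the original file. The key="value" layout is an
# assumption about the config format -- check your own file first.
import os
import re
import shutil

KEY = "crs:gpu_compute_quick_self_test_passed"

def flip_self_test(text):
    """Replace a "False" value of the self-test key with "True"."""
    pattern = re.compile(re.escape(KEY) + r'(\s*=\s*)"False"')
    return pattern.sub(KEY + r'\g<1>"True"', text)

if __name__ == "__main__":
    path = os.path.expandvars(
        r"%APPDATA%\Adobe\CameraRaw\GPU\Adobe Photoshop Lightroom Classic")
    if os.path.isfile(path):
        shutil.copy2(path, path + ".bak")  # back up before editing
        with open(path, encoding="utf-8") as f:
            text = f.read()
        with open(path, "w", encoding="utf-8") as f:
            f.write(flip_self_test(text))
```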
Hi guys,
I apparently experienced the same issue; here is my "version" of it, briefly. I bought an ASUS Radeon RX 6600 Dual GPU to be able to leverage GPU processing in LrC/24. After the initial setup (letting Windows Update / the AMD utility install the latest driver, 32.0.12011.1036), everything worked fine. After two days of fun and joy with the new capability, it suddenly stopped working (Lightroom sometimes froze while processing images before, but always "recovered").
I tried everything written above, but my config file seemed to be okay: the self_test_passed value was "True". However, I noticed the gpu_preferred_system field was empty (""). So I simply copied the name of the GPU (from the log) into this line, and it started working again.
Just to mention, I'm also using my onboard graphics (some older Intel chip), to which my first two displays (iiyama) connect, and I connected my 4K SONY TV to this Radeon card. I use Windows 10 (a corporate installation, where I can't control all the updates manually).
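The gpu_preferred_system fix described above can be sketched the same way as the self-test edit. The attribute-style key="" layout is an assumption about the config format, and the GPU name is just the example from this thread; verify against your own file before changing anything:

```python
# Sketch: fill an empty crs:gpu_preferred_system value with the GPU name
# taken from the log. The key="" layout is an assumed config format.
import re

KEY = "crs:gpu_preferred_system"

def set_preferred_gpu(text, gpu_name):
    """Fill an empty preferred-GPU value with gpu_name; leave non-empty ones alone."""
    pattern = re.compile(re.escape(KEY) + r'(\s*=\s*)""')
    return pattern.sub(KEY + r'\g<1>"' + gpu_name + '"', text)

# Example with the GPU name from the log above:
print(set_preferred_gpu('crs:gpu_preferred_system=""', "AMD Radeon RX 6600 XT"))
# -> crs:gpu_preferred_system="AMD Radeon RX 6600 XT"
```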
I'm heavily researching this. Is there something causing the 6000 series to not pass the self-check? I am 99% sure that a few months ago (3-6) my 6700 XT was fully supported in Lightroom Classic.
Use driver version 22.2.3; it works.
I'm on macOS 13.0 Ventura, and that driver is currently not in this build. It was working fine with the last version of Monterey. Right now, only basic display GPU acceleration is working. This also affects Adobe Bridge 2023, but not Photoshop 2023, on macOS.
Hi, has there been an update to correct this issue? I'm looking at an eGPU for a Mac Mini with a 6600 XT, so I'd be interested to know if this has been resolved. Thanks.
I am also facing this issue. After turning off GPU acceleration in Lightroom, it works without issues. My question is: if we are working without GPU support, why spend around $600 on an AMD GPU? Why doesn't AMD find a solution for this issue? Isn't this an unnecessary waste of money and power?