Known Participant
August 28, 2024
Question

Lightroom does not use the GPU (RTX 3090).

  • August 28, 2024
  • 12 replies
  • 5275 views
I had a GTX 1080 Ti card and bought a new, powerful RTX 3090. The card performs great in benchmarks.
I installed new drivers.
Lightroom Classic 13.5.
The problem: the program does not use the card. It makes no difference whether: 1. Auto is enabled, 2. Manual is selected, or 3. Card support is disabled.

I checked the card on 2 drivers:
1. 560.81-desktop-win10-win11-64bit-international-nsd-dch-whql
2. 560.94-desktop-win10-win11-64bit-international-dch-whql
The result is the same with both.

How can I solve this problem with exporting photos?
This topic has been closed for replies.

12 replies

Known Participant
August 28, 2024

For the screenshots, I exported photos from RAW to JPG. I found graphs on the internet that show GPU usage during a RAW-to-JPG export. In my case, it doesn't matter what I set.

LrC only uses the CPU for exports.

GoldingD
Legend
August 28, 2024

Why do you state that it is not in use? These screenshots of usage are nice, but without context they are of no value. What were you doing in LrC when you looked at the GPU use? Were you busy in the Develop module, or were you in the Library module? And does that usage cover the moments when LrC actually needed the GPU?

 

The Library module will not be taxing the GPU.

The Develop module may heavily use the GPU when called for.

Perhaps watch the GPU graphs while attempting an AI Denoise or an Enhance.
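
Task Manager graphs can miss short bursts of GPU work. As an alternative, one way to check utilization during an export or Denoise run is the nvidia-smi tool that ships with the NVIDIA driver, assuming it is on the PATH (this is a sketch, not an official Adobe diagnostic):

```shell
# Print one CSV snapshot of GPU utilization and memory use.
# Add "-l 1" to the nvidia-smi call to repeat it every second while
# Lightroom Classic is exporting or running AI Denoise.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=timestamp,name,utilization.gpu,memory.used --format=csv
else
  echo "nvidia-smi not found on PATH"
fi
```

If utilization.gpu stays near 0% throughout an AI Denoise, something is genuinely wrong with GPU acceleration; if it spikes only during Denoise and Enhance but not during export, that matches LrC doing most export work on the CPU.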