GeForce GT 1030 slows down Lightroom.

Community Beginner, Dec 08, 2019

I run Lightroom Classic on a Dell XPS 8500, always with the latest updates of all software and the operating system. I never noticed a problem with the following configuration until the version 9 update, when everything noticeably slowed down. I discovered that my NVIDIA 620 GPU was more than likely the cause of the problem, as it did not support DirectX 12. Disabling it was better, but still not up to previous speed when making adjustments, and especially when cropping, where the borders lagged the mouse just enough to be very annoying. I replaced the GPU with a GeForce GT 1030, not a blazing fast card by any means, but with 2 GB of GDDR5 memory and DirectX 12 support it appears to exceed Lightroom Classic 9's minimum requirements. Lightroom does recognize the GPU, and with "Use Graphics Processor" set to "Auto" Lightroom says "your system supports basic acceleration" and identifies the card correctly, but it slows down editing, and especially cropping, just like the 620. Setting it to "Off" restores some speed. I ran benchmark tests on the GPU and it is working exactly up to spec. So what is the truth about GPU cards and Lightroom Classic? Do you really need a high-end gaming card to make a difference? Are budget cards that do meet requirements a waste of money? Running "Enhance Details" on a single NEF file took about 35 seconds whether the GPU setting was "Auto", "Custom", or "Off".

Here is the configuration of my computer.

Thank you

 

BIOS Vendor :

Dell Inc.

BIOS Version :

A14

BIOS Date :

6/25/18

Windows Version :

Microsoft Windows 10 Home

 Processor

Processor :

Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz

Clock Speed :

3.4Ghz

L2 Cache Size :

1024

 Memory

Available Memory :

49.02%

Page File Size :

18,777.0MB

Available Page File :

95.22%

Virtual Memory :

18,777.0MB

Available Virtual Memory :

38.92%

DIMM3 :

4,096.0MB

DIMM1 :

4,096.0MB

DIMM4 :

4,096.0MB

DIMM2 :

4,096.0MB
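As a quick sanity check (a trivial sketch, nothing Lightroom-specific), the four DIMM sizes above add up to the 16 GB that Lightroom's System Info later reports as built-in memory:

```python
# Sum the four DIMM sizes listed above to get total installed RAM.
dimms_mb = [4096.0, 4096.0, 4096.0, 4096.0]  # DIMM1 through DIMM4
total_mb = sum(dimms_mb)
print(f"{total_mb} MB = {total_mb / 1024} GB")  # 16384.0 MB = 16.0 GB
```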

 

Disk Drives :

Generic- SD/MMC USB Device

WDC WD10EZRX-00A8LB0

Crucial_CT250MX200SSD3

WD My Passport 25E2 USB Device

M4-CT128M4SSD2

Generic- MS/MS-Pro USB Device

WDC WD2003FZEX-00Z4SA0

Generic- SM/xD-Picture USB Device

Display Adapters :

NVIDIA GeForce GT 1030

Keyboards, Mice & Pointing Devices :

HID-compliant mouse

Monitors :

BenQ GW2765

Sound Devices :

Realtek High Definition Audio

High Definition Audio Device

NVIDIA Virtual Audio Device (Wave Extensible) (WDM)

NVIDIA High Definition Audio

USB Controllers :

Intel(R) USB 3.0 eXtensible Host Controller - 1.0 (Microsoft)

Intel(R) 7 Series/C216 Chipset Family USB Enhanced Host Controller - 1E2D

Intel(R) 7 Series/C216 Chipset Family USB Enhanced Host Controller - 1E26

 

Views: 2.4K
Community Expert, Dec 08, 2019

You're better off leaving the GPU acceleration off for Lightroom. It seems to make a bigger difference if you're using a 4K or higher-resolution monitor; for standard 1080p, it seems to be better with graphics acceleration off, as you have seen for yourself.

Budget cards are not a waste of money; you get what you pay for. Unless you do a lot of video editing, the card you have chosen is fine. Video editing does require a high-performance card, especially if you do a lot of it.

Community Beginner, Dec 08, 2019

Thanks for the quick response. I suspected as much. The thing is, I will probably replace this PC within two years, so I would not have replaced the GPU except that I wanted to meet the Lightroom requirements to speed things up a bit. Adobe needs to be clearer about GPU recommendations and possible performance gains.

LEGEND, Dec 08, 2019

Lots of good info in that System Information; unfortunately a few things are missing or hiding in plain sight: OS version/build, GPU driver version, catalog path, and plug-ins.

 

Please share the info again, this time as Lightroom reports it: in Lightroom click Help, then System Information, then Copy, and paste it into a reply.

 

Please share that info from the first line down to just past the plug-in info; the stuff after the plug-in info is a bit too techno for most of us.

 

Community Beginner, Dec 08, 2019

Lightroom Classic version: 9.0 [ 201910151439-b660523e ]

License: Creative Cloud

Language setting: en

Operating system: Windows 10 - Home Premium Edition

Version: 10.0.17763

Application architecture: x64

System architecture: x64

Logical processor count: 8

Processor speed: 3.3 GHz

Built-in memory: 16344.9 MB

Real memory available to Lightroom: 16344.9 MB

Real memory used by Lightroom: 1025.9 MB (6.2%)

Virtual memory used by Lightroom: 2773.9 MB

GDI objects count: 980

USER objects count: 2054

Process handles count: 2506

Memory cache size: 0.0MB

Internal Camera Raw version: 12.0 [ 321 ]

Maximum thread count used by Camera Raw: 5

Camera Raw SIMD optimization: SSE2,AVX

Camera Raw virtual memory: 518MB / 8172MB (6%)

Camera Raw real memory: 528MB / 16344MB (3%)

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Displays: 1) 2560x1440

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No

 

Graphics Processor Info:

DirectX: NVIDIA GeForce GT 1030 (26.21.14.4141)

 

 

 

Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic

Library Path: H:\Lightroom\Lightroom Catalog-2-2.lrcat

Settings Folder: C:\Users\Ken\AppData\Roaming\Adobe\Lightroom

 

Installed Plugins:

1) AdobeStock

2) Facebook

3) Flickr

4) Nikon Tether Plugin

 

Config.lua flags: None

 

Adapter #1: Vendor : 10de

                Device : 1d01

                Subsystem : 375c1458

                Revision : a1

                Video Memory : 1982

Adapter #2: Vendor : 1414

                Device : 8c

                Subsystem : 0

                Revision : 0

                Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: BenQ GW2765 (NVIDIA High Definition Audio)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 48000

Build: 12.1x4

Direct2DEnabled: false
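The hex values in the "Adapter" entries above are standard PCI IDs. A minimal sketch of decoding them (the lookup table here is hand-made from PCI-SIG vendor IDs, not an official API): vendor 10de is NVIDIA, and 1414 is Microsoft, whose zero-VRAM "Basic Render Driver" is a software fallback rather than a second physical card:

```python
# Decode the hex PCI vendor/device IDs from the "Adapter" entries above.
# The mapping is a small hand-made table of PCI-SIG vendor IDs.
PCI_VENDORS = {
    0x10DE: "NVIDIA",
    0x1414: "Microsoft",  # Basic Render Driver: a software adapter, 0 MB VRAM
}

def describe_adapter(vendor_hex: str, device_hex: str, vram_mb: int) -> str:
    """Render one 'Adapter #N' block as a readable one-liner."""
    vendor = PCI_VENDORS.get(int(vendor_hex, 16), "unknown vendor")
    return f"{vendor} device 0x{int(device_hex, 16):04x}, {vram_mb} MB VRAM"

print(describe_adapter("10de", "1d01", 1982))  # the physical GT 1030 (~2 GB)
print(describe_adapter("1414", "8c", 0))       # Windows' software render adapter
```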

LEGEND, Dec 08, 2019

When you installed your new GPU driver, did you do a clean install? That means letting the install utility completely remove the old driver before installing the new one.

As you are going from NVIDIA to NVIDIA, this can be done during the install (via the custom install options).

 

https://nvidia.custhelp.com/app/answers/detail/a_id/2900/kw/clean%20install/session/L3RpbWUvMTU3NTgy...

 

 

 

Community Beginner, Dec 08, 2019

Thanks, David.

I tried multiple clean installs with drivers from both NVIDIA and GIGABYTE (AORUS ENGINE). I even reinstalled the physical card and replaced the HDMI cable. Always the same results.

LEGEND, Dec 08, 2019

Community Beginner, Dec 08, 2019

Thanks, but none of the suggested settings helped.

LEGEND, Dec 08, 2019

HDMI? No DisplayPort? OK.

 

Might not help performance, but see: https://fstoppers.com/education/one-driver-option-could-be-ruining-your-monitors-accuracy-340911

 

Community Beginner, Dec 08, 2019

Not on the card. I can try a DVI cable Monday.


LEGEND, Dec 08, 2019

Odd, I thought that card had DisplayPort 1.4, HDMI 2.0b, and Single-Link DVI. Mind you, if your monitor does not have DisplayPort, then a DisplayPort connector on the card is of no use.

 

I would think HDMI would work better than DVI.

 

https://www.geforce.com/hardware/desktop-gpus/geforce-gt-1030/specifications

 

Perhaps yours is a variant of the card.

 

Community Beginner, Dec 08, 2019

DVI-D and HDMI only on the card. The monitor has them all.


LEGEND, Dec 08, 2019

Also see: https://photographylife.com/gpu-acceleration-in-lightroom

 

There's a small section on Monitor Connection that talks of disabling audio on the HDMI output for a small improvement.

 
