MadManChan2000
Adobe Employee
April 26, 2015
Question

GPU notes for Lightroom CC (2015)

  • 84 replies
  • 195458 views

Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
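To make the megapixel arithmetic above concrete, here is a quick sketch (the resolutions are the standard ones for these display classes; the MP figures are rounded, as in the post):

```python
# Rough megapixel math for common displays, matching the figures above.
displays = {
    "HD (1920x1080)": (1920, 1080),                       # ~2 MP
    "MacBook Pro Retina 15\" (2880x1800)": (2880, 1800),  # ~5 MP
    "4K UHD (3840x2160)": (3840, 2160),                   # ~8 MP
    "5K (5120x2880)": (5120, 2880),                       # ~15 MP
}

for name, (w, h) in displays.items():
    mp = w * h / 1e6  # pixels -> megapixels
    print(f"{name}: {mp:.1f} MP")
```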

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
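As a back-of-the-envelope illustration of that transfer overhead (the image format and bandwidth figure here are assumptions for illustration, not measured values from Lightroom's pipeline):

```python
# Hypothetical estimate of CPU->GPU upload time for one full-res image.
# Assumes a 24 MP image held as 4 channels of 32-bit floats, and an
# effective transfer bandwidth of ~8 GB/s (assumed; real numbers vary
# with the bus, driver, and memory layout).
width, height = 6000, 4000          # 24 MP image
bytes_per_pixel = 4 * 4             # RGBA, float32
image_bytes = width * height * bytes_per_pixel
bandwidth = 8e9                     # bytes/second, assumed effective rate
upload_ms = image_bytes / bandwidth * 1e3
print(f"~{image_bytes / 1e6:.0f} MB per image, ~{upload_ms:.0f} ms per upload")
```

Tens of milliseconds per full-resolution upload is small for a single render, but it adds up when switching images, which is consistent with the slower image-to-image times described above.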

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running it efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is currently the only module with any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance of those functions should be the same regardless of whether you have the GPU enabled or disabled in the prefs).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, gradients, and the radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
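A rough sense of why the memory recommendation in point 5 scales with display size (the buffer count and precision here are assumptions for illustration, not Lightroom's actual internals):

```python
# Hypothetical VRAM estimate: a handful of full-screen working buffers
# at half-float precision. Buffer count and pixel format are assumed,
# not Lightroom's actual pipeline.
def vram_mb(width, height, buffers=6, bytes_per_px=4 * 2):  # RGBA float16
    return width * height * bytes_per_px * buffers / 1e6

for name, (w, h) in [("HD", (1920, 1080)), ("4K", (3840, 2160)), ("5K", (5120, 2880))]:
    print(f"{name}: ~{vram_mb(w, h):.0f} MB of working buffers")
```

Even under these modest assumptions, a 5K display needs several times the working memory of an HD display, which is why 1 GB of GPU memory gets tight and 2 GB is the more comfortable recommendation.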

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

This topic has been closed for replies.


RafaelM
Participant
June 2, 2015

Lightroom CC 2015 is a disaster.

Nothing is worse than poor performance. How do I go back to Lightroom 5? Otherwise I might have to switch altogether.

Participating Frequently
June 2, 2015

I think the practical approach is to reinstall LR5.7 and wait for the LR6.1 update that will hopefully fix these issues.  I suspect that we will have to wait until the new haze filter is ready.

I am currently using LR5.7 and will wait for the update before upgrading.

Ian

jwArtWorks
Known Participant
May 19, 2015

Okay, here is some other information that might be helpful.

I have an AMD Radeon 5500 graphics card (1 GB) on an HP workstation with a 3.2 GHz processor and 16 GB of RAM.

I have been getting "Graphics Processor disabled due to errors," and Develop module performance was slightly slower than in LR 5.7.

I came across this link https://helpx.adobe.com/lightroom/kb/lightroom-amd-graphics-cards.html

After reading this page, I uninstalled the new drivers and reinstalled the prior 14.4 video driver from AMD. I restarted LR CC and got no more graphics card errors. I tried a few things in the Develop module and they really are quite a bit faster, but not everything: brush rendering is fast, the crop tool is slower, and image-to-image movement in the filmstrip is slower. Changes made in the Basic panel and the Detail panel are fast. Overall, I think most things are a little slower than in LR 5.7 except for those I mentioned above.

Another curious thing is why Adobe removed the ability to apply develop presets when you select a group of images, right-click, and choose "Develop Settings". I can still paste from previous, but I can no longer apply one of my user-created presets this way. Why remove this?

Inspiring
May 19, 2015

The new LR CC and PS CC updates have really slowed things down. I have a fast system (system info below) and it runs much slower with the graphics card (GTX 650) enabled; I think it is also a little slower than before even with it disabled. Photoshop, however, is severely crippled: certain features like the healing brush are unusable with OR without the graphics card enabled. How would I roll PS back to before the update that supposedly sped things up? I can live with LR CC with the graphics processor disabled, but not PS. BTW, I also have a Wacom tablet, but unplugging it does not seem to have any impact.

Lightroom version: CC 2015.0.1 [ 1018573 ]

License: Creative Cloud

Operating system: Windows 8.1 Business Edition

Version: 6.3 [9600]

Application architecture: x64

System architecture: x64

Logical processor count: 8

Processor speed: 4.0 GHz

Built-in memory: 16330.4 MB

Real memory available to Lightroom: 16330.4 MB

Real memory used by Lightroom: 2178.0 MB (13.3%)

Virtual memory used by Lightroom: 2174.3 MB

Memory cache size: 1876.3 MB

Maximum thread count used by Camera Raw: 8

Camera Raw SIMD optimization: SSE2,AVX

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Displays: 1) 1920x1200, 2) 1920x1200

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: Yes, Keyboard: No

Graphics Processor Info:

GeForce GTX 650/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 352.86

Renderer: GeForce GTX 650/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

Application folder: C:\Program Files\Adobe\Adobe Lightroom

Library Path: C:\Lightroom Main Catalog\Lightroom Catalog.lrcat

Settings Folder: C:\Users\Patrick\AppData\Roaming\Adobe\Lightroom

Installed Plugins:

1) Behance

2) Canon Tether Plugin

3) Facebook

4) Flickr

5) HDR Efex Pro 2

6) Leica Tether Plugin

7) Nikon Tether Plugin

8) Perfect B&W 8

9) Perfect Effects 8

10) Perfect Enhance 8

11) Perfect Photo Suite 8

12) Perfect Portrait 8

13) Perfect Resize 8

14) rc Publish Service Assistant

15) SmugMug

Config.lua flags: None

Updated Toolkit: Adobe Camera Raw 9.0 for Lightroom 6.0 (build 1014445)

Updated Toolkit: Book Module 6.0 (build 1014445)

Updated Toolkit: Develop Module 6.0 (build 1014445)

Updated Toolkit: Import Module 6.0 (build 1014445)

Updated Toolkit: Library Module 6.0 (build 1014445)

Updated Toolkit: Map Module 6.0 (build 1014445)

Updated Toolkit: Monitor Module 6.0 (build 1014445)

Updated Toolkit: Print Module 6.0 (build 1014445)

Updated Toolkit: Slideshow Module 6.0 (build 1014445)

Updated Toolkit: Web Module 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.AgNetClient 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.AgWFBridge 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.Headlights 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.LibraryToolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.MultiMonitorToolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.archiving_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.bridgetalk 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.catalogconverters 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.cef_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.coretech_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.curculio 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.discburning 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.email 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.export 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.ftpclient 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.help 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.iac 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.imageanalysis 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.layout_module_shared 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.pdf_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.sdk 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.sec 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.socket 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.store_provider 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.substrate 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.ui 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.video_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.xml 6.0 (build 1014445)

Updated Toolkit: com.adobe.wichitafoundation 6.0 (build 1014445)

Adapter #1: Vendor : 10de

  Device : fc6

  Subsystem : 84281043

  Revision : a1

  Video Memory : 966

Adapter #2: Vendor : 1414

  Device : 8c

  Subsystem : 0

  Revision : 0

  Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: Speakers (High Definition Audio Device)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 44100

Build: LR5x102

Direct2DEnabled: false

GPUDevice: not available

OGLEnabled: true

Participating Frequently
May 19, 2015

There is an old term for this... FUBAR.

Participating Frequently
May 18, 2015

New NVidia driver installed today -- Lightroom disabled GPU support immediately. I'm giving up on this program and won't buy it after the trial expires until this mess is fixed; life is too short.

deejjjaaaa
Inspiring
May 18, 2015

> New NVidia driver today installed - Lightroom disabled GPU support immediately. I'm giving up on this program and won't buy it after the Trial expires until this mess is fixed, life is too short.

Not sure why you install "game ready" drivers -- to game? Non-gamers should use production versions unless directed by NVidia to use a specific later version. For example, for my config the most recent non-"game ready" (read: quick fix catering to gamers) driver is:

GeForce 341.44 Driver WHQL, February 24, 2015


It works and Adobe's code works (slow, yes -- but then we know whom to blame, and in this case it's not NVidia).

Participating Frequently
May 18, 2015

Well, it's a driver for The Witcher 3, which I plan to start playing tomorrow. I am pretty sure this driver does not disable OpenGL support, so I have no idea why Lightroom is whining again. I tried the last WHQL driver in the past, too, and it didn't work. Actually, whether GPU support works or not seems totally random to me.

Oirada
Known Participant
May 18, 2015

This is my experience. I have a 4K monitor (Dell UP2414Q) and used to run it with a Late 2014 Mac Mini (the midrange model) and LR 5.7, and everything was fine -- as fast as my 4-year-old laptop with a Full HD screen. So no particular issue related to 4K (the only thing is that the Mini could drive it at 30 Hz rather than 60 Hz, but that's no big deal for photo editing).

Now I have replaced the Mac Mini with the latest MacBook Pro 13" (early 2015), and LR 5 with LR 6, on the same screen.

Overall it is a big step back in terms of speed; I find it ironic that speed is the main selling point of LR 6. Flipping through pictures takes forever, and when you are back from a trip with 2,000 pics to sort/select/discard, that's many hours added to the workflow.

Many things are slower, while some are just not usable, like the brush for the gradient and the graduated filter. Not sure what the best setting is for my hardware. I'll have to try with the GPU disabled.

It took longer than usual to release this version of LR, with not many features added and a slower workflow. I hope it will get better soon. In the meantime, I cannot use LR 5 now that my catalog has been migrated to LR 6, right?

Community Expert
May 18, 2015

> It took longer than usual to release this version of LR. Not many features added and slower workflow. Hope it will get better soon. In the meanwhile I cannot use LR5 now that my catalog has been migrated to LR6, right?

Your LR 5 catalog should still be there and you should be able to use LR 5 just fine with it. When you installed LR 6 it created an upgraded copy of your catalog and left the LR 5 version alone. It just won't have any of the imports and changes you did in 6.

Participating Frequently
May 18, 2015

It takes more than 20 seconds for Lightroom 6 to finish rendering a 16MP raw file on my relatively new, relatively high-end PC. I've tried 16MP files from three different cameras, and the performance is about the same in all 3 cases. I've tried with GPU enabled and disabled, with different preview sizes selected, with a clean install of the latest drivers for my graphics card, all manner of different system graphics settings and combinations, and nothing seems to help. I've even been in contact with NVidia support. They've had me try a couple of different things (clean driver install, change power management setting), also to no avail.

I'm running a 64-bit Win 8.1 PC with a 6-core Haswell-E i7 (5820K) processor, an NVidia GTX 970, 16 GB of RAM, and the OS and applications on an SSD. I have two monitors: one is 4K, the other 2K. I've disabled the 2K monitor, but that doesn't help. At best, after messing around with settings, I may get better performance for a few minutes, but things slow to a crawl again in no time.

As I mentioned, it takes more than 20 seconds (often more than 25 seconds) to finish rendering a 16MP file in the develop mode when the develop window is on my 4K monitor. When I move the development window to my 2K monitor, it still takes around 15 seconds to finish rendering an image. From time to time I have also seen the image load up completely sharp, then go blurry after a couple of seconds and take another 20 or so seconds to finish rendering again.

The performance is also unbearably slow when zooming into an image, as well as when making a number of different edits. After editing or zooming (develop mode on 4K monitor), the application stutters, sometimes goes completely black and sometimes flashes an image I had loaded up previously before displaying the image I was just working on. Various changes take forever to display - adjusting WB, for instance, sometimes takes multiple seconds before the adjustments are shown. When backing out a WB change (ctl+Z), I often see the image cycle back to the previous WB (correct) then cycle forward to the just-undone WB (incorrect), then cycle back to the previous WB again.

Certain adjustments cause parts of an image to appear strangely blurry for a few seconds before sharpening up again - imagine looking at an image of a fir tree that's sharp, then, all of a sudden, the fir tree looks more like deep green algae in a pond for a few seconds before sharpening back up - a bizarre effect I've never seen before.

In short, the application is all but unusable for me. I'd go back to 5.7, but Adobe hasn't added support for the Olympus E-M5 II, which I've started using recently, so I'm stuck either waiting for Adobe to fix this performance issue or shopping around for another solution. I've been using Lightroom since version 4, and I'd like to stick with it, but it's simply unworkably slow on my modern PC at the moment.

Participating Frequently
May 18, 2015

I have some good news to report. At least right now, I'm seeing much better performance from LR6 with the same hardware and files as before.

I had been considering overclocking my system, but hadn't gotten around to it. My motherboard and graphics card are both from MSI, and the PC came with a bunch of MSI-specific overclocking software. I had opened it up shortly after first having received my PC in January, but had decided to hold off on overclocking for a bit.

Today I decided to take a look at the overclocking options again. I started out with the GPU. I used the MSI Afterburner application to create two new overclocking profiles - a sort of medium profile and a more extreme one, based off of a tutorial I read online. I tested LR6 with the extreme profile, and files that had taken 25 seconds to render previously were now rendering in 5-8 seconds. Not instantaneous, but a real improvement, and numbers that I can live with.

The interesting part, however, is that I re-enabled the default profile for my GPU (not overclocked) and tested LR6 again. The performance was almost as good as it was with the extreme overclocking profile enabled. Maybe a second or two slower to render an image than when the extreme profile was enabled, but nowhere near as slow as it had been previously.

I went ahead and took the opportunity to turn on the MSI Overclock Genie (a hardware-switch-enabled automatic CPU overclocker on my MSI motherboard) as well. It did boost the system's performance (3.3 GHz to 3.8 GHz on my 6 CPU cores), but I'm not sure whether there was really a noticeable improvement in LR6 performance.

It seems that there was something awry with the configuration of my GTX 970 that was reset when I overclocked the GPU, and it seems that that something remains more or less fixed even after "de-overclocking" my GPU. Something to consider for those who are seeing sluggish performance from LR6 with higher-end graphics cards.

Participating Frequently
May 18, 2015

Good for you! I'm using much higher-end cards and computers, and the performance is still horrendous.

Participating Frequently
May 15, 2015

Okay, I did some magic. I don't know why, but GPU acceleration does work now, and it's multiple times faster in first tests (cropping, treatment sliders, etc.).

What I did:

While on Nvidia driver 344.x:

- Disabled the Full HD TV and the Full HD monitor, leaving only the UHD monitor. No effect.

- Tried, in the 3D settings, setting "single display performance mode" instead of multiple. No effect.

- Re-enabled the disabled TV and secondary monitor.

- Updated GeForce Experience and the graphics driver to the latest version (350.12) -> after that, multiple display performance mode was activated automatically. Now Lightroom uses the GPU! I performed similar steps before and they didn't help then; I don't know what exactly helped this time, but it works and I will not touch anything now.

I have a huge performance boost in UHD with the GPU enabled.

Participating Frequently
May 15, 2015

P.S.

Graphics Processor Info:

GeForce GTX 970/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 350.12

Renderer: GeForce GTX 970/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

Known Participant
May 15, 2015

Del_Pierro,

I'm glad it worked for you!

conroyt45142487
Participant
May 15, 2015

I don't often post anywhere, but I did some extensive tests, and I hope you all find this information on my real-world experience and testing on a higher-end rig/GPU for LR6 useful.

SPECS (skip if you don't care for it):

LR6 64 bit

Windows 7 Business X64

Processor speed: INTEL i7 3.9 GHz

Built-in memory: 32675.7 MB (RAM)

Real memory used by Lightroom: 1468.1 MB (4.4%)

2x DELL 2560x1600 (u3014 model)

2x GeForce GTX 980/PCIe/SSE2 in SLI (meaning both cards are piping max 3D performance to the primary screen, although I do not know how LR6 handles SLI)

GFX turned on VS CPU only processing

- Longer load times for images VS quicker loads

- INITIAL brushes: stuttering for the first brush strokes, delay in showing the effects VS instant brushing and application of effects

- As I apply the first brush strokes, like a dodge/burn in continuous strokes: effects catch up to almost real time with just a slight delay VS consistently instant brushing and application of effects

- As I add NEW brush strokes/graduated filters/effects on top of existing ones: effects are now instantaneous with no lag or delay VS effects now showing a delay

- As I do really extensive brushing (tested to almost 50 effects/brushes/sliders) over the entire image: effects are still instantaneous and lightning fast VS effects that show noticeable delay

Conclusion:

With my higher-end rig, GFX turned on has many cons during the initial stages of editing, and there is noticeable lag. CPU-only is fast during the initial stages. As you do more edits, even after 5-7 different brush-stroke applications, you see the speed difference. GFX-assisted does not stutter and remains fast/instantaneous no matter how many edits I throw at it. CPU-only starts to slow down around 5-7 brush strokes applied to the entire image. As it goes into a super-intensive (almost unrealistic) edit, GFX remains fast while the CPU has delays.

My opinions (coming from a wedding photographer perspective):

Will I use GFX assisted? For now, yes, but it is annoying with just small edits or mass culling and/or basic editing of a wedding shoot (~6000 images).

Best use, if you can remember: GFX off during culling and basic global adjustments; GFX on during select edits and when you really get into it but don't want to open up PS. However, it's not for me, and I hated having to go in and out of menus to enable/disable it.

If Adobe can minimize or eliminate the initial lag/delay/overhead that GFX assistance causes, then you have one hell of an engine. I am excited to see how Adobe develops this tech, and hopefully it'll be patched soon to perform better. If Adobe can fix this issue and carry it over into PS, it would be even more awesome, and I would love the increased speed when heavily editing an image.

Should you go out and buy or upgrade your GFX card? HELL NO. At least not in Adobe's current iteration of its software. If Adobe can make the initial stages of GFX-on as fast as CPU-only, then yes, go get a GFX card. But until Adobe proves they can fix the issue, CPU-only will be enough for the casual user confined to LR.

CAVEAT: my dual graphics cards in SLI were purchased only 6 months ago and sat at the #2 spot among available GFX cards. So although I applied 50 brush effects/changes over the entire image, overlapping each other, as fast as I could and did NOT feel any slowdown, your experience may vary with a lower-end card. Or not. I tried to monitor my GPU usage through a software application, but it wasn't showing any change in GPU use % when I did the edits. However, I did hear my GPUs' fans spool up as I did the edits and slow down when I didn't apply effects, so I know they were working. (My cards aren't quiet, as they each have 3 fans directly cooling them.)

Hope this helps everyone!  Hit LIKE below so other people searching can find this helpful.

www.conroytam.com

Keywords: Graphic acceleration lightroom CC, lightroom 6, LR6, LRCC, graphic processor acceleration lightroom, photoshop CS6

Participating Frequently
May 15, 2015

It's frustrating. I have a UHD monitor and a GeForce GTX 970, and Lightroom feels like it runs on a C-64; it's unusable without GPU acceleration at such a high resolution. I tried different drivers and disabled the second and third monitors, with no luck at all. I need to choose a different product for now because I can't work like this.

Known Participant
May 15, 2015

del-pierro, is it also "unusable" with GPU enabled? Was LR 5 faster with the same monitors?

Participating Frequently
May 15, 2015

I can't enable GPU acceleration; that is the problem. I would love to try it, but I can't find a way to make it work -- it's always disabled due to an error. LR 5 is also very slow in UHD. In Full HD everything is fine.

Participating Frequently
May 15, 2015

Mr. Chan,

Thank you for the well-thought-out response. Not from the aspect of explaining, 'cause I am not into all that techno crap. I am a photographer and don't care.

Lightroom is a nice program, or was, anyway. The truth is that Adobe has sacrificed (or retarded) some short-term performance of the software in the interest of future capabilities. So please don't try to resurrect a corpse by blowing smoke up its backside. Tell us (your customers) the plain truth in small words so we can accept it and move on.

But... please work on getting back up to speed.