MadManChan2000
Adobe Employee
April 26, 2015
Question

GPU notes for Lightroom CC (2015)

  • 84 replies
  • 195458 views

Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
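The pixel counts above are easy to sanity-check. A quick sketch (the panel resolutions here are the common dimensions for each display class, assumed for illustration; the post rounds the results):

```python
# Sanity-check of the megapixel figures quoted above (the panel resolutions
# are the common ones for each display class, assumed for illustration).
displays = {
    "HD (1920x1080)": (1920, 1080),
    'MacBook Pro Retina 15" (2880x1800)': (2880, 1800),
    "4K UHD (3840x2160)": (3840, 2160),
    "5K (5120x2880)": (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")

# Ratio of pixels a 4K panel asks Develop to render vs. standard HD:
print(f"4K / HD = {(3840 * 2160) / (1920 * 1080):.1f}x")
```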

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
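Those frame rates translate directly into per-frame time budgets, which shows how much tighter the rendering deadline becomes:

```python
# Frame-time budget implied by the frame rates quoted above: at 5 FPS each
# slider update has 200 ms to render; at 60 FPS it must finish in ~16.7 ms.
for fps in (5, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
```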

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
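To get a feel for that transfer overhead, here is a back-of-the-envelope estimate. The buffer layout (16-bit RGBA) and the bus throughput are assumptions for illustration, not measured Lightroom behavior:

```python
# Rough CPU->GPU upload-time estimate (illustrative only; the buffer layout
# and bus bandwidth are assumptions, not measured Lightroom behavior).
def upload_time_ms(width, height, channels=4, bytes_per_channel=2,
                   bus_gb_per_s=6.0):
    """Estimate time to push one uncompressed image buffer to the GPU.

    channels=4 with 16 bits per channel is an assumed layout;
    bus_gb_per_s=6.0 approximates real-world PCIe 2.0 x16 throughput.
    """
    nbytes = width * height * channels * bytes_per_channel
    return nbytes / (bus_gb_per_s * 1e9) * 1000

# A ~50 MP image (e.g. 8688x5792) as a 16-bit RGBA buffer:
print(f"{upload_time_ms(8688, 5792):.1f} ms per upload")
```

Even under these generous assumptions, a single full-resolution upload costs tens of milliseconds, which is why image-to-image switching can feel slower with the GPU enabled.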

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently requires a fairly powerful GPU. Cards that may work in the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those should be the same whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls are fully GPU accelerated, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, gradients, and the radial filter. Some controls, such as local brush adjustments and spot clone/heal, are not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
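As a rough illustration of the memory guidance in point 5, here is a sketch of the working set for rendering at full 5K screen resolution (about 15 MP). The buffer count and 32-bit float layout are assumptions for illustration, not documented Lightroom internals:

```python
# Why 1-2 GB of VRAM matters: sketch of the working set for a full-quality
# render (buffer count and float layout are assumptions for illustration).
def vram_estimate_mb(megapixels, channels=4, bytes_per_channel=4,
                     working_buffers=3):
    """Approximate VRAM for source, intermediate, and output buffers
    at 32-bit float per channel."""
    nbytes = megapixels * 1e6 * channels * bytes_per_channel * working_buffers
    return nbytes / 2**20

# Rendering at 5K screen resolution (~15 MP) at float precision:
print(f"{vram_estimate_mb(15):.0f} MB")
```

Under these assumptions a 5K-sized working set alone approaches 700 MB, which is consistent with the 1 GB minimum / 2 GB preferred guidance.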

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

This topic has been closed for replies.


Known Participant
October 19, 2015

Super slow here on OS X 10.9.5: Mac Pro 2x2.66 GHz with 24 GB RAM and an Nvidia Quadro 4000. I removed the app and installed version 5, which works fine. I also removed AE CC 2015 because of little bugs and am back on CC 2014. So what are people doing at Adobe?

Known Participant
October 19, 2015

Have you tried the latest LR 2015? Most of the problems I had are gone in this version, other than the new horrible import menu, which is going away anyway. We use After Effects 2015 day in and day out at our shop, on 6 machines, with no issues; only minor stuff that was solved by updating the GPU drivers.

Sent from the Moon

Known Participant
October 19, 2015

Is there an update since yesterday? Yesterday I removed the app. I will check again; I had all the new updates installed. I still have these problems with memory in AE CC 2015: it announces a super large amount of RAM for a frame that doesn't exist. I was and am in a project where buggy software is super annoying when your client waits for results... CC 2014 actually works fine. CS6 is very stable; many companies still use it!

Sean H [Seattle branch]
Known Participant
October 17, 2015

I'm on a 4790K i7 CPU clocked at 4.7GHz locked on all 4 cores. I'm also running a 280X GPU. DNG photography is from Nikon D750 (24MP, 14-bit)

I have a new LG 27MU67 4k IPS screen and 10.10.4 OS X. During culling in Library and sifting through 1:1 renders, it's OK. Pretty slow but OK. In Develop, without GPU, it's unusable at 4k/3k(emulation). With GPU, it's worse and the photos flash between applied/unapplied edit settings as I move the sliders. The ONLY way I can use LR now is to emulate 1504x846 resolution in OS X which makes everything huge. If I'm emulating 2560x1440 or higher, it's not really usable... all of my actions are delayed by a few seconds.

Are there plans to lower the resolution of images in Develop so that editing is possible while the app UI is normal size?

ewood007
Participating Frequently
September 21, 2015

I'd like to use a GPU, but I don't even know how you enabled this in LR 5.7.1. I don't see a Performance tab.

Re: 5.7.1 performance tab

It seems like most people think you need 6.0 to use a GPU.

Known Participant
September 21, 2015

5.7.1 does not have that option.

Sent from the Moon

Known Participant
October 8, 2015

Credit where it's deserved... I just installed today's update, and it works MUCH better with GPU ON. It's actually usable!!!!

Participant
September 5, 2015

I'm having one heck of a dilemma here.

I recently purchased the Canon 5DS R as a bride/groom portrait-only camera body, and discovered that I needed to upgrade to LR 6 or CC in order to edit its images. Well, in doing so, everything slowed down massively. I assumed it was due to the file size of the 5DS R images. But then, checking my Mark III images, they were equally slow.

Well, I figured, why the heck not, let's go crazy and purchase the new iMac Retina 5K at 3.5 GHz with 32 GB of RAM and the 1 TB Fusion Drive... So now what's the issue? I can't seem to speed up CC for the life of me, and I'm getting frustrated beyond belief that I have spent $7000 to lose editing time and still not have a solution.

Adobe, I don't understand the point of an "UPGRADE" if it's going to kill me when I edit. The blur between images, the loading time, the lag with the bars, it's exhausting.

What am I doing wrong? Everything is top of the line and I can't edit worth a dink vs LR 5.7 (which I loved)

So please, please, help me here and tell me how to optimize CC 2015 in order to get the best bang for my buck. I kind of deserve it.

Victoria Bampton LR Queen
Community Expert
Community Expert
September 5, 2015

joshuam4396817 wrote:

I can't seem to speed up CC for the life of me

We'll need more specifics.  WHAT is slower?  Where is the catalog stored?  Where are the images stored?  Is GPU enabled in Prefs > Performance or not?  What settings are you applying by default?

Victoria - The Lightroom Queen
Participant
September 5, 2015

Victoria,

I have tried everything. Lightroom is slower, by 35% if I were to put a number on it. I'm talking about editing. It's a brand new iMac. I have tried loading one file from my Canon Mark III and one file from the Canon 5DS R; both are almost equally slow. GPU has been disabled and enabled, I have changed preview sizes and quality, and I have moved the files (along with the cache) to the desktop. I have tried building smart previews and not. I have updated everything.

CC is the problem. I moved from Lightroom 5.7 only because it doesn't support the DNG/ACR files from the Canon 5DS R, so I was forced to upgrade to Lightroom 6 or CC to edit those camera raw images. I am at a complete loss. I have no idea what to do to solve the problem.

It can take anywhere from 4-6 seconds to transition between images. After making an edit with a slider, such as the color, instead of it gliding smoothly, I have to click where I want it and then wait an additional 3-5 seconds. When I make changes with any of the other Basic sliders, it takes about 2 seconds per transition. When I make changes with the Tone Curve, it takes 3-5 seconds. These are seconds that I honestly don't have the time to wait for.


CC is doubling my workflow time. It used to take me 4 hours to edit 500 images; now it's pushing past 8. This is probably the most ridiculous "upgrade" I have ever seen. I feel CC shouldn't even be out yet if these problems exist. It feels more like a cheap program that Adobe wanted to roll out to get subscribers, with the promise of fixing these things down the road.

GPU support is a joke, especially when I have this supposed mega computer that I can't even use because its screen is 5K, even though the feature was supposedly made for 5K. So now I'm forced to return the 5K, keep my late 2013 model, reload Lightroom 5.7 for my Mark III images, and figure out a different route for the Canon 5DS R. Because CC just doesn't cut it for the working full-time professional.

Any solutions will help, but I have tried everything.

Participant
July 30, 2015

Lightroom is far from being fully compatible with the GPU, as Aperture used to be; Aperture used almost 100% of the VRAM and the GPU, but Lightroom uses it only for some silly tasks like zooming. The Adobe engineer, Chan or Chen, gave the same speech in 2015 (and also many years ago): the hardware isn't ready yet. There is logic in that if you read the full speech, but frankly, I don't give a damn about logic at this moment.

Participant
July 21, 2015

Hi Chan, I already read all that stuff a few years ago. I'm frankly quite annoyed about this partial GPU compatibility, and about still reading the same stuff. A lot of work, yes; that's what makes a program, not magic. What are Adobe people waiting for?

Thanks,

س

Participant
July 11, 2015

Eric,

Is one Manufacturer better than another?

I have a Radeon R9 with 8 GB GDDR5 and 2 GPUs. I have seen some increase in performance; however, I did expect to see more. My laptop has 4 GB GDDR5 with an NVIDIA GPU, and the speed difference is awesome with a single GPU.

Participant
July 9, 2015

I'm not sure if anyone else has had issues with the latest AMD driver update (7/8/2015). I installed it and began having issues with Lightroom "Not Responding" and would have to exit the program. I ended up rolling back to the previous driver version from 2014 and it seems to be back to normal. Just FYI.

Brian Gray
Participant
July 12, 2015

I've just installed video driver 15.2 and now Lightroom fails every time. The GPU acceleration passes its startup test and shows as operational in the Performance tab, but the rest of the program won't work. I can't switch modules, open a library, view a different image, or anything. I can start up Lightroom, but as soon as I attempt to use anything, the whole thing hangs and has to be shut down.

Do not attempt to use the updated driver! I'm running a Radeon 6800, by the way.

Participating Frequently
July 6, 2015

I keep seeing this talk of how performance is just great on high-end machines, and how the GPU checkbox is only for 4K monitors. I specifically bought the top-tier Alienware Area 51 machine with:

Intel® Core™ i7-5960X (8-cores, 20MB Cache, Overclocked up to 4.0 GHz w/ Turbo Boost)

32GB Quad Channel DDR4 at 2133MHz

Triple NVIDIA® GeForce® GTX 980 graphics with 12GB total (3x 4GB) GDDR5 -NVIDIA SLI® Enabled

and an LG 31MU97-B: 31 Inch 4K IPS Monitor

and everything from generating previews to brushes to general adjustments is ridiculously slow (6-8 seconds per preview). Even after rendering standard and 1:1 previews, there is a 3-4 second lag while browsing in either the Develop or Library modules. I've used Lightroom since version 3.1, so I'm well aware of what the performance SHOULD be like even on a modest machine. The performance, even on a beast of a machine, is utterly unacceptable. I'm shocked we haven't seen these performance issues officially acknowledged, let alone addressed.

Eggz
Participating Frequently
July 6, 2015

That's really strange!  Mine works just fine with last year's top-of-the-line stuff. 

I don't, however, have SLI enabled.  Maybe see what happens if you disable SLI in Nvidia Control Panel.  I know that's caused issues on other Adobe programs in the past, so perhaps it is doing the same with Lightroom. 

Participating Frequently
July 6, 2015

Ah, I'm jealous, man! Yeah, I thought the same, so I started out with SLI disabled and had near-identical results. It seems like whether I use one card or three, I get the same general lagginess. Nothing is more frustrating than trying to develop and cull a wedding of 2500 images when it takes 6-8 seconds per preview... Thank god for Lightroom mobile; at least the culling can be done at reasonable speeds there.

Eggz
Participating Frequently
July 6, 2015

This version of Lightroom will stress your hardware more than others did in the past, but that's a good thing for the product (not necessarily for some current users).

It means the product is beginning to utilize high-end hardware better than it did before.  The bad news is that it isn't totally efficient yet, so you'll need a fairly high-end computer with a beefy CPU and GPU in order to avoid the slow downs people are talking about. 

Those with high-end hardware will notice that Lightroom CC is actually much smoother than previous versions. But for people who don't have the compute power to reach the level where this new version of LR is buttery smooth, their "slower" computers will actually run the program slower than it did before. I say "slower" because the program can run slowly on a machine that would otherwise be considered very capable; it just falls short of what LR's new GPU acceleration demands for optimal performance. See the video posted below.

The video is on a beefy machine. I've also run it on a 2015 Razer Blade laptop, which is very fast. It's pretty smooth, but certain tasks (e.g. preview generation) are notably slower. That hasn't detracted, however, from the smoother operation on high-end machines. It just takes a while to do export-like functions.

Basically, if you want to run LR CC very well, get a 6- or 8-core Intel CPU from the last year or two, and pair it with a mid-tier (or better) GPU from the last year or two.  Plus, toss as much RAM as you can at the program (at least 16 GB), especially if you're going to be running PS and LR together at the same time (which is likely). 

Known Participant
July 6, 2015

Hmm, interesting... will Adobe be bundling extra-fast computers with this version, I wonder?

Mac OS Sonoma, 64 GB RAM 27" 2019 iMac 3.6 GHz 8-Core Intel Core i9
Participating Frequently
July 6, 2015

It is part of the latest subscription program. With only $3k down and $600 per month, you can sign up for the "New Computer per Year" program... does not include CC.