
GPU notes for Lightroom CC (2015)

Adobe Employee, Apr 25, 2015

Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
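The pixel counts quoted above follow directly from each display's resolution. As a quick sketch (the resolutions are standard published values, not figures from this post):

```python
# Megapixel counts for common display resolutions: width * height / 1e6.
displays = {
    "Full HD (1920x1080)": (1920, 1080),
    "MacBook Pro Retina 15\" (2880x1800)": (2880, 1800),
    "4K UHD (3840x2160)": (3840, 2160),
    "5K (5120x2880)": (5120, 2880),
}

for name, (w, h) in displays.items():
    mp = w * h / 1e6
    print(f"{name}: {mp:.1f} MP")
```

This reproduces the numbers in the post: roughly 2 MP for HD, 5 MP for the Retina MacBook Pro, 8 MP for 4K, and close to 15 MP for 5K.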

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
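To get a feel for why uploads matter, here is a back-of-the-envelope estimate. The pixel format and bandwidth figures are illustrative assumptions, not measurements from Lightroom:

```python
# Rough CPU->GPU upload time for a full-resolution image.
# Assumes 4 channels x 16 bits per channel (8 bytes/pixel) and an
# effective transfer bandwidth of ~6 GB/s -- both illustrative
# assumptions, not measured Lightroom values.
def upload_time_ms(width, height, bytes_per_pixel=8, bandwidth_gb_s=6.0):
    size_bytes = width * height * bytes_per_pixel
    return size_bytes / (bandwidth_gb_s * 1e9) * 1000

# A ~50 MP raw (e.g. 8688x5792) takes tens of milliseconds just to upload,
# before any processing happens -- noticeable when switching images.
print(f"{upload_time_ms(8688, 5792):.1f} ms")
```

Even under generous assumptions, that per-image transfer cost is why loading a full-resolution image or switching between images can feel slower with the GPU enabled.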

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is currently the only module with any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (their performance should be the same regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls are fully GPU accelerated, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, gradients, and the radial filter. Some controls, such as local brush adjustments and spot clone/heal, are not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

176.6K views

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.

353 Replies
Community Beginner, Jun 30, 2015

Sorry, I missed that bit. But then, there do seem to be numerous other things slowing down too... The issue being: with a super-fast machine we shouldn't expect to see this happen, should we really?

Community Beginner, Jun 30, 2015

Correct, but USING the GPU makes it much worse… and even with the GPU off it is unusable.

Community Expert, Jun 30, 2015

Philip, GPU acceleration is known to slow down the brushes enormously. With your machine you should get butter-smooth brush performance when you turn off GPU acceleration. I use a MacBook Pro Retina from three years ago and brushes are very smooth on large images. With the GPU enabled it is unusable. This is true for 24 MP raws, but especially so on the 200 MP+ stitches I often work with. GPU acceleration makes the program extremely frustrating to work with; turning it off makes it as fast as LR 5. In fact, I think this option should probably be off by default on almost all machines, as it only speeds up a few things and considerably slows down many others. This is true even for very new machines with high-end graphics cards.

Enthusiast, Jun 30, 2015

If you use sharpening or other CPU-intensive settings on your brushes, then even with the GPU off it is much slower than LR 5.7.1 on every single machine I've tried side by side. Especially after you start accumulating some brushes, it gets dead slow.

Sent from the Moon

Community Expert, Jun 30, 2015

I haven't noticed a difference. Exports are very slow and DNG conversions are slow in 6/CC, but the Develop module is basically the same as 5.7. With the GPU enabled, brushes are unusable. I don't know what the difference is between your machine and mine.

LEGEND, Jun 30, 2015

On my Windows 7 system with an i7-860 quad-core processor LR CC 2015 is just as speedy as LR5.7 including Exports. No complaints here with the GPU disabled!

Community Beginner, Jun 30, 2015

I don't have complaints either, until I do some serious retouching. That is where it gets impossible.

Sent from the Moon

Community Beginner, Jul 06, 2015

It should be butter smooth, but it's not... it's like a snail...

Explorer, Jul 06, 2015

In the past, with threads shorter than this one, the anointed Adobe gurus have stepped in to calm the petulant mob.

Adobe and their minions are suspiciously quiet.

I think they are like Japanese camera companies (Nikon) and unwilling to admit fault. Even the start of this thread was only an announcement that the entire thing is planned. Really? They planned to make their software s-l-o-w-e-r?

Community Beginner, Jul 07, 2015

It seems to be a trend. Even the new Photoshop CC 2015 is slower; Gallery Blur, panning, and zooming all respond more slowly than in the 2014 version on the same computer.

Sent from the Moon

Community Beginner, Jul 06, 2015

This version of Lightroom will stress your hardware more than others did in the past, but that's a good thing for the product (not necessarily for some current users).

It means the product is beginning to utilize high-end hardware better than it did before. The bad news is that it isn't totally efficient yet, so you'll need a fairly high-end computer with a beefy CPU and GPU in order to avoid the slowdowns people are talking about.

Those with high-end hardware will notice that Lightroom CC is actually much smoother than previous versions. But for people who don't have the compute power to reach the level where this new version of LR is buttery smooth, their "slower" computers will actually run the program slower than it did before. I say "slower" because the program can run slowly on a machine that would otherwise be considered very capable, yet falls just short of what LR's new GPU acceleration demands for optimal performance. See the video posted below.

The video is on a beefy machine. I've also run it on a 2015 Razer Blade laptop, which is very fast. It's pretty smooth, but certain tasks (e.g. preview generation) are notably slower. That hasn't detracted, however, from the smoother operation on high-end machines. It just takes a while to do export-like functions.

Basically, if you want to run LR CC very well, get a 6- or 8-core Intel CPU from the last year or two, and pair it with a mid-tier (or better) GPU from the last year or two. Plus, throw as much RAM as you can at the program (at least 16 GB), especially if you're going to be running PS and LR together at the same time (which is likely).

Participant, Jul 06, 2015

Hmm, interesting... will Adobe be bundling extra-fast computers with this version, I wonder?

Mac OS Sonoma, 64 GB RAM 27" 2019 iMac 3.6 GHz 8-Core Intel Core i9

Explorer, Jul 06, 2015

It is part of the latest subscription program. With only $3k down and $600 per month you can sign up for the "New Computer per Year" program... does not include CC.

Explorer, Jul 06, 2015

I keep seeing this talk of how performance is just great on high-end machines, and how the GPU checkbox is only for 4K monitors. I specifically bought the top-tier Alienware Area 51 machine with:

Intel® Core™ i7-5960X (8-cores, 20MB Cache, Overclocked up to 4.0 GHz w/ Turbo Boost)

32GB Quad Channel DDR4 at 2133MHz

Triple NVIDIA® GeForce® GTX 980 graphics with 12GB total (3x 4GB) GDDR5 -NVIDIA SLI® Enabled

and an LG 31MU97-B: 31 Inch 4K IPS Monitor

and everything from generating previews, to brushes, to general adjustments is ridiculously slow (6-8 seconds per preview). Even after rendering standard and 1:1 previews, there is a 3-4 second lag while browsing in either the Develop or Library modules. I've used Lightroom since version 3.1, so I'm well aware of what the performance SHOULD be like even on a modest machine. The performance, even on a beast of a machine, is utterly unacceptable. I'm shocked we haven't actually seen these performance issues officially acknowledged, let alone addressed.

Community Beginner, Jul 06, 2015

That's really strange!  Mine works just fine with last year's top-of-the-line stuff. 

I don't, however, have SLI enabled.  Maybe see what happens if you disable SLI in Nvidia Control Panel.  I know that's caused issues on other Adobe programs in the past, so perhaps it is doing the same with Lightroom. 

Explorer, Jul 06, 2015

Ah, I'm jealous, man! Yeah, I thought the same, so I started out with SLI disabled and had near-identical results. Whether I use one card or three, I see the same issues with general lagginess. Nothing is more frustrating than trying to develop and cull a wedding of 2,500 images when it takes 6-8 seconds per preview... Thank god for Lightroom mobile; at least the culling can be done at reasonable speeds there.

Community Beginner, Jul 07, 2015

There's definitely something wrong with your setup if that's going on.  Even if the 980 doesn't do Lightroom tasks as fast as the 780 ti (which it should), there is still more than enough power to do the tasks smoothly. 

I would disable GPU acceleration to see if things change. Having 3 high-end GPUs is an exotic configuration, and Lightroom isn't designed with exotic configurations in mind. If there is an improvement from disabling GPU acceleration, first try DDU (the link will start a download automatically): http://www.guru3d.com/files-get/display-driver-uninstaller-download,20.html

It will take you to safe mode.  Just let it do its thing, and then reinstall Nvidia drivers through GeForce Experience: GeForce Experience Tweaks your Game Settings Automatically | GeForce

If it doesn't fit on your screen because DDU removed the graphics drivers, just press "Windows+Up" to force fit it.  Then install the drivers from there.  Do a "Fresh" installation, and then restart. 

If that doesn't help, and it still runs better with GPU acceleration disabled, then try physically removing the two cards furthest from your CPU, keeping only the one closest to your CPU.  Reboot and give it a go. 

See what happens with those things.  There is no way it should be 3-5 seconds for navigating images.  Mine is instant, and my CPU and GPU (just one) are slightly slower than yours.

You might also want to check your "Standard" preview settings.  Also, when you import, do you have it automatically render standard previews?

Explorer, Jul 07, 2015

Thanks Eggz, I'll give those steps a try and report back with my findings. Standard preview settings are set to auto -- 4096 pixels, I believe -- and when I import I render 1:1 and manually render standard as well. I make sure everything is pre-rendered before browsing and culling; I expected this setup to be liquid smooth.

Explorer, Jul 21, 2015

Just to update: I've tried all your suggestions, and sadly nothing has resulted in any performance improvements. I'm convinced that the issue is with the Lightroom software itself, as no other version of Lightroom has ever been so lethargic. My current workflow involves importing all of my images in Lightroom and doing all of my culling and editing in Lightroom mobile from my iPad.

I'm going to try to pretend it's not ridiculous that $6k worth of top-of-the-line computer hardware can't compete with a 3-year-old tablet. Adobe, I really need a response on this. Please acknowledge there are issues with the software and let me know how I can help resolve them; I'm willing to work with any support techs willing to walk through the performance issues and see the demonstrable difficulties.

Enthusiast, Jul 21, 2015

I'm sure they know very well about all the issues. It is impossible that all of us have these issues while they don't have a problem on a single computer in their labs. I'm convinced, too, that they just can't publicly accept that they made bad software, because that could affect their company. So we just have to keep using 5.7.1 and hope they keep releasing updates for it until 6.x becomes a usable piece of software.

Sent from the Moon

Community Beginner, Jul 21, 2015

So strange!  I'm not sure what gives.  I just got through an entire photo shoot, from import to production, in a day without issue.  It was pretty snappy, and I think an update since I did my video has actually addressed some of the things I pointed out in the video (maybe they secretly referred to it). 

I was on two machines, both of which are less powerful than yours.  The first is a custom desktop with the following specs

Asus Rampage IV Gene

Intel i7-4930k @ 4.5 Ghz

EVGA SC GTX 780 ti & 750 ti

32 GB Mushkin Redline (PC17000)

Samsung EVO 750 GB

2 x 4 TB Constellation ES.3 in RAID 1

Ceton InfiniTV-4 PCIe

Corsair H110i GT Cooler

Corsair 350D Case

The second is a laptop (Razer Blade) with the following specs

Intel Core i7-4720HQ 2.6 GHz (up to 3.6 GHz)

Memory: 16 GB DDR3L

Storage: 512 GB SSD

Graphics Card: NVIDIA GeForce GTX 970M

Video Memory: 3 GB

Everything ran pretty smoothly on both machines, even though the laptop is only a quad-core.

Community Beginner, Jul 21, 2015

Doing exactly what in the Develop module? Easy stuff is not that slow. It's mainly stuff using brushes and local adjustments.

Sent from the Moon

Explorer, Jul 21, 2015

I'm extremely jealous. 117 images from last night's shoot took a little over an hour to generate standard and 1:1 previews for, and swapping between images took 6-9 seconds (in the Develop module) after previews were rendered. Adjustments (basic, curves, detail, non-local, etc.) took 1-2 seconds each to render on screen. I basically spent 20 or so minutes on one image, then synced it with the rest and uploaded to Lightroom Mobile so that I could cull and make adjustments while waiting for the standard and 1:1 previews to generate. Of course, face detection and address lookup were turned off. This was done with a 5DS, though most of my shoots are from a 5D Mark III.

I didn't invest in two 4K monitors to have to abandon them for an iPad to work on.

Community Beginner, Jul 21, 2015

victorwolansky‌ - I did a full range of developing intensities. Some images had little to no edits, while others had very intensive editing. For the intensive edits, I did all primary global adjustments first. Then I got into local adjustments with the brush tool, the radial filter tool (modified with its built-in brush), and the spot removal tool. While using brushes, I made it even harder than normal (on the desktop, not the laptop) by using a Wacom tablet with pressure sensitivity and various feathers applied through hundreds of strokes. Don't get me wrong, it was stressing the system, but it still didn't get laggy.

kenb0t‌ - The last round I did was about 350 photos. I rendered standard previews on import, which completed in slightly less than 1 sec each. Since standard previews are tied to screen resolution, I'd expect more work on your setup, but it ran as a background process anyway that only used about 20% of the CPU. As I'm switching through the Develop module right now, the images change as I let up the arrow key on my keyboard -- not instant, but still quick and perfectly acceptable. I'm also running the 1:1 previews just to check for you, and each one is taking about 3 sec (still as a background process using less than 1/3 of the CPU). Those previews aren't tied to screen resolution, and the (relatively large) source files come from a new Sony camera with 24 MP RAW images. Perhaps LR deals with Sony's files better. I do recall trying DNG files, which I did as a conversion on import at the time. The processing was a little faster, though I turned it off because it made imports take longer. Maybe you can try re-importing the images with a built-in DNG conversion.

You might also try completely uninstalling LR and deleting all traces left behind before reinstalling it. Maybe there's a corruption somewhere in the installation. Also, is your CPU overclocked? LR benefits a lot from single-core speed. Mine is set to 4.5 GHz, which is 1.1 GHz faster than stock. Also, the motherboard I have allows all cores and threads to reach full speed (here, 4.5 GHz) at the same time, which isn't possible by default. Generally, Intel CPUs will allow full turbo speed on only two or fewer cores at once; otherwise the CPU drops to its base (non-turbo) speed, which would be 3.2 GHz on my CPU -- significantly slower than what I get now. Motherboards don't always have that capability because it's a non-standard feature. Although, you have an Alienware computer, so maybe you have a supporting motherboard. Either way, 8-core CPUs run slower per core than 6-core CPUs, which in turn run slower per core than 4-core CPUs. The chips can only take so much heat, so each core has to be a little slower to accommodate more of them. So, more cores generally means slower cores (before overclocking), and fewer cores generally means faster cores. If a program can use a lot of cores, the performance loss per core is more than offset by the additional cores, for a net benefit; otherwise, an inefficient program will use only a few of the slower cores, for a net loss. Point is, in addition to a reinstallation, you might want to try overclocking to as fast as you can within stable and safe limits.
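The per-core speed versus core-count tradeoff described above can be sketched with Amdahl's law. The clock speeds below are the ones mentioned in this post; the parallel fraction is purely an illustrative assumption, not a measured Lightroom figure:

```python
# Toy comparison of per-core speed vs. core count for a partly-serial
# workload, using Amdahl's law. The 40% parallel fraction is an
# illustrative assumption, not a measured Lightroom value.
def speedup(parallel_fraction, cores, clock_ghz, baseline_ghz=3.2):
    serial = 1 - parallel_fraction
    per_core = clock_ghz / baseline_ghz  # per-core speed relative to 3.2 GHz base
    return per_core / (serial + parallel_fraction / cores)

# If only 40% of the work parallelizes, a 4.5 GHz quad-core beats a
# stock 3.2 GHz 8-core despite having half the cores.
print(f"fast quad-core: {speedup(0.4, 4, 4.5):.2f}x")
print(f"stock 8-core:   {speedup(0.4, 8, 3.2):.2f}x")
```

This is why overclocking for single-core speed can help a program that only lightly uses extra cores.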

Explorer, Jul 21, 2015

Thanks for the detailed response. I'll attempt a full reinstall, but it would be my 3rd time doing so. I have 8 cores:

Intel® Core™ i7-5960X (8-cores, 20MB Cache, Overclocked to 4.0 GHz)

32GB Quad Channel DDR4 at 2133MHz

Triple NVIDIA® GeForce® GTX 980 graphics with 12GB total (3x 4GB) GDDR5, NVIDIA SLI® enabled (I've also tried using the NVIDIA Control Panel to force a SINGLE GPU for Lightroom only, but it seems to make things worse)

and a pair of LG 31MU97-B: 31 Inch 4K IPS Monitors,

but at the speeds they run at, I wouldn't expect the lagginess I'm seeing; it just doesn't make sense...
