MadManChan2000
Adobe Employee
April 26, 2015
Question

GPU notes for Lightroom CC (2015)


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
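As a quick sketch of the arithmetic behind those figures (the pixel dimensions below are assumed typical values for each display class):

```python
def megapixels(width, height):
    """Pixel count in megapixels."""
    return width * height / 1e6

# Assumed typical pixel dimensions for each display class mentioned above.
displays = {
    "Full HD (1920x1080)": (1920, 1080),
    "MacBook Pro Retina 15in (2880x1800)": (2880, 1800),
    "4K UHD (3840x2160)": (3840, 2160),
    "5K (5120x2880)": (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name}: {megapixels(w, h):.1f} MP")
```

These round to the 2, 5, 8, and 15 MP figures quoted above, and an 8 MP 4K panel is exactly 4x a 2 MP HD panel.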

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
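To illustrate the scale of that overhead, here is a back-of-the-envelope estimate of upload time for one uncompressed image. The bandwidth figure is an assumption, and real transfers carry additional driver and synchronization costs:

```python
def transfer_time_ms(width, height, channels=3, bytes_per_channel=2,
                     bandwidth_gb_s=4.0):
    """Estimate CPU->GPU upload time for one uncompressed image.

    bandwidth_gb_s is an assumed effective PCIe bandwidth; real-world
    numbers vary with bus generation, lane count, and driver overhead.
    """
    size_bytes = width * height * channels * bytes_per_channel
    return size_bytes / (bandwidth_gb_s * 1e9) * 1000.0

# A ~50 MP image (8688 x 5792) stored as 16-bit RGB is ~302 MB,
# so even at an optimistic 4 GB/s the upload alone costs ~75 ms.
print(f"{transfer_time_ms(8688, 5792):.1f} ms")
```

At slower effective bandwidths, or with several uploads per image switch, this quickly adds up to a visible pause.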

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently uses GPU acceleration at all. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those functions should be the same regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

This topic has been closed for replies.

84 replies

Alain Capéran
Participant
May 3, 2015

Have you tried the tip I applied on my system?

Lightroom 6: GPU works too slowly --> Tips to improve it

In short, I deactivated my antivirus for Lightroom activity (only).

The result is incredible. Why? I don't know; I suppose Lightroom needs to access too many files at the same time...

Participant
May 3, 2015

That didn't work for me.

I also have Kaspersky Internet Security 2015.

Tested on my laptop

i7 2.4 GHz, 8 GB RAM, Radeon 7000 mobile

and on my desktop

i7 2.6 GHz, 8 GB RAM, Nvidia GTX 560 Ti 448

salvo

Known Participant
May 3, 2015

Antivirus is not the problem. On the machines at the office there is no antivirus installed, so that can't be the reason. Adobe should just chime in and say: people, we are sorry, we screwed up. Please keep using 5.7.1 until we get Lightroom CC into a usable shape. It feels as if it wasn't even beta tested. Was it?

Sent from the Moon

Sean H [Seattle branch]
Known Participant
May 3, 2015

Curious. If the CPU->GPU transfer takes a bit of time, isn't it possible to have a core that's always tasked with transferring future data to the GPU (lazy loading)? This would be images N+1, N+2... It's pretty predictable that most of us work sequentially.
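The prefetching idea above can be sketched generically. This is a hypothetical background loader, not Lightroom's actual pipeline; `load_fn` stands in for whatever expensive decode/upload step we want to hide:

```python
import threading

class Prefetcher:
    """Loads upcoming images on background threads ahead of use.

    Generic sketch of sequential prefetching; load_fn is any expensive
    per-image load step (decode, CPU->GPU upload, etc.).
    """

    def __init__(self, load_fn, lookahead=2):
        self.load_fn = load_fn
        self.lookahead = lookahead
        self.cache = {}
        self.lock = threading.Lock()
        self.threads = []

    def prefetch(self, paths, current_index):
        # Kick off background loads for the next few images (N+1, N+2, ...).
        upcoming = paths[current_index + 1:current_index + 1 + self.lookahead]
        for path in upcoming:
            with self.lock:
                if path in self.cache:
                    continue
                self.cache[path] = None  # mark as in flight
            t = threading.Thread(target=self._load, args=(path,), daemon=True)
            t.start()
            self.threads.append(t)

    def _load(self, path):
        data = self.load_fn(path)
        with self.lock:
            self.cache[path] = data

    def get(self, path):
        # Wait for in-flight loads, then serve from cache or load now.
        for t in self.threads:
            t.join()
        with self.lock:
            data = self.cache.get(path)
        return data if data is not None else self.load_fn(path)
```

A real implementation would also evict old entries and cancel stale loads when the user jumps around rather than stepping sequentially.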

I'm on a 4.7 GHz 4790K i7 with 32 GB and an R9 280X. I agree with people here: compared to LR5, everything except slider responsiveness (which is much faster) seems much slower. Heal/Clone feels like I'm working on an 8 MHz 286 CPU.

[ ◉"]
Known Participant
May 3, 2015

The transfer time from CPU to GPU should be almost unnoticeable if correctly implemented. PCIe buses can move about 250 MB/s per lane, so even a cheap card running on just 4 lanes gets roughly 1000 MB/s -- how many images can you move in a second with 1000 MB? Lightroom should completely decompress a raw file as soon as you touch it, not wait until you zoom in. Also, there are plenty of glitches when the GPU is enabled and you use multiple brushes: you can get horrendous artifacts, the GUI flickers, etc. The person who decided this was ready to be released should be released from work... Or it was clearly a very poor marketing decision; sometimes programmers say "this is not ready" and marketing comes back with "don't care, send it out anyway"...
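That rhetorical question can be put in numbers (illustrative figures only; actual raw files are compressed, and uploads carry extra overhead beyond raw bus bandwidth):

```python
def images_per_second(bandwidth_mb_s, image_mb):
    """Bus throughput expressed as uncompressed images per second."""
    return bandwidth_mb_s / image_mb

# A 24 MP image as 8-bit RGB is 24e6 px * 3 bytes ~= 72 MB uncompressed,
# so a ~1000 MB/s 4-lane link moves roughly 14 such frames per second.
print(round(images_per_second(1000, 72), 1))
```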

Sent from my iPad

deejjjaaaa
Inspiring
May 3, 2015

> also there are plenty of glitches when GPU is enabled and you use multiple brushes, you can get horrendous artifacts, the GUI flickers, etc etc..

When disabled, too... For example, when I switch off the GPU in ACR 9, I can draw several rectangles with WB sampling that all stay visible at the same time... It's as if switching the GPU off somehow makes ACR 9 work a magnitude (or more) slower than ACR 8.

Participant
May 2, 2015

Personally

I would just be happy if I could view my photos in the Develop module like I used to be able to in LR5. All the commentary leaves me a bit underwhelmed. If it wasn't going to work better, or barely at all, I would have thought Adobe should have provided better product guidance, or a test to indicate whether LR6 was going to run properly on my computer.

Very disappointed

Participant
May 2, 2015

I am also very disappointed with Lightroom CC.

Its performance is very slow, and it is always showing "not responding."

Participant
May 1, 2015

With this information... now I'm even more disappointed that my import time has slowed down so significantly. An Adobe rep on these forums told me to disable the GPU... but now I know the GPU has NOTHING to do with importing. So WTF, Adobe?

deejjjaaaa
Inspiring
May 1, 2015

The WTF was not necessary though... Something is certainly not right with the first production version that uses the GPU. I bet they are working to fix it, and the developers were probably under pressure from management/marketing to deliver LR6 by a certain date.

Participating Frequently
May 1, 2015

I hope someone at Adobe is listening.

GPU Acceleration is not working properly. And it's not just an AMD problem. Nvidia cards fail the test also.

A couple of days ago I upgraded my card from an OEM GTX 660 to an Asus GTX 960 in the hope that it would solve the problem. No such luck. On top of that, the log shows that my old card is still installed, although I removed all drivers and began the installation from scratch. Even among the hidden devices there is no trace of the old card. Am I missing something here?

I use Win 8.1 pro

Participating Frequently
May 19, 2015

Here's what worked for my setup.

I use an nvidia 960 card.

First I experimented with doing clean installs of the drivers. That didn't work.

But I noticed that when I used the default windows driver (the non-nvidia kind) I got acceleration. Though none of the other advantages of my graphics card.

Then I went to this page: TweakGuides.com - Nvidia GeForce Tweak Guide and learned about Display Driver Uninstaller (DDU).

After applying this free program in safe mode and reinstalling the nvidia driver, acceleration was available.

So, for all you Nvidia users: try this before reinstalling Lightroom or your entire system. It took about 20 minutes in all.

This might also work for AMD users, since the program provides an AMD setting.

Todd Shaner
Legend
May 19, 2015

There is a 'Clean Install' checkbox in the Nvidia Driver Installer that removes the current drivers, Registry settings, and all Preferences settings. As per Nvidia at the above link, "If you have experienced install problems in the past, we offer "Perform clean install" which will remove all previous NVIDIA drivers and NVIDIA registry entries from your computer.  This may resolve installer issues caused by conflicting older drivers."

Select Custom:


Select 'Perform clean install':

Participant
May 1, 2015

Thank you Eric. I have a Mid-2012 MacBook Pro; LR 6 is far better, the Develop module is faster, and I have NOT noticed any loss in any other areas.

I also have a 2014 Mac Pro with dual AMD FirePro D700s; LR 6 runs super fast.

Looking forward to the next updates, I hope Adobe will make use of 12 Core machines.

willteeyang
Participating Frequently
May 1, 2015

I feel like it's slower to use GPU acceleration! Anyone else in the same boat?

Late 2013 Mac Pro

3.5Ghz 6-core Xeon E5

32GB 1867 DDR3 ECC

Dual AMD FirePro D700 6GB

Using GPU acceleration, it takes about 4-5 seconds to load a full-res raw from a Canon 5D3;

turning GPU acceleration off, I'm averaging 3-4 seconds per image.

(same speed at either 1:1 or Fit)

I thought GPU acceleration was supposed to be faster -- not in all cases?

Thanks!

Victoria Bampton LR Queen
Community Expert
Community Expert
May 1, 2015

will yang wrote:

Using GPU acceleration it takes about 4-5 seconds to load a full res raw from Canon 5d3,

Some things are slower, yes. Eric said above "it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another."

Victoria - The Lightroom Queen
willteeyang
Participating Frequently
May 1, 2015

Thanks.

Would you recommend keeping the GPU off in the Develop module to do color adjustments on raw images,

and switching the GPU on when exporting?

Participating Frequently
April 29, 2015

I'd just like to add that I've also been really disappointed with LR CC's speed in the Loupe, which made image culling almost impossible, UNTIL I set the preview size to something other than Auto. Even setting it to the smallest option, at medium quality, I see huge speed improvements across the board in Loupe, with little noticeable render time.

Participant
April 27, 2015

> Some controls, such as local brush adjustments and spot clone/heal, do not (have full GPU acceleration) -- at least, not yet.

For me (Windows 8 x64, i7-920, 8 GB RAM, Nvidia GTX 560 Ti 448, Dell U2713HM at 2560 x 1440), I am almost certain that the brush adjustment is impacted by the GPU: if acceleration is activated, the brush adjustment lags; if GPU acceleration is not enabled, wahoo, the brush adjustment does not lag and is as quick as in LR 5.7.1, or almost. So the local brush adjustment uses GPU acceleration, but is nowhere near optimized (for me).

dj_paige
Legend
April 27, 2015

> So, the local brush adjustment uses the GPU acceleration

Doesn't the evidence you just presented lead to the opposite conclusion?

Participant
April 27, 2015

Hi dj_paige.

As said in the first post:

"Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet."

So my conclusion is: contrary to what MadManChan2000 wrote, GPU acceleration affects the brush adjustment... but not in a good way.

Salvo

DClark064
Inspiring
April 27, 2015

Thanks for the information on GPU acceleration. You state that newer cards with 2 GB of memory are recommended. My card is 2 years old and has 2 GB, but I see no improvement in speed. It would be helpful if there were some more specific recommendations for the CUDA cores, bus speeds, and memory that would result in improved performance. I would upgrade if I knew what would make a difference.

Known Participant
April 27, 2015

I would wait until there is more optimization. I have a Quadro 6000, which is one of the most powerful cards, and performance on anything other than zooming in and out is faster with the GPU turned off. I think there is a lot still to be done.

DClark064
Inspiring
April 27, 2015

Thanks for the suggestion. Unless I see something more specific from Adobe, I will follow your advice. No sense spending a few hundred until there is something to be gained, particularly since I don't have a 4K or 5K display.