MadManChan2000
Adobe Employee
April 26, 2015
Question

GPU notes for Lightroom CC (2015)


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
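To make those numbers concrete, here's a quick back-of-the-envelope check (a Python sketch; the exact panel resolutions are common values I'm assuming for each display class):

    # Rough pixel counts for common display classes (typical resolutions,
    # assumed for illustration).
    displays = {
        'Standard HD (1920x1080)': (1920, 1080),
        'MacBook Pro Retina 15" (2880x1800)': (2880, 1800),
        '4K UHD (3840x2160)': (3840, 2160),
        '5K (5120x2880)': (5120, 2880),
    }

    hd_pixels = 1920 * 1080

    for name, (w, h) in displays.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.1f} MP "
              f"({pixels / hd_pixels:.1f}x the pixels of standard HD)")

Running this gives roughly 2.1 MP for HD, 5.2 MP for the Retina 15", 8.3 MP for 4K (about 4x HD), and 14.7 MP for 5K (about 7x HD).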

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) runs at about 5 frames per second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
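To get a feel for that transfer cost, here's a rough, hypothetical estimate. The image size, pixel format, and bus bandwidth below are assumptions for illustration, not measured Lightroom numbers:

    # Hypothetical time to upload one full-resolution image to the GPU.
    # Assumptions (illustrative only): a 36 MP image held as 4 channels
    # of 32-bit floats, sent over a bus with ~6 GB/s effective bandwidth.
    megapixels = 36
    bytes_per_pixel = 4 * 4                # 4 channels x 4 bytes (float32)
    image_bytes = megapixels * 1e6 * bytes_per_pixel

    bus_bandwidth = 6e9                    # bytes/second, assumed

    upload_ms = image_bytes / bus_bandwidth * 1000
    print(f"~{upload_ms:.0f} ms just to move the image onto the GPU")
    # ~96 ms under these assumptions -- several 60 FPS frames spent
    # before the GPU has done any actual image processing.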

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- as with most raw files -- sees little to no benefit from a GPU implementation.
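A toy illustration of why such decoding parallelizes poorly (this is generic delta decoding, not the actual encoding of any particular raw format): each output value depends on the one before it, so the work can't be spread across the thousands of threads a GPU wants to keep busy.

    # Toy delta decoder: every step needs the result of the previous
    # step, so the loop is inherently sequential -- a poor fit for a GPU.
    def delta_decode(deltas):
        values = []
        current = 0
        for d in deltas:          # step N depends on step N-1
            current += d
            values.append(current)
        return values

    print(delta_decode([100, 2, -5, 3]))   # [100, 102, 97, 100]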

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
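For a sense of the per-pixel math involved, here's a deliberately simplified stand-in (not Adobe's actual Highlights/Shadows algorithm) for the kind of adjustment that must run on every one of the millions of pixels on screen, every time a slider moves:

    # Toy highlights/shadows adjustment, applied per pixel. The real
    # pipeline is far more sophisticated; this only sketches the workload.
    def adjust_pixel(luma, highlights=-0.3, shadows=0.4):
        # luma in [0, 1]; negative highlights darken the brights,
        # positive shadows lift the darks.
        highlight_weight = luma * luma              # strongest near white
        shadow_weight = (1 - luma) * (1 - luma)     # strongest near black
        out = (luma
               + highlights * highlight_weight * luma
               + shadows * shadow_weight * (1 - luma))
        return min(max(out, 0.0), 1.0)

    print(adjust_pixel(0.9))   # bright pixel pulled down
    print(adjust_pixel(0.1))   # dark pixel lifted

On a 4K display that's roughly 8 million calls per redraw, repeated dozens of times per second while you drag a slider -- embarrassingly parallel work that suits a GPU, but only one fast enough to keep up with the whole pipeline.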

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is currently the only module with any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those should be the same regardless of whether you have the GPU enabled or disabled in the prefs).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, and the gradient and radial filters. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely see little benefit. Aim for at least 1 GB of GPU memory; 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer


84 replies

Bob_Peters
Inspiring
April 26, 2015

I posted the following information on the Luminous Landscape forum (GPU used in Develop but not Library?) in response to comments you made there.

I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.

OS X (10.9.5)

Hardware configuration:

   MacPro (late 2013)

   AMD FirePro D300 2048 MB

   Apple Cinema Display 1920 x 1200

   16 GB RAM

   1 TB SSD


Test file:  Nikon D800 NEF, 50 MB

(0)  open the Develop module

(1)  select a different NEF file and zoom to 1:1

(2)  clear the ACR cache

(3)  select the test file

(4)  take 3 screenshots to illustrate the 3 display states (the first one is hard to capture)

(5)  select another image

(6)  same 3 states are present

(7)  return to the test file and the same 3 display states are present

   Why isn’t the ACR cache coming into play in step 7?

If I repeat this process with the GPU disabled the image is displayed without the intermediate states.

I have attached the 3 screenshots mentioned in step (4).

TheDigitalDog
Inspiring
May 11, 2015

Bob_Peters wrote:

I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.

With GPU on, I see a slight color shift instead of the blurry image, then the color becomes properly rendered on-screen. With GPU off, I don't see this at all.

Author "Color Management for Photographers" & "Photoshop CC Color Management/pluralsight"
C-OW
Inspiring
April 26, 2015

There's one thing I don't understand: you write that the local brush doesn't use the GPU. But how come, when I switch on GPU support in the settings, the brush is so much slower than without it? OK, my video card is not a rocket, but I have trouble understanding this...

Oli

D Fosse
Community Expert
April 26, 2015

I see a reasonable explanation in Eric's post:

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.


While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

Bottom line - if OpenGL doesn't speed things up, just turn it off. If that's still slower than 5.7, there's a legitimate reason for concern. Otherwise you're no worse off.

The way I read this, the main immediate priority was to make 4K and 5K usable. I trust further optimization for the rest of us will come in future releases.

Known Participant
April 26, 2015

Tested it also with GPU support off, and it is still slower than LR 5.7.1. I posted comparison videos in my original post about the poor performance of LR 6.

Sent from the Moon

Participant
April 26, 2015

Dear Eric,

Thank you for your thorough explanation of what is a very obvious issue here. But simple common sense still makes me wonder:

After opening Lightroom and starting my everyday tasks on my poor 1920x1080 screen with my very poor GTX 680M GPU (I have two of them in SLI, but I guess the second one is left to rest in peace; not to mention that SLI mode already existed back in the LR4 days), why are the very same things SLOWER in LR6 compared with version 5?

Is this the price we have to pay to get better-processed pictures out of this version? I still can't see a visible difference, but maybe my eyes just aren't trained enough to spot one.

I haven't seen anybody complaining about missing some stellar speeds here; it's just about having (at least) the very same speed as the previous version.

Hopefully LR6 will get on the right track after some time.

Known Participant
April 26, 2015

Actually, nothing feels faster in LR CC. Check my post, which includes videos about it.

Lightroom CC brush hyper slow

We have 7 full CC seats, and I'm really not happy at all with this release. Even on my HP Z820 with 128 GB of RAM and a Quadro 6000, LR CC crawls compared to LR 5.7.1.

99jon
Genius
April 26, 2015

It is worth changing the preview size to the new Auto setting, if required.

From the LR menu, go to:

Edit >> Catalog Settings >> File Handling (tab) on Windows

Lightroom >> Catalog Settings >> File Handling (tab) on Mac

For Standard Preview Size choose Auto from the drop-down list.

This is the default setting if you are using LR6 for the first time. It's particularly useful for Retina and HiDPI displays, but it will optimize LR for any monitor. For those upgrading from LR5 or earlier versions, it is recommended to change the preview size to the Auto setting.