MadManChan2000
Adobe Employee
April 26, 2015
Question

GPU notes for Lightroom CC (2015)

  • 84 replies
  • 195458 views

Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
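To make those numbers concrete, here's a quick sketch of the arithmetic (assuming the common panel resolutions for each display class):

```python
# Megapixel counts for common display resolutions (assumed panel
# sizes: HD 1920x1080, Retina 15" 2880x1800, 4K UHD 3840x2160,
# 5K 5120x2880).
displays = {
    'HD': (1920, 1080),
    'Retina 15"': (2880, 1800),
    '4K UHD': (3840, 2160),
    '5K': (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name:11s} {w}x{h} = {w * h / 1e6:4.1f} MP")

# 4K vs HD: (3840 * 2160) / (1920 * 1080) = 4.0x the pixels to render.
```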

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) runs at about 5 frames per second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
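To get a feel for that overhead, here's a back-of-the-envelope sketch. The buffer format and bandwidth figure are illustrative assumptions, not Lightroom's actual internals:

```python
# Rough CPU -> GPU upload time for a full-resolution image.
# Assumptions (for illustration only): the image is uploaded as a
# 16-bit-per-channel RGBA buffer, and effective PCIe bandwidth is
# ~6 GB/s (roughly PCIe 2.0 x16 in practice).
width, height = 6000, 4000      # a 24 MP raw
bytes_per_pixel = 4 * 2         # RGBA, 16 bits per channel
pcie_bytes_per_sec = 6e9        # assumed effective bandwidth

buffer_bytes = width * height * bytes_per_pixel
upload_ms = buffer_bytes / pcie_bytes_per_sec * 1000
print(f"{buffer_bytes / 1e6:.0f} MB buffer -> ~{upload_ms:.0f} ms per upload")
# ~192 MB -> ~32 ms: about two whole frames at 60 FPS spent just
# moving the data, before any processing happens.
```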

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
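A toy example of why: with variable-length codes, you can't know where symbol i starts in the bitstream until you've decoded symbols 0 through i-1, so the work can't be spread across thousands of GPU threads. A minimal sketch:

```python
# Toy variable-length decoder: "0" -> A (1 bit), "10" -> B, "11" -> C
# (2 bits). Symbol i's start position depends on the lengths of all
# earlier symbols, so the stream must be walked serially -- the same
# dependency that makes entropy-coded raw data a poor fit for a GPU.
CODES = {"0": "A", "10": "B", "11": "C"}

def decode(bits):
    out, pos = [], 0
    while pos < len(bits):
        if bits[pos] == "0":            # 1-bit code
            out.append(CODES["0"])
            pos += 1
        else:                           # one of the 2-bit codes
            out.append(CODES[bits[pos:pos + 2]])
            pos += 2
    return out

print(decode("0100110"))  # ['A', 'B', 'A', 'C', 'A']
```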

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether the GPU is enabled or disabled in the prefs).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, and the graduated and radial filters. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

This topic has been closed for replies.

84 replies

Participant
August 23, 2016

Hello, I'm running into serious speed issues with LR CC. Could you clarify the current situation (August 2016): what is accelerated by the GPU and what is not?

Regards.

Sean H [Seattle branch]
Known Participant
August 5, 2016

Is this going to be re-coded in OpenCL, a compute API, instead of OpenGL, which is oriented toward gaming?

jwArtWorks
Known Participant
June 22, 2016

I use Photoshop CC and Lightroom CC on my computer, and Photoshop is quite a bit faster in overall performance. In my PS config, my scratch drives are both SSDs, which I am guessing optimizes PS performance.

I have been wondering for a long time why LR does not use scratch drives. I know it uses proxies instead of original images, but it seems to me that even with proxies, they could use scratch drives to read ahead and enhance the caching.
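What I have in mind is something like this rough sketch of read-ahead caching (the names are made up for illustration; this is not how Lightroom actually works): while one photo is on screen, a background worker warms the cache with the next few.

```python
# Sketch of read-ahead caching (hypothetical names; not Lightroom's
# internals). While one photo is being edited, a background worker
# loads and decodes the next few into a cache.
from concurrent.futures import ThreadPoolExecutor

def load_and_decode(path):
    """Stand-in for reading a raw file and decoding it to a proxy."""
    with open(path, "rb") as f:
        return f.read()

class ReadAheadCache:
    def __init__(self, lookahead=2):
        self.futures = {}                   # path -> Future
        self.pool = ThreadPoolExecutor(max_workers=2)
        self.lookahead = lookahead

    def _ensure(self, path):
        if path not in self.futures:
            self.futures[path] = self.pool.submit(load_and_decode, path)
        return self.futures[path]

    def get(self, paths, i):
        current = self._ensure(paths[i])    # queue the current photo first
        for j in range(i + 1, min(i + 1 + self.lookahead, len(paths))):
            self._ensure(paths[j])          # then read ahead
        return current.result()             # block only on the current one
```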

I optimize my catalogs every time I use them (the box is checked), and I back up after every editing session. Recently, I had to create a new catalog and import from a previous catalog to fix a huge speed issue. This worked, and it made me wonder: what is the purpose of optimizing? Why doesn't the program also run a serious integrity test and repair to avoid these issues?

Now there is this memory leak that really slows things down the longer you use Lightroom. I know I can roll back to the prior version, but I really do not have time to keep playing around with the software - I just need to do my work.

Yesterday (or the day before) I received an email from Adobe telling me about all these new additions to the Creative Cloud. I am sure all of us would rather they focus on fixing the issues that exist in the current products before working on adding new ones. How can I spend time exploring the new offerings when I am constantly delayed by the software I am currently using?

As to Victor's comment about graphics cards ("With GPUs having 4 to 8 GB of RAM today, they should work with the real files"): my card has 12 GB of RAM, and the speed with and without the GPU is about the same. Slightly faster with the GPU, but many things are slower, making the overall effect negligible.

In conclusion, I love Lightroom and I love Photoshop and many other Adobe products and services. I just wish they would take the comments from long-time users (I started with Version 1 beta) seriously and blame us a little less.

Known Participant
June 22, 2016

As to Victor's comment about graphics cards ("With GPUs having 4 to 8 GB of RAM today, they should work with the real files"): my card has 12 GB of RAM, and the speed with and without the GPU is about the same. Slightly faster with the GPU, but many things are slower, making the overall effect negligible.

That is because they are NOT using the GPU efficiently. I use other image-processing programs that do use the GPU efficiently. One of them is Lustre by Autodesk, and it does everything LR does in real time at 30 frames per second in 4K, with several layers of secondary color correction. So don't tell me it can't be done. Either the LR team does not know how to do it, or it does not have the resources to do it.

Sent from a Rectangle thing with pretty small keyboard for giant hands. So excuse the typos.

micheleb51582477
Participant
June 22, 2016

Hello everybody,

I have the same problem with my LR software.

First I thought something was wrong with my computer; then I googled a bit and found this post.

Good news: I'm not the only one.

Bad news: the software LR has issues.

It is not a hardware problem (some people here have hardware well above LR's "standard requirements").

The problem is the software.

It can happen that some LR developers made mistakes (humans normally do)... Just fix them and users will be happy.

And please stop saying that the problem is connected to users' hardware.

I am speaking from 16 years of experience working with computers (from developer to sysadmin).

Known Participant
June 22, 2016

It seems to be a standard response from Adobe to not take responsibility and to blame the users. I have 25 years of experience working with computers. I have used Photoshop for as long as I can recall, and even though they are doing awesome things with Photoshop, the work on Lightroom leaves a lot to be desired. Performance keeps suffering more and more with each version that comes out.

Sent from a Rectangle thing with pretty small keyboard for giant hands. So excuse the typos.

jwArtWorks
Known Participant
June 21, 2016

I am betting you thought you were done listening to my complaining but unfortunately for you and me, I am back again.

As per all the suggestions, I created a new catalog, deleted my preferences and the startup preferences (just in case) and both of these did speed up operations when I first tested them. Now, I have discovered another issue.

If I open Lightroom and edit 1-30 images and then exit Lightroom the speed is not too bad. In fact, it is fine! However, when I spend a few hours editing images from a big photo shoot, the speed slowly deteriorates until Lightroom becomes so slow that I would be better off in Photoshop or sending film to the lab.

I am going to upload two videos to show the difference.

This video is after working for 30+ minutes editing photos:

The second video shows the speed after exiting Lightroom, making a backup (with both checkboxes checked), and restarting Lightroom:

Todd Shaner
Legend
June 22, 2016

If you updated to LR 6.6 or CC 2015.6, there is a memory leak issue that has been identified: Re: Lightroom 2015.6 Painfully slow since update

You can roll back to 6.5.1 or CC 2015.5.1 as outlined here: Re: Lightroom 2015.6 Painfully slow since update

Known Participant
June 22, 2016

So Adobe is definitely not testing their product before release. For me, the speed issues were visible within 10 seconds of updating.

Sent from a Rectangle thing with pretty small keyboard for giant hands. So excuse the typos.

jwArtWorks
Known Participant
May 9, 2016

I have had the GPU turned off on my HP Z210 workstation, with the graphics shown below. I decided to upgrade to take advantage of the GPU in Lightroom CC and Photoshop CC, so I bought a new HP workstation, with details shown below as well.

After testing, I see no improvement in Lightroom CC, and Photoshop CC is worse with it on than off. Considering the unit I purchased, I expected blazing speed (which I see in all other applications), and it simply is not happening. Everything else I do is super fast and considerably faster than on my old workstation. It may be important to note that both LR and PS load much, much faster on the new unit.

Now what???

OLD: HP Z210 Workstation - Win 10 64-bit - 16 GB RAM, 3.4 GHz Core i7 (4 cores), 500 GB 7200 RPM HDD system drive, separate 1 TB 7200 RPM scratch drive for PS, USB 3.0 Drobo for photos.

Graphics Processor Info:

Check OpenGL support: Failed

Vendor: ATI Technologies Inc.

Version: 3.3.12002 Core Profile Context 9.12.0.0

Renderer: AMD Radeon HD 5500 Series

LanguageVersion: 4.20

GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS (actual = 32, minimum = 48)

NEW: HP Z640 Workstation - Win 10 Pro 64-bit, 64 GB RAM, 3.7 GHz Xeon v3 (8 cores), 1 TB Turbo SSD system drive, separate 1 TB 7200 RPM scratch drive for PS, USB 3.0 Drobo for photos.

Graphics Processor Info:

Quadro M4000/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 358.87

Renderer: Quadro M4000/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

Todd Shaner
Legend
May 9, 2016

You should be aware that GPU Support is only provided in the Develop module: GPU notes for Lightroom CC (2015)

jwArtWorks wrote:

After testing, I see no improvement in Lightroom CC and Photoshop CC is worse with it on than off. Considering the unit I purchased, I expected blazing speed (which I see in all other applications) and it simply is not happening.

So just exactly what does "blazing speed" mean, and when doing what in LR is it "simply not happening"?

Your Xeon processor is ~2x the performance of my i7-860, and the Quadro M4000 is about 10x the performance of my Quadro 600 GPU. With the GPU enabled the Basic panel sliders are lag free, and with it disabled these sliders have slight lag. The Spot Removal tool is just the opposite: with the GPU enabled there is lag when trying to move the spot tool, and with the GPU disabled it is lag free; the Noise Reduction panel sliders behave much the same way. On my system, using a 2560x1440 monitor and 21 MP raw files, I get better "overall performance" with the GPU disabled. I would expect your system to have snappy Develop slider and Spot Removal tool operation with the GPU either enabled or disabled.

What specific monitors are you using, and with what camera files? Are the monitors calibrated, and if so, are you creating ICC Version 2 profiles and not Version 4?

jwArtWorks
Known Participant
June 4, 2016

Hi,

Sorry it has been 4 weeks since I responded, but I had a considerable amount of trouble with the HP Z640, forcing me to finally return it and get something different. The detailed specs of my new unit (shown from Lightroom) are below, but this is the short version:

Core i7 6-core 3.3 GHz overclocked to 3.88 GHz (supercooled), 64 GB 2400 MHz RAM, 1.2 TB Intel NVMe PCIe SSD main drive (software and catalogs here), 1 TB second internal drive (Photoshop scratch here), GeForce GTX TITAN X/PCIe/SSE2 with 12 GB onboard graphics memory.

Using the speed tests available at http://cpu.userbenchmark.com/, my PC shows up as a nuclear submarine in their speed vernacular, with these details:

UserBenchmarks: Desk 110%, Game 112%, Work 97%

CPU: Intel Core i7-5820K - 94%

GPU: Nvidia GTX Titan X - 119.4%

SSD: Intel 750 Series NVMe PCIe 1.2TB - 186.8%

SSD: Intel SSDSC2KW010X6 1TB - 97.5%

USB: Drobo 5D 17.5TB - 77.2%

USB: USB 3.0 Drobo S - Gen 2 9TB - 47.9%

RAM: Unknown 8x8GB - 114%

MBD: Asus X99-A/USB 3.1

It may be interesting to note that everything I do is extremely fast, including Lightroom, but Lightroom is not noticeably faster in the Develop module than before. These improvements are obviously related to the faster speed of the computer and have nothing to do with the GPU. The entire reason for getting the new computer was to take advantage of the new GPU enhancements in Develop, since my old Z210 workstation was limited: it only had PCIe v2, and all the new and more powerful graphics cards are v3. The reason I switched this newest workstation from the M4000 to the GeForce Titan is research showing that the Titan was indeed faster in almost every test; since I was trying to maximize speed, I switched graphics cards.

For monitors, I have an LG 27EA63 IPS LED display and an LG 27M67HQ IPS LED display, both connected via HDMI, and yes, both monitors are calibrated using a Spyder5 Elite.

Yes, of course I am aware that the GPU is only enabled in Develop, and only for certain functions, and they seem to work okay except for zooming, which I would think should be instant; instead there is a noticeable delay after zooming until the display clears to full resolution. If I were using 4K displays, I am guessing this would be much worse. Anyway, I await the next version of Lightroom, hoping they will continue to improve the GPU code, especially for the rest of the editing tools.

Specs shown from Lightroom:

Lightroom version: CC 2015.5.1 [ 1073342 ]

License: Creative Cloud

Operating system: Windows 10

Version: 10.0

Application architecture: x64

System architecture: x64

Logical processor count: 12

Processor speed: 3.3 GHz

Built-in memory: 65447.0 MB

Real memory available to Lightroom: 65447.0 MB

Real memory used by Lightroom: 535.1 MB (0.8%)

Virtual memory used by Lightroom: 477.2 MB

Memory cache size: 239.8 MB

Maximum thread count used by Camera Raw: 12

Camera Raw SIMD optimization: SSE2,AVX,AVX2

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Displays: 1) 1920x1080, 2) 1920x1080

Input types: Multitouch: No, Integrated touch: No, Integrated pen: Yes, External touch: No, External pen: Yes, Keyboard: No

Graphics Processor Info:

GeForce GTX TITAN X/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 368.22

Renderer: GeForce GTX TITAN X/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

Application folder: C:\Program Files\Adobe\Adobe Lightroom

Library Path: P:\Lightroom\Drive G - Assorted\Drive G - Assorted-2-2-2.lrcat

Settings Folder: C:\Users\joelw\AppData\Roaming\Adobe\Lightroom

Installed Plugins:

1) Canon Tether Plugin

2) ColorChecker Passport

3) Facebook

4) Flickr

5) Leica Tether Plugin

6) Nikon Tether Plugin

7) ON1 Effects 10

8) ON1 Enhance 10

9) ON1 Photo 10

10) ON1 Portrait 10

11) ON1 Resize 10

Config.lua flags: None

Adapter #1: Vendor : 10de

  Device : 17c2

  Subsystem : 29903842

  Revision : a1

  Video Memory : 12175

Adapter #2: Vendor : 1414

  Device : 8c

  Subsystem : 0

  Revision : 0

  Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: Speakers (2- Realtek High Definition Audio)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 44100

Build: LR5x102

Direct2DEnabled: false

GPUDevice: not available

OGLEnabled: true

rickm1970
Known Participant
April 6, 2016

I am curious whether anything has been done on the GPU side since last May, when all these comments were made. I have been thinking about getting an Nvidia GTX 960 with 4 GB of VRAM, but I am not sure if it is worth it. I have an old Nvidia card that can be enabled, but it just slows everything down now that I have updated everything else on my PC. With my old processor, it was faster with the GPU enabled. I have looked through the release notes for the last few CC releases and have not seen anything indicating work on the GPU issues that many are having. I love Lightroom and shoot sports, so it is not uncommon for me to sift through 1000 images at a time.

Todd Shaner
Legend
April 6, 2016

Improvements have been added in each LR update, to the point that my modest Nvidia Quadro 600 GPU actually speeds up processing in the Develop module on my i7-860 Windows 7 system. It slows down preview rendering in Develop when switching images, but everything else is faster, so it is still beneficial.

rickm1970 wrote:

I love Lightroom and shoot sports, so it is not uncommon for me to sift through 1000 images at a time.

When reviewing images you should be using the Library module and NOT the Develop module. Please see this post for suggestions:

Re: How to speed up 1:1 ?

rickm1970
Known Participant
April 6, 2016

Thanks for getting back to me. If I am looking at spending $200-250 on a card and do not plan on doing any gaming, am I better off getting a GTX 960 with 4 GB of GDDR5 or a Quadro K620 with 2 GB of DDR3? I only use Lightroom and Photoshop, at 1080i, though I do plan to update my monitors to 4K soon. I may look at the K1200 as well, but that is slightly out of my price range. The rest of my system is newer: i7, solid-state OS drive, and 16 GB of RAM.

Participant
March 26, 2016

Hi,

I seem to be lucky with my config: things work quite well as long as I don't use big brushes for hours (then my RAM seems to fill up, and a relaunch makes things better).

It is an i3-3570 (overclocked from 3.4 to 4.1 GHz) with an old GTX 285 (it could just as well be an even older HDxxxx graphics card, since I disable the GPU).

I have LR 6.4, 8 GB of DDR3, dual screens (24" and 19"), a catalog containing 120,000 images :-) and I work with Nikon D600 24 MP raw files.

What maybe makes things work:

--> the standard preview size for the catalog is 1680 px, which is enough to get a good view of a picture

--> I build standard previews at import, and if I need a close look at sharpness for a group of pictures, I build 1:1 previews and go back when done (it takes 8 sec per image); then the work goes fast.

--> dealing with disks:

- LR and the Windows OS are on the same SSD (between 100 and 400 MB/s sequential)

- pictures on a RAID 5 array (100/60 MB/s sequential read/write)

- catalog and previews on another dedicated SSD

Even though I can work with this config, I think it is too slow to render a raw file, or when I click "increase exposure" (I click because there is no shortcut for that, unfortunately).

I want to build a new config, but before that I'm looking for a performance-monitoring tool to see what matters most for LR (RAM bandwidth, CPU, disk speed, graphics bandwidth).

I tried Windows Performance Monitor and Resource Monitor, but I haven't managed to use these tools properly.

If you know a user-friendly tool that shows the weak area of a config when running tasks (Lightroom here), I'd be glad to hear from you.

ciao

Participant
March 3, 2016

MadManChan2000 and Team,

I used to love LR and ACR, but like many others have said, right now it is "UNUSABLE" for adjustment and spot brushes.

I even have the fastest and most upgraded late 2015 iMac 5K, with the maximum amount of GPU memory and RAM that money could buy, but it's still slow as heck when applying adjustment brushes and spot healing.

I know you said this was only the "beginning". But that beginning was almost a year ago now. What is really beginning is me (and perhaps many others) losing patience and wondering what Adobe actually prioritizes: releasing odd features like the new import dialog, or actually solving issues that affect users on EVERY image?

Come on, at least give us an ETA. I just don't know how long I can deal with this before canceling and switching to another platform.

Sean H [Seattle branch]
Known Participant
November 26, 2015

I have a new AMD 390X GPU with 8 GB of memory. Can Adobe shove a ton of raw files into VRAM for lightning-fast shot-to-shot load times in the Develop module? (Kind of like pre-rendering 1:1 previews for the Library so it's all cached.)
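Conceptually I'm picturing a most-recently-used cache in video memory, something like this sketch (made-up names; obviously not Lightroom's actual API):

```python
# Sketch of a most-recently-used VRAM image cache (hypothetical
# names). Keep the most recently viewed images resident so
# shot-to-shot switching skips the CPU -> GPU re-upload.
from collections import OrderedDict

class VramImageCache:
    def __init__(self, budget_bytes=8 * 1024**3):   # e.g. an 8 GB card
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()                # path -> (texture, size)

    def get(self, path, upload_fn, size_bytes):
        if path in self.entries:
            self.entries.move_to_end(path)          # mark as most recent
            return self.entries[path][0]
        while self.used + size_bytes > self.budget and self.entries:
            _, (_, old) = self.entries.popitem(last=False)  # evict oldest
            self.used -= old
        texture = upload_fn(path)                   # hypothetical GPU upload
        self.entries[path] = (texture, size_bytes)
        self.used += size_bytes
        return texture
```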
