MadManChan2000
Adobe Employee
April 26, 2015

GPU notes for Lightroom CC (2015)


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
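
To make the pixel arithmetic concrete, here is a quick back-of-the-envelope sketch in Python (the resolutions are common panel values used purely for illustration):

```python
# Pixel counts for common display resolutions, in megapixels (MP).
displays = {
    "Full HD (1920x1080)": (1920, 1080),
    "MacBook Pro Retina 15-inch (2880x1800)": (2880, 1800),
    "4K UHD (3840x2160)": (3840, 2160),
    "5K (5120x2880)": (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")

# 4K UHD is about 8.3 MP versus 2.1 MP for Full HD, so Develop must redraw
# roughly 4x as many screen pixels for every slider movement.
```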

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) runs at about 5 frames per second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
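
As a rough illustration of that transfer overhead (Lightroom's internal pipeline is not public; this sketch uses the CuPy library purely as a stand-in for any CPU-to-GPU copy):

```python
import time
import numpy as np
import cupy as cp  # stand-in GPU library for this sketch; not what Lr uses

# A ~36 MP image as float32 RGB: 7360 * 4912 * 3 * 4 bytes, roughly 434 MB.
img = np.zeros((4912, 7360, 3), dtype=np.float32)

start = time.perf_counter()
gpu_img = cp.asarray(img)        # copy from host (CPU) memory to the GPU
cp.cuda.Device().synchronize()   # wait for the copy to actually complete
print(f"Host-to-device upload took {time.perf_counter() - start:.3f} s")
```

Moving hundreds of megabytes across the PCIe bus like this is exactly the kind of cost that shows up when loading a full-resolution image or switching between images.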

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
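
Here is a toy example of why such decoding resists GPU parallelism (a made-up three-symbol prefix code, not any real raw format): each symbol's starting position is only known once the previous symbol has been decoded, so the work cannot be split across thousands of GPU threads.

```python
# Toy prefix-code decoder: the loop is inherently sequential, because a
# codeword boundary is only discovered by decoding everything before it.
def decode(bits, table):
    out, code = [], ""
    for b in bits:
        code += b
        if code in table:       # a complete codeword ends here...
            out.append(table[code])
            code = ""           # ...and only now do we know where the next starts
    return out

table = {"0": "A", "10": "B", "11": "C"}
print(decode("010011", table))  # -> ['A', 'B', 'A', 'C']
```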

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those functions should be the same regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, gradients, and the radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better (see the rough memory sketch after this list).

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
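
For a back-of-the-envelope sense of the memory numbers in point 5 (an illustration only; Lightroom's actual internal buffer formats are not documented), consider one full-resolution working buffer, assuming float32 RGBA:

```python
# Rough VRAM math for a single 36 MP image, assuming a float32 RGBA working
# buffer (an assumption for illustration; Lr's real formats are not public).
width, height = 7360, 4912            # ~36 MP sensor resolution
bytes_per_pixel = 4 * 4               # 4 channels x 4 bytes each

buffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"One full-res buffer:  {buffer_mb:.0f} MB")      # ~552 MB
print(f"Source + destination: {2 * buffer_mb:.0f} MB")  # ~1103 MB

# Add the OS and display's own VRAM use, and 1 GB is already a tight fit,
# while 2 GB leaves comfortable headroom.
```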

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

This topic has been closed for replies.

84 replies

Participating Frequently
November 13, 2018

Hi everyone,
I have a completely new PC for Lightroom now, with an i7-9700K. I want to ask: do I really need a graphics card, or can I run LR using the GPU integrated into the processor? I would like the best performance, but I am not sure which graphics card to choose, and I do not want to spend much money on a GPU. Can you recommend the best budget one, please? Thank you!

F. McLion
Community Expert
November 13, 2018

Depending on the monitor resolution you use, there is not much gain with GPU acceleration turned on, especially not with a beefy CPU like yours.

Can you share some more details on your monitor and the files you are working with?

--- Got your issue resolved? Please label the response as 'Correct Answer' to help your fellow community members find a solution to similar problems. ---
Participating Frequently
November 13, 2018

I work with RAW files from a D810, so usually 37 MP; sometimes I make panoramas, and then the size is much bigger, around 200–400 MP. My monitor resolution is 2560x1440 (Eizo CS270). I read that LR still does not support 10-bit monitors, so I probably do not need an NVIDIA Quadro, is that right?

Known Participant
September 21, 2018

What is the state of Lightroom Classic 7.5 and GPU acceleration now? I'm considering buying a new iMac and trying to maximize Lightroom's performance. I intend to get it with an SSD and 24 GB of RAM, but there are different levels of iMac with increasing CPU speed and GPU performance. I don't know if it's worth getting the better iMac for slightly better performance; I think the best investment is getting the largest SSD and the slowest CPU/GPU model. Any comments or opinions?

Sean H [Seattle branch]
Known Participant
September 21, 2018

I currently have a 6-core 5 GHz 8700K CPU with 64 GB of 3 GHz DDR4 and a 1070 Ti, along with a 4K 27" IPS display. I also have a WD Black NVMe 1 TB stick which is insanely fast... like 3000 MB/s fast.

There's a workflow that makes it all smooth, kind of regardless of your setup. First, you pre-render all photos at 1:1 before you sort. When you have filtered out the best, make them into Smart Previews and make sure you checkmark "use Smart Previews for development" or whatever it's called. In doing so, you are telling LR to load small DNG "proxy" versions for you to edit. The CPU really only matters now on export. The 1070 Ti does speed up some Develop stuff like sliders and gradients, etc. But it also causes weird issues like mismatched DNG Smart Previews versus the thumbnails (still an issue). I also get blackouts and whiteouts where the photo should be, and I have to step away from the photo to the next and then back to actually see the render. If I disable the GPU, none of these issues happen. RAM is really only needed for large images such as panos.

My advice: I'd go for 1) the largest screen, largest resolution; 2) store your photos on a large spinning external drive and keep your internal just for the OS, apps, and catalog; 3) back up that external to a second external as well as offsite; 4) if you process a ton of shots, a faster CPU might save you a second per shot, but scaled out that could be hours per year saved; 5) the GPU doesn't really seem to be a huge factor. I had an old 280X GPU before and this 1070 Ti doesn't seem much faster. Others have reported the same. Until they code the GPU better, save your money for that new lens.

Participant
December 3, 2017

Using Lr Classic CC - images are only visible in the Develop module. In Library mode, Grid view, the grid is visible, as is the information (time, date, exposure info, stars, colour mark, etc.), but no image. In Develop it is perfect, but in all other views no image is visible, just the background colour of whatever colour I had marked the image with. This suddenly happened yesterday and I can't get the images back. I recently upgraded from Lightroom 5.7, and looking back into that, the same thing occurs there.

I have the option to use the GPU checked (NVIDIA GeForce 650 Ti). If I uncheck it, the image is no longer visible in the Develop module. If I check it again, the image reappears in the Develop module but not in the others.

Any ideas?

Inspiring
December 3, 2017

Have you updated your graphics driver?

Bob Frost

Participant
December 3, 2017

Yes – one of the first things I checked. It says I have the most up-to-date version.

Participant
May 19, 2017

I have trouble with the GPU now and then. Lr works fine for a long time, then doesn't for some reason. My main finding is conflicts with other programs. This usually happens when Microsoft does a huge update or brings out a new version of Windows. It has just happened again after updating to the latest Win 10 update, which is absolutely huge. After it, my network adapter driver did not work; I had to uninstall and reinstall that, and then everything was fine. Then I found Office would not work, and I found that a program, ABBYread, was causing conflicts. And now I see GPU acceleration is making the Develop panel not work, but switching off graphics acceleration cures it. This is a graphics driver conflict, and AMD and Intel have issues in some of their drivers. I use an AMD card and it's great a lot of the time, but the 17.3 driver does not work well at all; the previous 17.2 driver was fine until now, so I will try other driver releases until I get it sorted out, which I will. But this is the problem: conflicts. So I say to all here that it's not your machine; just look at the programs installed, as not all work well together. They should, but they don't.

Participating Frequently
May 18, 2017

Thanks Eric,

Just reading your article in May of 2017. You say that writing code to let Lightroom work with high-res displays takes time... just how long are we talking?

Here in May 2017, Lightroom (version 2015.10) does NOT enhance workflow with 4K screens, multiple GeForce 1070s, 32 GB of RAM, and 30 and 50 MB raw files. Your program has slipped below the waves and is sinking fast. The performance is so bad that I am avoiding using Lightroom for anything I don't have to. And I don't have to use it. I've tried GPU on and off, large and small previews, and nothing seems to make the program usable, say, as it was in the early versions with 1080p. So does it take a long time, or are you just not going to bother making it work with 4K and 5K systems?

Photoshop on this same computer is snappy and handles the larger files on the 4K system just fine, or well enough. Rendering 4K video takes a little time, but editing in PP and AE is doing OK on this machine. So you see, for someone who doesn't invent and produce editing software, I fail to see the reason for the long delay in keeping Lightroom viable.

It might be my old processor. It's an i7 2600K, 4 cores at 3.4 GHz?

So even if you don't have a plan to fix Lightroom to work with modern systems, I have a plan to spend about 3.5k (US) on my machine to make your program viable again... or at least usable. Do you think an i7 6950X (10 cores) and 126 GB of DDR 3000+ RAM with the above video system will do anything with your old code inside Lightroom?

These are strong words, yes, I know, but I am going to spend a lot so that I can use Lightroom to help me process the 1000+ 30 MB and 50 MB files that I take every month (with new Icat files). Please let me know if my plan will work, or should I wait for your plan? Maybe you are dropping Lightroom for a new product? Something that will work with modern video, as it is not going away.

F. McLion
Community Expert
May 18, 2017

ThaiBear wrote:

...  Do you think an i7 6950x (10 cores) ...

I hate to tell you that Lr currently also seems to have issues with more than 4 cores for some users.

Also, it won't use that vast amount of RAM.

Let's hope that all this is improved with the next LR version.

--- Got your issue resolved? Please label the response as 'Correct Answer' to help your fellow community members find a solution to similar problems. ---
Participant
May 2, 2017

ADOBE Photoshop Lightroom is not responding... Lightroom 6.10. With the Intel HD iGPU everything is working. Can anyone recommend drivers for an AMD RX 580? I have tried several drivers, but no luck. At the moment I am using 17.4.4.

Participating Frequently
December 12, 2016

MadManChan2000

Has GPU support for local brush adjustments and clone/heal been abandoned? It's been almost 2 years since this post and there haven't been any updates.

Using a local adjustment brush or the cloning/healing brush on my iMac 5K is painfully slow.

Inspiring
November 13, 2016

Dear Madman@Adobe:

20+ years ago my µVAX had an array coprocessor on the Q-bus for geophysical waveform processing. The software always crashed, because offloading data to the MAP4000 was such an error-prone thing to do. Once DEC came along with the DECstations (Cougar), we gladly junked all the array-processor stuff, as the new stations were twice as fast as the old ones with the AP.

Then came the x86/x87 thing. Up to the Intel 386 you had to use a numerical coprocessor if you needed certain functions for speed; I think you could not compile F77 without it. Anyhow, with the 486 the coprocessor was integrated.

And now we have the GPUs. I know of some science projects being delayed because they wanted to use the GPU but never got it to work. It will go its natural way, like the others. It shows that something new needs to happen on the CPU scene. I hope, for Intel's sake, that they do not follow DEC in that.

Participant
April 26, 2017

Kabylake Core i7-7700HQ+HM175

Memory: DDR4 8 GB x 2

Storage: 256GB NVMe SSD (+1 TB (SATA) 7200rpm)

GeForce GTX 1060 6GB

Everything is fine UNTIL I use an adjustment brush. After I am done painting, whatever slider I change in the adjustment brush tab, it takes ages for the first 1-2 changes, and then I get an "LR is not responding" message; 10 seconds later it has applied the change, and this recurs for every modification from there on. Even after restarting LR, loading an image that has an adjustment brush modification, or even editing it further, is just frustrating.

I have increased the cache, updated all the drivers, deleted the preferences file, restarted/uninstalled/restarted/reinstalled...

Known Participant
September 19, 2016

After a lot of time spent testing, I can say:

Problem: the brush and clone tools slow down after some minutes of work.

With GPU on, this problem gets drastically worse!

But this is not a GPU on/off problem; it is a general problem of Camera Raw. However, with GPU on, the problem gets drastically worse on my best machine, but not on my weak machine. This is very strange!

I think LR and Camera Raw have problems with certain high-end architectures.

Please see:

Problems with the performance of Lightroom in the Develop module

Vovchan73
Known Participant
November 24, 2016

I have a similar situation with GPU acceleration in Lightroom 2015.7. If I disable the checkbox to use the GPU, the slowdowns disappear and it becomes comfortable to work again. Strange how it turns out! The developers blundered again. Let's wait for the new version; maybe everything will fall into place.

My configuration: Intel Core i5-6600K, MSI Z170A TOMAHAWK, MSI GeForce GTX 1060 GAMING X 6G, Kingston DDR4 16 GB 3000 MHz HyperX Savage Black, Win7-64

Known Participant
September 17, 2016

Hello, I have big problems with the performance of Lightroom in the Develop module.

Situation on my workstation:

Hardware

  • Microsoft Windows 10 Pro, 64-bit
  • CELSIUS M720 POWER
  • Intel(R) Xeon(R) CPU E5-1650 0 @ 3.20GHz, 3201 MHz, 6 cores, 12 logical processors
  • RAM: 32.0 GB
  • Graphics card: NVIDIA Quadro K5000 (driver 369.09)

Sysinfo from Lightroom:

  • Lightroom 2015.6.1
  • 200 RAW photos from a Nikon D5100, small catalog
  • Catalog on the first drive (SSD)
  • Photos on the second drive
  • Lightroom version: CC 2015.6.1 [ 1083169 ]
  • License: Creative Cloud
  • Operating system: Windows 10
  • Version: 10.0
  • Application architecture: x64
  • System architecture: x64
  • Number of logical processors: 12
  • Processor speed: 3.1 GHz
  • Built-in memory: 32684.2 MB
  • Physical memory available to Lightroom: 32684.2 MB
  • Physical memory used by Lightroom: 3050.7 MB (9.3%)
  • Virtual memory used by Lightroom: 3641.4 MB
  • Cache memory size: 1779.6 MB
  • Maximum number of threads used by Camera Raw: 12
  • Camera Raw SIMD optimization: SSE2, AVX
  • System DPI setting: 96 DPI
  • Desktop composition enabled: Yes
  • Monitors/displays: 1) 2560x1440, 2) 1920x1200
  • Input types: Multitouch: No, integrated touch: No, integrated pen: No, external touch: No, external pen: No, keyboard: No

  • Graphics processor information:
  • Quadro K5000/PCIe/SSE2

  • Check OpenGL support: Passed
  • Vendor: NVIDIA Corporation
  • Version: 3.3.0 NVIDIA 369.09
  • Renderer: Quadro K5000/PCIe/SSE2
  • LanguageVersion: 3.30 NVIDIA via Cg compiler

After starting Lightroom (GPU on):

When I work in the Develop module with the adjustment brush, the brush is fast as normal right after Lightroom starts. But after intense use of the brush, after about 3 photos, the brush, and Lightroom in general, becomes extremely slow. Even if I wait a bit and let the workstation rest, the speed does not come back. I must restart Lightroom, and then this cycle begins anew.

This problem arises not only with the brush; it also comes up with other operations in the Develop module, and also when I merely step through my photos. However, the problem is quickest to reproduce with the brush.

Very strange:

My workstation is never busy. The CPU has peaks of at most 45%, the GPU a peak of at most 8% (I can see it with an NVIDIA utilization tool), RAM at most 50%, and none of the drives are under stress. But Lightroom is terribly slow. When I draw a line with the brush, I must wait 1 to 3 seconds to see the changes.

With GPU off:

The same thing, but the problem comes later.

What I have already tested:

Different cache settings in Lightroom: no effect

Updating the graphics driver and chipset to the latest version: no effect

Clean install of Lightroom 2015.6.1: no effect

Test with the same photos and a new catalog on a laptop (Intel Core i7-5500 CPU @ 2.4 GHz, 8 GB RAM, Win10) with Lightroom 2015.6.1: the CPU is overtaxed, but if I let the laptop rest, I can continue working at normal speed again (without restarting Lightroom). On the laptop the problem does not exist.

Test with the same photos and a new catalog on an old PC (Intel Quad Core Q6600 2.4 GHz with 8 GB RAM, Win10) with Lightroom 4.4: the CPU occasionally reaches 100%, but no problem, the brush is always super fast!

I do not know what I can do now.

Todd Shaner
Legend
September 17, 2016

Frage12 wrote:

  • Monitors/displays: 1) 2560x1440, 2) 1920x1200

Are you running LR in dual-display mode with 'Second Window' enabled? If so, try running LR on a single display with no 'Second Window.'

Known Participant
September 17, 2016

Thank you for your answer.

Yes, I use two displays. However, in Lightroom I use only the main display, 2560x1440. I mean, in Lightroom I always have the icon for the second window turned off.

However, I have now also tested with the second display unplugged and the PC restarted. The problem remains.

Also, when I work via Remote Desktop from a laptop to this workstation, with both displays powered off, the problem remains.