GPU notes for Lightroom CC (2015)

Adobe Employee, Apr 25, 2015


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Pro Retina 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
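
To sanity-check those figures, here's a quick back-of-the-envelope sketch of the pixel math (typical panel resolutions; actual displays vary slightly):

```python
# Pixel counts for the display sizes mentioned above. Illustrative math only.
displays = {
    "HD (1920x1080)":                 (1920, 1080),
    "MacBook Pro Retina 15\"":        (2880, 1800),
    "4K UHD (3840x2160)":             (3840, 2160),
    "5K (5120x2880)":                 (5120, 2880),
}

hd = 1920 * 1080
for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} MP ({w * h / hd:.1f}x HD)")
# -> HD 2.1 MP (1.0x), Retina 5.2 MP (2.5x), 4K 8.3 MP (4.0x), 5K 14.7 MP (7.1x)
```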

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
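
In frame-time terms (simple arithmetic, my framing rather than a Lightroom internals figure), the difference is stark:

```python
# Time budget per rendered frame at the two frame rates quoted above.
for fps in (5, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 5 fps leaves 200 ms per slider update; 60 fps demands ~16.7 ms for the whole pipeline.
```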

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
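
To get a feel for the scale of that overhead, here's a rough, hypothetical estimate of just the CPU-to-GPU upload time for one full-resolution image. The resolution, pixel format, and bus bandwidth are my assumptions for illustration, not measured Lightroom numbers:

```python
# Hypothetical upload-cost estimate; all figures below are assumptions.
width, height = 7360, 4912        # ~36 MP image, e.g. a high-res DSLR raw
bytes_per_pixel = 4 * 4           # 4 channels x 32-bit float, a common GPU format
image_bytes = width * height * bytes_per_pixel

pcie_bandwidth = 8e9              # ~8 GB/s effective over PCIe 3.0 x16 (rough)
upload_ms = image_bytes / pcie_bandwidth * 1000
print(f"~{image_bytes / 1e6:.0f} MB to transfer -> ~{upload_ms:.0f} ms "
      "before the GPU can even start rendering")
# -> ~578 MB, ~72 ms: noticeable every time you load or switch images
```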

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
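
A toy sketch of why that is (illustrative only; real raw codecs are more elaborate, but variable-length bitstreams share this property): the start position of each symbol is only known after the previous symbol has been decoded, which leaves little for thousands of GPU threads to do in parallel.

```python
# Toy variable-length decoder: decoding is inherently serial because the
# position of symbol N+1 depends on the decoded length of symbol N.
def decode(bits, table):
    out, pos = [], 0
    while pos < len(bits):
        for code, symbol in table.items():   # prefix-free code table
            if bits.startswith(code, pos):
                out.append(symbol)
                pos += len(code)             # only now do we know where the next symbol starts
                break
        else:
            raise ValueError("invalid bitstream")
    return out

table = {"0": "a", "10": "b", "11": "c"}
print(decode("0101100", table))              # -> ['a', 'b', 'c', 'a', 'a']
```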

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is currently the only module with any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those functions should be the same regardless of whether you have the GPU enabled or disabled in the prefs).

Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

Contributor, Sep 19, 2016


After a lot of time spent testing, I can say:

Problem: the brush and clone tools slow down after some minutes of work.

With the GPU on, this problem gets drastically worse!

But it is not purely a GPU on/off problem; it is a general problem of Camera Raw. With the GPU on, the problem gets drastically worse on my best machine, but not on my weak machine. This is very strange!

I think LR and Camera Raw have problems with certain high-end architectures.

Please see:

Problems with the performance of Lightroom in the Develop module

Explorer, Nov 24, 2016


I have a similar situation with GPU acceleration in Lightroom 2015.7. If I disable the checkbox to use the GPU, the slowdowns disappear and it starts to work much more comfortably. Strange how that turns out! The developers blundered again. Let's wait for the new version; maybe everything will fall into place.

My configuration: Intel Core i5-6600K, MSI Z170A TOMAHAWK, MSI GeForce GTX 1060 GAMING X 6G, Kingston DDR4 16GB 3000MHz HyperX Savage Black, Win7-64

Engaged, Nov 13, 2016


Dear Madman@Adobe:

20+ years ago my MicroVAX had an array coprocessor on the Q-bus for geophysical waveform processing. The software always crashed, because offloading data to the MAP4000 was such an error-prone thing to do. Once DEC came along with the DECstations (Cougar), we gladly junked all the array-processor stuff, as the new stations were twice as fast as the old ones with the AP.

Then came the x86/x87 thing. Up to the Intel 386 you had to use a numerical coprocessor if you needed certain functions for speed; I think you could not compile F77 without it. Anyhow, with the 486 the coprocessor was integrated.

And now we have the GPUs. I know of some science projects being delayed because they wanted to use the GPU but never got it to work. It will go its natural way, like the others. It shows that something new needs to happen on the CPU scene. I hope for Intel's sake that they do not follow DEC in that.

New Here, Apr 26, 2017


Kaby Lake Core i7-7700HQ + HM175

Memory: DDR4 8 GB x 2

Storage: 256 GB NVMe SSD (+ 1 TB SATA 7200 rpm)

GeForce GTX 1060 6GB

Everything is fine UNTIL I use an adjustment brush. After I am done painting, whatever slider I change in the adjustment brush tab takes ages for the first 1-2 changes, then I get an "LR is not responding" message; 10 seconds later it has applied the change, and this recurs for every modification from there on. Even after restarting LR, loading an image that has an adjustment brush modification, or editing it further, is just frustrating.

I have increased the cache, updated all the drivers, deleted the preferences file, restarted/uninstalled/restarted/reinstalled...

Explorer, Dec 11, 2016


MadManChan2000

Has GPU support for local brush adjustments and clone/heal been abandoned? It's been almost 2 years since this post and there haven't been any updates.

Using a local adjustment brush or the cloning/healing brush on my iMac 5K is so painfully slow.

New Here, May 02, 2017


Adobe Photoshop Lightroom is not responding... Lightroom 6.10. With the integrated Intel HD graphics everything works. Can anyone recommend drivers for the AMD RX 580? I have tried several drivers, but no luck. At the moment I am using 17.4.4.

Explorer, May 18, 2017


Thanks Eric,

Just reading your article in May of 2017. You say that writing code that will let Lightroom work with high-res displays takes time... just how long are we talking?

Here in May 2017, Lightroom (version 2015.10) does NOT enhance workflow with 4K screens, multiple GeForce 1070s, 32 GB of RAM, and 30- and 50-MB raw files. Your program has slipped below the waves and is sinking fast. The performance is so bad that I am avoiding using Lightroom for anything I don't have to, and I don't have to use it. I've tried GPU on and off, large and small previews, and nothing seems to make the program usable, say, as it was in the early versions with 1080p. So does it just take a long time, or are you not going to bother making it work with 4K and 5K systems?

Photoshop on this same computer is snappy and handles the larger files on the 4K system just fine, or well enough. Rendering 4K video takes a little bit of time, but editing in PP and AE is doing OK on this machine. So you see, for someone who doesn't invent and produce editing software, I fail to see the reason for the long delay in keeping Lightroom viable.

It might be my old processor; it's an i7 2600K, 4 cores at 3.4 GHz??

So even if you don't have a plan to fix Lightroom to work with modern systems, I have a plan to spend about 3.5K (US) on my machine to make your program viable again... or at least usable. Do you think an i7 6950X (10 cores) and 126 GB of DDR4 3000+ RAM, with the above video system, would do anything with your old code inside Lightroom?

These are strong words, yes, I know, but I am going to spend a lot so that I can use Lightroom to help me process the 1000+ 30- and 50-MB files that I take every month (with new Icat files). Please let me know if my plan will work, or should I wait for your plan? Maybe you are dropping Lightroom for a new product? Something that will work with modern video, as it is not going away.

Community Expert, May 18, 2017


ThaiBear wrote

... Do you think an i7 6950X (10 cores) ...

I hate to tell you that Lr currently also seems to have issues with more than 4 cores for some users.

Also, it won't make use of that vast amount of RAM.

Let's hope that all this is improved with the next LR version.


New Here, May 19, 2017


I have trouble with the GPU now and then: LR works fine for a long time, then doesn't, for some reason. My main finding is conflicts with other programs. This usually happens when Microsoft does a huge update or brings out a new version of Windows.

It has just happened again after updating to the latest Win 10 update, which is absolutely huge. After it, my network adapter driver did not work; I had to uninstall and reinstall that, and then everything was fine. Then I found Office would not work, and I found that ABBYread, another program, was causing conflicts. And now I see GPU acceleration is making the Develop panel not work, but switching off graphics acceleration cures it.

This is a graphics driver conflict, and AMD and Intel have some issues in some of their drivers. I use an AMD card and it's great a lot of the time, but the 17.3 driver does not work well at all; the previous driver, 17.2, was fine until now, so I will try other driver releases until I get it sorted out, which I will. But this is the problem: conflicts. So I say to all here that it's not your machine; just look at the programs installed, as not all of them work well together. They should, but they don't.

New Here, Dec 03, 2017


Using Lr Classic CC - images are only visible in the Develop module. In Library mode, Grid view, the grid is visible, as is the information (time, date, exposure info, stars, colour mark, etc.) but no image. In Develop it is perfect, but in all other views no image is visible, just the background colour of whatever colour I had marked the image. This suddenly happened yesterday and I can't get the images back. I recently upgraded from Lightroom 5.7, and looking back into that, the same occurs.

I have the option to use the GPU checked - Nvidia GeForce 650 Ti. If I uncheck it, the image is no longer visible in the Develop module. If I check it again, the image reappears in the Develop module but not in the others.

Any ideas?

Advocate, Dec 03, 2017


Have you updated your graphics driver?

Bob Frost

New Here, Dec 03, 2017


Yes – one of the first things I checked. It says I have the most up-to-date one.

Community Expert, Dec 03, 2017


Check your screen calibration. This often happens with a bad monitor profile.

New Here, Dec 03, 2017


Correct. Calibrating the monitor did the job. Thank you so much.

Participant, Dec 03, 2017


Try rebuilding previews, as the Library module displays embedded previews unless standard previews are available. In the Library module, go to Library > Previews > Build Standard Size Previews and see if this changes anything.

New Here, Dec 03, 2017


I didn't need to try that.  Thanks anyway.

Community Beginner, Sep 21, 2018


What is the state of Lightroom Classic 7.5 and GPU acceleration now? I'm considering buying a new iMac and trying to maximize Lightroom's performance. I intend to get it with an SSD and 24 GB of RAM, but there are different levels of iMac with increasing CPU speed and GPU performance. I don't know if it's worth getting the better iMac for slightly better performance. I think the best investment is in getting the largest SSD and the slowest CPU/GPU model. Any comment or opinion?

Engaged, Sep 21, 2018


I currently have a 6-core 5 GHz 8700K CPU with 64 GB of 3 GHz DDR4 and a 1070 Ti, along with a 4K 27" IPS display. I also have a WD Black NVMe 1 TB stick, which is insanely fast... like 3000 MB/s fast.

There's a workflow that makes it all smooth, kind of regardless of your setup. First, you pre-render all photos 1:1 before you sort. When you have filtered out the best, make them into Smart Previews and make sure you check "use Smart Previews for development" or whatever it's called. In doing so, you are telling LR to load up small DNG "proxy" versions for you to edit. The CPU really only matters now on export. The 1070 Ti does speed up some Develop stuff like sliders, gradients, etc. But it also causes weird issues, like Smart Preview DNGs mismatched to the thumbnails (still an issue). I also get blackouts and whiteouts where the photo should be, and I have to step away from the photo to the next and back to actually see the render. If I disable the GPU, none of these issues happen. RAM is really only needed for large images such as panos.

My advice: 1) go for the largest screen and largest resolution; 2) store your photos on a large spinning external drive and keep your internal drive just for the OS, apps, and catalog; 3) back up that external to a second external as well as offsite; 4) if you process a ton of shots, a faster CPU might save you a second per shot, which scaled out could be hours per year saved; 5) the GPU doesn't really seem to be a huge factor. I had an old 280X GPU before, and this 1070 Ti doesn't seem much faster. Others have reported the same. Until they code the GPU better, save your money for that new lens.


Community Beginner, Nov 13, 2018


Hi everyone,
I have a completely new PC for Lightroom now, with an i7-9700K. I want to ask - do I really need a graphics card, or can LR use the GPU integrated in the processor? I would like to have the best performance, but I am not sure which graphics card I should choose. I do not want to spend much money on a GPU. Can you recommend the best budget one, please? Thank you!

Community Expert, Nov 13, 2018


Depending on the monitor resolution you use, there is not much gain with GPU acceleration turned on, especially not with a beefy CPU like yours.

Can you share some more details on your monitor and the files you are working with?


Community Beginner, Nov 13, 2018


I work with RAW files from a D810, so usually 37 MP; sometimes I make panoramas, and then the size is much bigger, like 200-400 MP. My monitor resolution is 2560x1440 (Eizo CS270). I read that LR still does not support 10-bit monitors, so I probably do not need an Nvidia Quadro, is that right?

Community Expert, Nov 13, 2018


IMHO, you will only see a small - if any - performance increase with your resolution and CPU/GPU. The integrated Intel GPU should be supported, if you would like to test whether its limited power brings any gain. If I were you, I'd save the money and not add a dedicated GPU.

Here's some more information: Adobe Lightroom GPU Troubleshooting and FAQ


Community Beginner, Nov 13, 2018


OK, thank you for your answer. Actually, I have my old GPU, a Gigabyte Radeon HD 6850, but I still have problems with newer versions of LR in the Develop module, so I am using version 6.14 and it works, let's say, acceptably. In newer versions it works with the first photo; then I usually press the Next or Prev arrow, and instead of the next photo I see only a black window. Is this problem solved for newer GPUs?

Community Expert, Nov 13, 2018


Does this also happen if you deactivate the GPU support?


Community Beginner, Nov 13, 2018


No, it doesn't; it only happens with the GPU activated. But in LR 6.14 it works fine with GPU support. If GPU support is deactivated, then any work in the Develop module is extremely slow (on my old PC).
