
GPU notes for Lightroom CC (2015)

Adobe Employee, Apr 25, 2015

Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Pro Retina 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
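
For a rough sense of where those megapixel numbers come from, here is a small Python sketch that simply multiplies out some nominal panel resolutions (the exact resolutions are illustrative assumptions, not a statement about any particular product):

    # Approximate pixel counts for the display classes mentioned above.
    # Resolutions are nominal examples chosen for illustration.
    displays = {
        'Full HD (1920 x 1080)': (1920, 1080),
        'MacBook Pro Retina 15" (2880 x 1800)': (2880, 1800),
        '4K UHD (3840 x 2160)': (3840, 2160),
        '5K (5120 x 2880)': (5120, 2880),
    }

    for name, (w, h) in displays.items():
        print(f"{name}: {w * h / 1e6:.1f} MP")

    # Full HD (1920 x 1080): 2.1 MP
    # MacBook Pro Retina 15" (2880 x 1800): 5.2 MP
    # 4K UHD (3840 x 2160): 8.3 MP
    # 5K (5120 x 2880): 14.7 MP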

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
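
To get a feel for why those transfers are not free, here is a back-of-envelope Python sketch. The bytes-per-pixel figure and the bus bandwidth are illustrative assumptions, not Lightroom internals:

    def upload_time_ms(megapixels, bytes_per_pixel=8.0, bus_gb_per_s=6.0):
        # Time to copy one full-resolution buffer from main memory to the GPU
        # over a PCIe-class link (both parameters are hypothetical round numbers).
        total_bytes = megapixels * 1e6 * bytes_per_pixel
        return total_bytes / (bus_gb_per_s * 1e9) * 1000.0

    for mp in (24, 36, 50):
        print(f"{mp} MP image: roughly {upload_time_ms(mp):.0f} ms per full-res upload")

    # 24 MP image: roughly 32 ms per full-res upload
    # 36 MP image: roughly 48 ms per full-res upload
    # 50 MP image: roughly 67 ms per full-res upload

Tens of milliseconds per upload is negligible as a one-time cost, but it is very noticeable when it happens while loading a full-resolution image or switching from one image to another, which is why those operations can feel slower with the GPU enabled.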

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is currently the only module with any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (their performance should be the same regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, gradients, and the radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely see little benefit. Aim for at least 1 GB of GPU memory; 2 GB is better (see the rough sizing sketch after this list).

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
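
As a rough sizing sketch for point 5, the back-of-envelope Python estimate below shows how quickly full-resolution working buffers add up in GPU memory. The buffer format and buffer count are assumptions chosen purely for illustration; they are not Lightroom's actual internals:

    def working_set_mb(megapixels, bytes_per_pixel=8, buffers=4):
        # Hypothetical working set: a few full-resolution buffers (e.g. RGBA at
        # half-float precision) held on the GPU while sliders are being dragged.
        return megapixels * 1e6 * bytes_per_pixel * buffers / 1e6

    for mp in (24, 36, 50):
        print(f"{mp} MP raw: about {working_set_mb(mp):.0f} MB of GPU memory")

    # 24 MP raw: about 768 MB of GPU memory
    # 36 MP raw: about 1152 MB of GPU memory
    # 50 MP raw: about 1600 MB of GPU memory

Under those assumptions, a 24 MP camera already pushes close to 1 GB, which is why 2 GB of GPU memory gives more comfortable headroom.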

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

Guest, May 08, 2015

Still getting the spinning beach ball and slowness, even with the GPU feature turned off.

Community Beginner, May 09, 2015

I've just tested LR6 with the new HDR and Panorama processes on my iMac (details below)

8-shot panorama merge, start to finish: 1 min 5 secs with the Graphics Processor turned on, 55 secs with it turned off - so faster without the GPU

5-shot HDR merge: 1:02 with the GPU turned on or off - the GPU makes no difference

Model Name: iMac

  Processor Name: Intel Core i7

  Processor Speed: 3.4 GHz

  Memory: 16 GB

Chipset Model: AMD Radeon HD 6970M

  VRAM (Total): 2048 MB

Engaged, May 09, 2015

Read the OP. GPU only affects the basic sliders.

Engaged, May 09, 2015

As stated above, the GPU is not enabled for the Photomerge features.

Regarding the time difference: if you do several merges of the same images you will notice the time will vary substantially. This is down to how LR merges the images. Each time it will analyse the photos slightly differently and merge them slightly differently. Sometimes the path it takes will be more efficient, sometimes less. I've had differences in excess of 40s.

New Here, May 09, 2015

I have encountered all of the same problems with speed in Develop, especially with spot/heal and radial -- in addition, when I try to merge to panorama I get a message: "unknown errors occurred" -- very helpful!!

Lightroom version:  CC 2015 [1014445]

License: Creative Cloud

Operating system: Mac OS 10

Version: 10.10 [3]

Application architecture: x64

Logical processor count: 8

Processor speed: 2.3 GHz

Built-in memory: 16,384.0 MB

Real memory available to Lightroom: 16,384.0 MB

Real memory used by Lightroom: 473.3 MB (2.8%)

Virtual memory used by Lightroom: 1,212.6 MB

Memory cache size: 258.7 MB

Maximum thread count used by Camera Raw: 4

Camera Raw SIMD optimization: SSE2,AVX

Displays: 1) 2560x1440

Graphics Processor Info:

Intel HD Graphics 4000 OpenGL Engine

Check OpenGL support: Passed

Vendor: Intel Inc.

Version: 4.1 INTEL-10.6.20

Renderer: Intel HD Graphics 4000 OpenGL Engine

LanguageVersion: 4.10

Explorer, May 09, 2015

It would be much faster if LR would pre-render the next few images BEFORE opening them, or BEFORE zooming in!

Community Beginner, May 09, 2015

I have a six-month-old Mac Pro with an AMD FirePro D500 3072 MB GPU, and almost all of the Develop functions are significantly slower when I enable GPU acceleration. HUGE disappointment with Lightroom CC!

Engaged, May 09, 2015

A question to the people that have speed problems with GPU enabled: Did you try GPU acceleration also with ACR9? What about speed there?

Community Beginner, May 09, 2015

Since I'm using Lightroom as my main processing platform, I don't use ACR natively. Besides, my understanding is that the implementation in Lightroom is ACR integrated into Lightroom.

Enthusiast, May 09, 2015

> A question to the people that have speed problems with GPU enabled: Did you try GPU acceleration also with ACR9? What about speed there?

The same miserable situation - some operations work faster (for example, simple panning), some work slower (for example, zooming), and switching the GPU off makes it slower than the previous ACR 8 (a simple example: with the WB tool I can draw multiple rectangles selecting areas for WB before the adjustment actually happens - the lag between the action and its execution is huge)... the product was not really ready for release in terms of code... I am on a Clevo notebook = PC/Win 8.1 x64 running an i7-4810MQ, 32 GB RAM, Nvidia GTX 870M - GPU-wise that is a far faster machine than the best MacBook Pro out there (which has a GT 750M), just for comparison purposes...

Enthusiast, May 09, 2015

Wonder what Adobe used to test this? The silence from them on this matter is amazing. Even though this thread was initiated by an Adobe employee, all the negative feedback should prompt some action, but not a word.

Sent from my iPad

Participant, May 09, 2015

I think these issues have arisen because of a number of factors. Adobe have tried to solve a big problem for some users, basically high-end Macs with 4K/5K displays. Unfortunately these are high-profile users, including Adobe staff and product evangelists, which would drive the need to "fix" the movement of huge numbers of pixels on the screen. I say unfortunately because these 4K/5K users form a tiny percentage of the installed user base. I do think it was a wrong project management decision to prioritise a tiny fraction of 1% of the user base whilst annoying the majority.

We all know that LRCC was delayed and there have been comments about "a troubled birth". This is probably why GPU acceleration is only partly implemented. Marketing time pressures probably meant that LRCC was released before the poor engineers (who take the flak) wanted it to go out.

These issues were then probably compounded by the fact that no public beta was conducted. I think Adobe have given up on public betas because, with CC, all software is in constant development and being beta tested by users. This is highly beneficial to Adobe and is an advantage of the CC model. Once LRCC is over these initial problems, these kinds of issues should not happen again, as any update to CC can be rolled back if significant issues like these were found.

I think we just have to "suck it up" until the LR engineers can sort out the problems. I believe that they are a very dedicated, hard-working team and will deliver as soon as it is humanly possible. I am waiting for the product to be fixed before upgrading; LR5 works well, and the new features aren't actually compelling enough to bother with all of the grief.

Enthusiast, May 09, 2015

You are absolutely right, but you forget that people on the CC model pay monthly to use the software and are getting software that is mostly unusable. I'm not getting any discount for not being able to upgrade, and I'm also running, or trying to run, the software on machines with very high-resolution monitors and multi-$K GPUs, and it does not deliver.

Sure, we should be patient. But this is a story that repeats over and over with each new major release of Lightroom, and it does not happen with Photoshop or After Effects; it seems that Lightroom probably has the lowest priority.

Anyway, I'm not trying to bash anyone. I know how it works: engineers do the best they can with the time and budget they have, and it is usually the marketing department that decides when things go out. That deadline cannot be missed. So I'm not blaming the development team.

Sent from the Moon

Enthusiast, May 09, 2015

> advantage of the CC model

It has nothing to do with the subscription model - the company could just as well have such constant development and alpha/beta testing with any other model... but for marketing purposes (and to further benefit cash flow by pushing the subscription model) it is beneficial to plant in people's minds the false idea that it is somehow only possible with the so-called "CC" model.

New Here, May 09, 2015

Slowly but surely I am running out of arguments for why my digital operator should use Lightroom instead of the direct professional competitor. They at least show what GPU acceleration can actually do. This is possibly one of the most disappointing major releases of this software in a while. If I had paid for a full upgrade, I'd ask for my money back at this point.

Guest, May 09, 2015

On my 2015 MacBook Pro Retina (2.7 GHz, 8 GB RAM, 256 GB SSD), the Develop module is very slow and lags after photos have been imported. I've closed and quit all other apps, opened and closed Lightroom, and turned off sync with LR mobile, face detection, and address lookup, thinking this would help. No change. Just flipping through photos in the Develop module to crop takes seconds for each one to load.

This is interesting:

Says real memory available to Lightroom: 8,192 MB

Real memory used by LR: 1,919 MB, now up to 2.6 GB

Virtual memory used by LR: 11,148 MB or 11.93 GB and growing, now up to 13.33 GB

% CPU: 252.80%

Doesn't seem correct.

New Here, May 09, 2015

For all of you having speed issues, I'll suggest again that, for some reason, going to Catalog Settings and changing the preview size to the smallest (1024 px, and I selected medium) dramatically improves overall image switching, Develop work, zooming in, 1:1 previews, etc. Maybe it's a caching issue, or the preview setting moves more data from the CPU to the GPU??? IDK, but LR is usable for me again. It's still slower, just not stuck in glue. I should add that Develop sliders are indeed real time, and brush strokes are slower than in LR 5.7.1.

Mid-2012 MBP, 2.7 GHz i7, 16 GB RAM, 1 TB SSD (catalog and previews), 1 TB internal HDD for images, Nvidia 650M 1024 MB, running Yosemite.

The catalog is about 9 GB, containing about 600K images. (It used to be 12 GB, but I deleted all of the history for every photo, which saved quite a bit of space.)

Enthusiast, May 09, 2015

> For all of you having speed issues, i'll suggest again that for some reason, going to catalog settings and changing the preview size to the smallest (1024px, and i selected medium) has a dramatic improvement in overall image switching, develop stuff, zooming in, 1to1 preview, etc.

What? Dude - I am using ACR... same code, no stupid catalogs.

Community Beginner, May 09, 2015

Hi!
I understand that this issue is affecting different users in different ways. I have the most recent MacBook Pro, with 16GB of memory. Importing with or without the "acceleration" was impossible. So slow that it was ridiculous. I have reinstalled Lightroom 5 and I can say that going back to it was fantastic. I love the speed!  Your Lightroom CC should not even be a release candidate as it is now.

LEGEND, May 09, 2015

slubezky wrote:

I have the most recent MacBook Pro, with 16GB of memory. Importing with or without the "acceleration" was impossible. So slow that it was ridiculous. I have reinstalled Lightroom 5 and I can say that going back to it was fantastic. I love the speed! 

Check your Catalog Preferences. LR 6/CC now uses an 'Auto' Standard Preview Size setting by default, which may be different than the setting you are using in LR5. With my 2560 x 1440 monitor there was an improvement since LR 5 only offers settings of 2,880 or 2,048. My Standard preview size went from 2,880 to the monitor's native 2,560.

New Here, May 10, 2015

It seems to me that Adobe chose to cheat in order to add better performance to Lightroom. Instead of writing the performance enhancements into the program, they decided: hey, you know, there is this GPU thing, why don't we just utilize it, so we don't have to do any real work rewriting Lightroom with performance enhancements.

Here is a no-brainer for some of you: not all photographers have extra cash just lying around to throw at a new computer every time a software manufacturer such as Adobe wants to make their software only for top-end computers. GPU, really? Why? Lightroom 5 was pretty fast without the GPU. What was the thinking? Apparently there wasn't any. So now, for the 1% that have top-of-the-line computers that can speed along with the GPU, you alienate the other 99%.....awesome job. Who was the genius who thought this was a good idea? Whoever he is, he should probably be fired and never allowed to work in the industry again. Just saying.....

Oh and I am not really that mad.......(yes that was sarcasm)

Engaged, May 11, 2015

Do you really think Lightroom, Capture One and others go the "GPU route" because they just want to avoid "any real work"? I guess they would be glad if they could stick to the CPU route. The GPU just introduces more possibilities for bugs, which are GPU-driver related, etc. They went the GPU route because it seems to be the only way to get a fast on-screen reaction, especially on high-res displays, which will be more common in the future anyway. I have a pretty old PC (Intel Q6600 and GTX 750 Ti) and a very new one (6-core Haswell-E and GTX 970). On both machines, the GPU makes the Develop module react much faster, actually in real time, even if I spread Lightroom across my 3 monitors (2x 19" and a 1920x1200) - much faster than without the GPU and much faster than LR 5.x. I even tried large HDR panos and raw files from a 5DS.

I admit that under certain circumstances the brush may be slow. I admit that after zooming it may take a second to get the picture sharp. But I'm pretty sure this will get better in the future, because, as Eric said, this is only the beginning. And even at this stage, I much prefer the GPU-enabled LR6 to LR5 in terms of speed. And my old PC is for sure not one of the "1% that have top of the line computers".

It would also be helpful, and I do not mean any person in particular here, to be as precise as possible in describing what actually is slow. If someone writes "Zooming is slow", what does it mean? Is it stuttering with slow frame rates, or is it taking time to become sharp? If going from one image to another is slow, what previews do you have, where are they, etc.? What GPU/driver do you have? Have you tried other drivers, especially in the case of an AMD card? If brushing is slow, what are your brush settings? Do you brush "exposure", "sharpness" or whatever? Being as precise as possible will help Adobe find out what can be improved.

New Here, May 11, 2015

Yes, I really do think that Adobe took the easy way out. Here are the facts of the case. On my machine, a MacBook Pro (Late 2011), Lightroom CC is slower than Lightroom 5 with the GPU enabled. This is starting to be recognized elsewhere, as more and more people realize they are seeing no real improvement in performance with the GPU enabled, and more often than not less performance than Lightroom 5.

So those people who were complaining about slow performance with previous Lightroom versions more than likely did not have a fast enough machine in the first place, so the GPU wouldn't help them, since, logically speaking, they more than likely don't have a GPU-capable computer, or the GPU is too old to be effective.

And just because other companies went the GPU route doesn't really mean anything. All it means is that when writing software to make a product faster gets complicated, the easiest way to go is to have said software utilize top-end hardware such as the GPU. The only problem is that when you go that route, always producing software that targets the latest technology or top-end hardware, the consumer is left either not upgrading the software to the best version of itself or upgrading their computer every 6 months or so, since most computers bought off the shelf are already obsolete.

But with all that said, Lightroom CC has plenty of other bugs and oversights beyond just the GPU issue.....

Enthusiast, May 11, 2015

I very much doubt that the GPU is the easy route; it may be the most complex one, but if done correctly, it can be awesome. Every day at my work I use Autodesk Flame and Autodesk Lustre, compositing and color correction software based 100% on GPU processing. And the thing is amazingly fast, with no silly issues about the time it takes to load images into the GPU. They load 30 5K frames per second onto the GPU, no problem, but for certain operations they use the CPU too; for example, to decode RAW material from cameras like RED they use all the cores on the system to the max. It's just about knowing where you can get the best out of each.

Sent from the Moon

Enthusiast, May 11, 2015

> If someone writes "Zooming is slow", what does it mean

Do an exercise... download FastRawViewer ( About | FastRawViewer ), open a raw file, zoom in / zoom out / zoom in / zoom out / etc... then open the same raw in LR6/ACR9 and do the same... feel the difference.
