
GPU notes for Lightroom CC (2015)

Adobe Employee, Apr 25, 2015


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
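If you want to check the arithmetic behind those figures, here is a minimal Python sketch; the exact panel resolutions used for the Retina and 5K examples are assumptions on my part, not anything taken from Lightroom itself:

# Rough pixel counts for the display classes mentioned above.
# Resolutions are typical for each class and are assumptions for illustration.
displays = {
    "Standard HD": (1920, 1080),
    "MacBook Pro Retina 15\"": (2880, 1800),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")

# A 4K display has roughly 4x the pixels of a standard HD display:
print(f"4K vs HD pixel ratio: {(3840 * 2160) / (1920 * 1080):.1f}x")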

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
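To get a feel for that transfer overhead, here is a back-of-the-envelope sketch; the image size, bytes per pixel, and effective PCIe bandwidth below are illustrative assumptions, not measurements from Lightroom:

# Back-of-the-envelope CPU-to-GPU upload time for one full-resolution image.
# Every figure below is an assumption for illustration, not a Lightroom measurement.
width, height = 7360, 4912        # e.g. a 36 MP raw after demosaic (assumed)
bytes_per_pixel = 8               # RGBA at 16 bits per channel (assumed)
effective_bandwidth = 6e9         # ~6 GB/s effective over PCIe 3.0 x16 (assumed)

image_bytes = width * height * bytes_per_pixel
upload_ms = image_bytes / effective_bandwidth * 1000

print(f"Image size:  {image_bytes / 1e6:.0f} MB")
print(f"Upload time: {upload_ms:.0f} ms per image")
# Tens of milliseconds per upload, before any processing happens, is part of why
# loading a full-res image or switching images can feel slower on the GPU path.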

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful card. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and the Radial Filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely see little benefit. Aim for at least 1 GB of GPU memory; 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

353 Replies
Enthusiast, May 18, 2015


> What was that you were saying?


He was probably alleging that GeForce gaming cards are not as good at FP64 math on chip as they are at FP32 and other operations, and hence the issues... but how does he know what ACR/LR does in its code? Does it use FP64 to, for example, zoom in and out? I doubt it... As usual, I suggest he run FastRawViewer (yes, it does raw conversion too) and see how properly written code performs.

Enthusiast, May 19, 2015


It is a fact that some gaming cards have a lot more muscle than some of the Quadro cards. Gaming cards usually come out first because they are the ones that make most of Nvidia's revenue, so it is very common to see gaming cards with far more CUDA cores than the latest Quadro card. For one of the applications I use in my day job (color correction, compositing, visual effects...), one of the beta testers always tries out the latest gaming cards just for the pleasure of testing, and ALWAYS, without exception, gets better performance than with the latest Quadro card. If you code the software correctly, you can do amazing things with a gaming GPU, and let's face it, that is what probably 95% of LR users have. So who should they write the software for? The 5% that have Quadro cards, or the remaining 95% that have gaming cards?

Enthusiast, May 19, 2015


The fact is very simple: until ~today, the best Mac notebook, the MacBook Pro, shipped with JUST an nVidia 750M 🙂. Apple released a new one today (none are actually in end users' hands yet, if I am not mistaken) with an AMD Radeon R9 M370X, which is no more powerful than a gaming GTX 970M or 980M... so ACR/LR code must work well on the nVidia 750M for all operations, because it is logical to assume those notebooks are the best machines photogs will be using, and it does not.

Community Beginner, May 19, 2015


Be careful about assuming... most photogs I know do not use MacBook Pros for their main editing rig.

I do know quite a few who have them as an on-location or on-the-go editor. They mainly use them to cull a wedding.

Also, we are talking about one graphics card and what... a 15" monitor/screen.

Enthusiast, May 19, 2015


> most photogs I know do not use MacBook Pros for their main editing rig.

You are missing the point here, dear... who says photogs? Think ACR/LR users, and feel the difference between the two things... The point is that LR/ACR are not marketed as desktop-only (vs. notebook) apps, hence they have to run at least on the best notebooks, and for Mac (you are not disputing that Macs are used by a lot of ACR/LR users) the MacBook Pro is the most powerful notebook (yes, you can do a hackintosh, but that's 0.01% of the populace)... and if your app can't perform on the top MacBook Pro, you are not doing your job as a product manager, whoever that dude is at Adobe.

Community Beginner, May 19, 2015


deejjjaaaa,

Cool .... I am a dear.... Nice.  That's good internet etiquette.

I know LR has turned into the be-all, any-device editor: phones, phablets, tablets, laptops, and workstations.

Completely agree that LR should work on laptops.

Now that we both agree, what is your point.....

Catch you later.  I need to go shoot a bridal session and edit 2 weddings.

-Mark

Engaged, May 19, 2015


IMO, you don't need a powerful GPU at all for LR, if it works correctly. My GTX 970, for example, almost never goes above 10-15% usage. This may change, of course, once the GPU is used for export as well.

LEGEND, May 19, 2015


klsteven wrote:

IMO, you don't need a powerful GPU at all for LR, if it works correctly. My GTX 970, for example, almost never goes above 10-15% usage. This may change, of course, once the GPU is used for export as well.

Your GTX 970 is a very high-performance GPU, as indicated on the Passmark benchmark site. Its 8,649 Passmark score is only about 20% below the highest GPU benchmark score (11,206). My Nvidia Quadro 600, with a 681 Passmark score, has roughly 1/10th the performance of your GTX 970 and is working properly with LR CC, so it does meet your criterion that the GPU "works correctly."

It does slightly speed up slider response in the Develop module, but that wasn't slow in LR 5.7, so it is of little benefit. Switching image files in the Develop module and many other operations are much slower with the GPU enabled. Overall performance is noticeably better with the GPU disabled in LR CC.

Community Beginner, May 19, 2015


trshaner,

The G3D tests are cool, but last time I checked this is a test for DirectX.

LR6 is OpenGL.  Just saying.

Another thought... I am not laying down the absolute word on graphics card requirements. Nope... no way.

Just trying to bring some perspective.

One video card and one monitor is one thing, but the same card driving a multi-monitor setup where some of those monitors are UHD is different.

It will be interesting to see what the minimum requirements are six months to a year from now.

Just my 2 cents.

Totally agree that there are issues that Adobe needs to straighten out.

-Mark

LEGEND, May 19, 2015


mark-FOTO wrote:

trshaner,

The G3D tests are cool, but last time I checked this is a test for DirectX.

LR6 is OpenGL.  Just saying.

Agreed. The Passmark benchmark score is still a good indicator of a GPU's "relative" processing speed. If one GPU has a Passmark score 10x that of another, the speed difference should be reasonably close to 10x for both DirectX and OpenGL apps. I'm sure there are exceptions... and LR 6/CC is one of those exceptions.

Explorer, May 23, 2015


EVGA GTX 970 SC - no difference with GPU checked

I have a high-performance EVGA GTX 970 SC and see no performance change with or without the GPU checked. I have a pretty high-performance machine and two monitors, one of which is high-res. I ran some loose benchmarks and just don't see a difference. My Develop sliders have a "two-stage" render if I move them quickly, and moving from pic to pic in the bar below results in about a 3-second lag to render... all regardless of whether the GPU is checked. Odd; I thought this GPU would help, given that it was designed for these higher-res monitors.

My conclusion is that if you have money to spend and you already have a relatively new graphics card installed, you may be better served by upgrading something else in your PC, like an SSD or a new CPU. I wish LR would use some of that idle, expensive memory I have sitting in my machine to pre-fetch and cache the pics in proximity (in the pic row) to speed up moving from pic to pic in Develop.

Peace

Bruce in Philly

Graphics card: EVGA GTX 970 SC

Asus Sabertooth X79 motherboard

32GB RAM

Windows 7 64

Intel i7-3930K, 3.2 GHz

Monitor 1:NEC PA301W 30" 2560x1600 (photo editing)

Monitor 2 : HP 27 Xi 1080x1920 (aux screen for email and music player etc)

SSD: Sandisk Extreme for OS, LR, LR cache, LR cat

2 7200 HDs in RAID 0 for RAWS

From LR:

Lightroom version: 6.0.1 [ 1018573 ]

License: Perpetual

Operating system: Windows 7 Business Edition

Version: 6.1 [7601]

Application architecture: x64

System architecture: x64

Logical processor count: 12

Processor speed: 3.2 GHz

Built-in memory: 32707.2 MB

Real memory available to Lightroom: 32707.2 MB

Real memory used by Lightroom: 2503.2 MB (7.6%)

Virtual memory used by Lightroom: 2525.3 MB

Memory cache size: 4599.6 MB

Maximum thread count used by Camera Raw: 6

Camera Raw SIMD optimization: SSE2,AVX

System DPI setting: 120 DPI

Desktop composition enabled: Yes

Displays: 1) 2560x1600, 2) 1920x1080

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No

Graphics Processor Info:

GeForce GTX 970/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 352.86

Renderer: GeForce GTX 970/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

......

Direct2DEnabled: false

GPUDevice: not available

OGLEnabled: true

Participant, Jun 09, 2015


Direct2DEnabled: false

GPUDevice: not available

OGLEnabled: true

Your log seems to indicate that the GPU Device is not available??

Ian

Community Beginner, Jun 10, 2015


Actually, I think a separate GPU device, like an Nvidia Tesla, is what's meant, not the graphics card.

Enthusiast, May 19, 2015


> IMO, you don't need a powerful GPU at all for LR, if it works correctly.

There is a difference between working correctly (as in not crashing) and working properly fast...

New Here, May 21, 2015


Unhappy with the brushes too, on my late 2013 Mac Pro with FirePro D700... I came across another severe issue: the highlights clipping overlay changes depending on whether the GPU is enabled:

[Attached screenshots: 1.png, 2.png]

Enthusiast, May 18, 2015


> it's a driver for The Witcher 3 which I plan to play starting tomorrow.

Lucky you, having time to play; but with all the troubles Adobe is having, I'd refrain from upgrading anything.

New Here, May 18, 2015


>not sure why you install "game ready" drivers? to game? non-gamers need to use production versions unless directed by NVidia to use a specific later version... for example, for my config the most recent non-"game ready" (read: quick fix catering to gamers) driver is<

Because Adobe tells us to install the latest drivers:

Q: What are the minimum requirements for GPU support?

Minimum requirements are a graphics card that runs on OpenGL 3.3 and later. Please also ensure that your card is running the latest drivers. On Mac, you can do this by updating to the latest operating system updates. On Windows, please update by downloading and installing the latest drivers from your manufacturer’s website:
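For what it's worth, one quick way to confirm you clear that OpenGL 3.3 floor is to check the "Version:" line under Graphics Processor Info in Lightroom's Help > System Info (as in the dumps posted later in this thread). A minimal Python sketch, assuming that line format:

import re

# Paste the "Version:" line from Lightroom's Help > System Info
# (Graphics Processor Info section). This example comes from a dump in this thread.
version_line = "Version: 3.3.0 NVIDIA 352.86"

match = re.search(r"Version:\s*(\d+)\.(\d+)", version_line)
if match:
    major, minor = int(match.group(1)), int(match.group(2))
    print(f"OpenGL {major}.{minor} reported; meets the 3.3 minimum: {(major, minor) >= (3, 3)}")
else:
    print("No OpenGL version found in that line.")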

Enthusiast, May 18, 2015


> On Windows, please update by downloading and installing the latest drivers from your manufacturer’s website:

Then it is the wrong wording, for sure.

Participant, May 19, 2015


I just wish the Adobe engineers would update us on where this is heading. By now they must have a handle on the problem, and it would be nice if they would say something. Even "this is a bigger problem than we first thought and there will be no easy fix" or "AMD cards are a big problem; we recommend Nvidia".

Anything would help 😞

Explorer, May 19, 2015


The new LR CC and PS CC updates have really slowed things down. I have a fast system (system info below) and it runs much slower with the graphics card (GTX 650) enabled; I think it is also a little slower even with it disabled. Photoshop, however, is severely crippled: certain features like the healing brush are unusable with OR without the graphics card enabled. How would I roll PS back to before the update that supposedly sped things up? I can live with LR CC with the graphics processor disabled, but not PS. BTW, I also have a Wacom tablet, but unplugging it does not seem to have any impact.

Lightroom version: CC 2015.0.1 [ 1018573 ]

License: Creative Cloud

Operating system: Windows 8.1 Business Edition

Version: 6.3 [9600]

Application architecture: x64

System architecture: x64

Logical processor count: 8

Processor speed: 4.0 GHz

Built-in memory: 16330.4 MB

Real memory available to Lightroom: 16330.4 MB

Real memory used by Lightroom: 2178.0 MB (13.3%)

Virtual memory used by Lightroom: 2174.3 MB

Memory cache size: 1876.3 MB

Maximum thread count used by Camera Raw: 8

Camera Raw SIMD optimization: SSE2,AVX

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Displays: 1) 1920x1200, 2) 1920x1200

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: Yes, Keyboard: No

Graphics Processor Info:

GeForce GTX 650/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 352.86

Renderer: GeForce GTX 650/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler

Application folder: C:\Program Files\Adobe\Adobe Lightroom

Library Path: C:\Lightroom Main Catalog\Lightroom Catalog.lrcat

Settings Folder: C:\Users\Patrick\AppData\Roaming\Adobe\Lightroom

Installed Plugins:

1) Behance

2) Canon Tether Plugin

3) Facebook

4) Flickr

5) HDR Efex Pro 2

6) Leica Tether Plugin

7) Nikon Tether Plugin

8) Perfect B&W 8

9) Perfect Effects 8

10) Perfect Enhance 8

11) Perfect Photo Suite 8

12) Perfect Portrait 8

13) Perfect Resize 8

14) rc Publish Service Assistant

15) SmugMug

Config.lua flags: None

Updated Toolkit: Adobe Camera Raw 9.0 for Lightroom 6.0 (build 1014445)

Updated Toolkit: Book Module 6.0 (build 1014445)

Updated Toolkit: Develop Module 6.0 (build 1014445)

Updated Toolkit: Import Module 6.0 (build 1014445)

Updated Toolkit: Library Module 6.0 (build 1014445)

Updated Toolkit: Map Module 6.0 (build 1014445)

Updated Toolkit: Monitor Module 6.0 (build 1014445)

Updated Toolkit: Print Module 6.0 (build 1014445)

Updated Toolkit: Slideshow Module 6.0 (build 1014445)

Updated Toolkit: Web Module 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.AgNetClient 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.AgWFBridge 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.Headlights 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.LibraryToolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.MultiMonitorToolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.archiving_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.bridgetalk 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.catalogconverters 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.cef_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.coretech_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.curculio 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.discburning 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.email 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.export 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.ftpclient 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.help 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.iac 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.imageanalysis 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.layout_module_shared 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.pdf_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.sdk 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.sec 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.socket 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.store_provider 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.substrate 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.ui 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.video_toolkit 6.0 (build 1014445)

Updated Toolkit: com.adobe.ag.xml 6.0 (build 1014445)

Updated Toolkit: com.adobe.wichitafoundation 6.0 (build 1014445)

Adapter #1: Vendor : 10de

  Device : fc6

  Subsystem : 84281043

  Revision : a1

  Video Memory : 966

Adapter #2: Vendor : 1414

  Device : 8c

  Subsystem : 0

  Revision : 0

  Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: Speakers (High Definition Audio Device)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 44100

Build: LR5x102

Direct2DEnabled: false

GPUDevice: not available

OGLEnabled: true

Explorer, May 19, 2015


There is an old term for this... FUBAR.

Participant, May 19, 2015


Okay, here is some other information that might be helpful.

I have an AMD Radeon 5500 graphics card (1 GB) in an HP workstation with a 3.2 GHz processor and 16 GB of RAM.

I have been getting "Graphics Processor disabled due to errors," and Develop module performance was slightly slower than in LR 5.7.

I came across this link https://helpx.adobe.com/lightroom/kb/lightroom-amd-graphics-cards.html

After reading this page, I uninstalled the new drivers and reinstalled the prior 14.4 video driver from AMD. I restarted LR CC and got no more graphics card errors. I tried a few things in the Develop module and some really are quite a bit faster, but not everything: brush rendering is fast, while the Crop tool is slower and image-to-image movement in the filmstrip is slower. Changes made in the Basic panel and the Detail panel are fast. Overall, I think most things are a little slower than LR 5.7 except for those things I mentioned above.

Another curious thing is why Adobe removed the ability to apply Develop presets when you select a group of images, right-click, and choose "Develop Settings." You can still paste from the previous image, but I can no longer apply one of my user-created presets this way. Why remove this?

New Here, Jun 01, 2015


Lightroom CC 2015 is a disaster.

Nothing is worse than poor performance. How do I go back to Lightroom 5? Otherwise I might have to switch altogether.

Participant, Jun 02, 2015


I think the practical approach is to reinstall LR5.7 and wait for the LR6.1 update that will hopefully fix these issues.  I suspect that we will have to wait until the new haze filter is ready.

I am currently using LR5.7 and will wait for the update before upgrading.

Ian

New Here, Jun 09, 2015


Just to add my 2 cents, LR CC is a performance nightmare for me too. I have been waiting 2 hours for it to import 669 pictures, convert them to DNG, and render 1:1 previews, and it still looks like there is another 30 minutes of rendering to go. Everything is just painful compared to previous generations of Lightroom; it has actually put me off photography, as I just can't bear to use LR any more.

Specs:

i5-4690K @ 3.5 Ghz, 16GB RAM, GTX 770 GPU (have tried with it enabled or disabled)

Please address this issue Adobe, I am fed up and looking at alternatives to your software.
