
GPU notes for Lightroom CC (2015)

Adobe Employee, Apr 25, 2015


Hi everyone,

I wanted to share some additional information regarding GPU support in Lr CC.

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
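For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch (the pixel dimensions are the usual ones for each display class, not anything specific to Lightroom):

    # Back-of-the-envelope pixel counts for common display classes.
    # The pixel dimensions are the usual values for each class (an assumption,
    # not anything measured inside Lightroom).
    displays = {
        "Full HD (1920x1080)": (1920, 1080),
        "MacBook Pro Retina 15-inch (2880x1800)": (2880, 1800),
        "4K UHD (3840x2160)": (3840, 2160),
        "5K (5120x2880)": (5120, 2880),
    }

    hd_pixels = 1920 * 1080
    for name, (w, h) in displays.items():
        megapixels = w * h / 1e6
        print(f"{name}: {megapixels:.1f} MP ({w * h / hd_pixels:.1f}x Full HD)")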

For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.

So why doesn't everything feel faster?

Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.

First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).

Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
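As a rough illustration of that transfer cost (the image size and the bandwidth figure below are assumptions for illustration, not measured Lightroom numbers):

    # Rough estimate of the time needed just to copy one full-resolution image
    # from system memory to the GPU. The pixel format and the effective PCIe
    # bandwidth are both assumptions for illustration.
    width, height = 7360, 4912            # e.g. a 36 MP raw file
    channels, bytes_per_channel = 3, 4    # RGB at 32-bit float per channel (assumed)

    image_bytes = width * height * channels * bytes_per_channel
    effective_bandwidth = 6e9             # ~6 GB/s effective PCIe throughput (assumed)

    transfer_ms = image_bytes / effective_bandwidth * 1000
    print(f"~{image_bytes / 1e6:.0f} MB per image, ~{transfer_ms:.0f} ms just to upload it")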

Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.

Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work well with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.

So let's clear up what's currently GPU accelerated in Lr CC and what's not:

First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance for those functions should be the same regardless of whether the GPU is enabled or disabled in the preferences).

Within Develop, most image editing controls have full GPU acceleration, including the Basic and Tone panels, panning and zooming, crop and straighten, lens corrections, and the graduated and radial filters. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.

While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.

Summary:

1. GPU support is currently available in Develop only.

2. Most (but not all) Develop controls benefit from GPU acceleration.

3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.

4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.

5. Prefer newer GPUs (faster models from the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better.

6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.

The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.

Eric Chan

Camera Raw Engineer

Community Beginner, May 15, 2015


Sure, I'll send you something when I get home tonight. I can't give out any of my clients' photos, but I can give you a photo from the Pentax 645Z. I do photo stacking and several layers in Photoshop (background, piece of jewelry, reflections, shadows, retouches, painting, cloning), and I keep everything separated in layers, so these are sometimes 1.5 GB TIFF files of about 9000x6000 pixels at 16 bits. But Lightroom does not read the layers, just the flattened render of all those layers together.

Here you can see how it behaves, and by the way, it behaves that badly even on a regular DNG straight from the camera:

https://www.youtube.com/watch?v=m1VisL8dF3Y

and here is 5.7.1 with the same

https://www.youtube.com/watch?v=7EEFjurtwxE

and here you can see what I do exactly

https://www.facebook.com/victorw.photo

http://www.victorwolanskyphoto.com/#!The-invisible-neck/c1ti3/33755DA7-7AAA-4573-B838-4D52E62D71C9

New Here, May 15, 2015


Victor,

My experience is exactly the same as your videos.

Mac Pro Late 2013, 64 GB of memory, and the FirePro 500 with 3 GB.

Mike

Enthusiast, May 15, 2015


Playing games cannot be compared to moving uncompressed UHD pixels. Even though I think LR CC is the worst version ever, we have to be fair and compare apples to apples. Games use very small textures, and things are usually rendered on the GPU itself: shaders simulate things that are quickly processed on the GPU to "create" them, so all the leverage is done by the GPU.

In image editing, nothing is being created or simulated on the GPU. Pixels need to be transferred from your HDD, decompressed from the raw file, and sent to the GPU, and you are constantly moving uncompressed images. Now, LR does this very poorly, because I use other software that does color correction on UHD in real time on the same GPU I have for Lightroom, so I can assure you LR CC is not using it correctly, and I'm talking about 30 frames per second of uncompressed UHD. You could have a much lesser GPU and still have good performance in games, since things are being created by the GPU. It's a totally different game. On the system I use, called Flame, we can load shaders onto the GPU and create super crazy 3D things in real time; it's just a totally different process than editing images.
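To put that 30 fps of uncompressed UHD in perspective, here is a rough back-of-the-envelope sketch of the data rate involved (the 16-bit RGB pixel format is an assumption, since the exact format isn't stated):

    # Rough data rate for streaming uncompressed UHD frames in real time.
    # Assumes 16-bit RGB; the actual pixel format used is not stated above.
    width, height, channels, bytes_per_channel, fps = 3840, 2160, 3, 2, 30

    bytes_per_frame = width * height * channels * bytes_per_channel
    stream_rate_gb_s = bytes_per_frame * fps / 1e9
    print(f"~{bytes_per_frame / 1e6:.0f} MB per frame, ~{stream_rate_gb_s:.1f} GB/s at {fps} fps")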

Sent from the Moon

New Here, May 15, 2015


I don't often post anywhere, but I did some extensive tests, and I hope you all find this useful: my real-world experience and testing with a higher-end rig/GPU for LR6.

SPECS (skip if you don't care for it):

LR6 64 bit

Windows 7 Business X64

Processor speed: INTEL i7 3.9 GHz

Built-in memory: 32675.7 MB (RAM)

Real memory used by Lightroom: 1468.1 MB (4.4%)

2x DELL 2560x1600 (u3014 model)

2x GeForce GTX 980/PCIe/SSE2 in SLI (meaning both cards are piping max 3D performance to the primary screen, although I do not know how LR6 handles SLI)

GFX turned on vs. CPU-only processing

- Longer load times for images vs. quicker loads

- INITIAL brushes: stuttering for the first brush strokes, delay in showing the effects vs. instant brushing and application of effects

- As I apply the first brush strokes, like a dodge/burn in continuous strokes of the first brush: effects catch up to almost real time with just a slight delay vs. consistently instant brushing and application of effects

- As I add NEW brush strokes/graduated filters/effects on top of existing ones: effects are now instantaneous with no lag or delay vs. effects that now show a delay

- As I do really extensive brushing (tested to almost 50 effects/brushes/sliders) over the entire image: effects are still instantaneous and lightning fast vs. effects that show a noticeable delay

Conclusion:

With my higher-end rig, GFX turned on has many cons during the initial stages of editing, and there is noticeable lag. CPU-only is fast during the initial stages. As you do more edits, even after 5-7 different brush-stroke applications, you see the speed difference. GFX-assisted does not stutter and remains fast/instantaneous no matter how many edits I throw at it. CPU-only starts to slow down around 5-7 brush strokes applied to the entire image. As it gets into a super-intensive (almost unrealistic) edit, GFX remains fast while CPU has delays.

My opinions (coming from a wedding photographer perspective):

Will I use GFX assist? For now, yes, but it is annoying with just small edits, mass culling, and/or basic editing of a wedding shoot (~6,000 images).

Best use, if you can remember: GFX off during culling and basic global adjustments; GFX on during select edits and when you really get into it but don't want to open up PS. However, it's not for me, and I hate having to go in and out of menus to enable/disable it.

If Adobe can minimize or eliminate the initial lag/delay/overhead that GFX assist causes, then you have one hell of an engine. I am excited to see how Adobe develops this tech, and hopefully it'll be patched soon to perform better. If Adobe can fix this issue and carry it over into PS, it would be even more awesome, and I would love the increased speed when heavily editing an image.

Should you go out and buy or upgrade your GFX card? HELL NO. At least not in Adobe's current iteration of its software. If Adobe can make the initial stages with GFX on as fast as CPU-only, then yes, go get a GFX card. But until Adobe proves it can fix the issue, CPU-only will be enough for the casual user confined to LR.

CAVEAT: My dual graphics cards in SLI were purchased only 6 months ago and sit at the #2 spot among available GFX cards. So although I applied 50 brush effects/changes over the entire image, overlapping each other, as fast as I could, and did NOT feel any slowdown, your experience may vary with a lower-end card. Or not. I tried to monitor my GPU usage through a software application, but it wasn't showing any changes in GPU use % when I did the edits. However, I did hear my GPUs' fans spool up as I did the edits and slow down when I didn't apply effects, so I know they were working. (My cards aren't quiet, as they each have 3 fans directly cooling them.)
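For anyone who wants harder numbers than fan noise, here is a minimal sketch of one way to watch GPU utilization, clock, and temperature while brushing; it assumes an NVIDIA card with the driver's nvidia-smi utility installed and has nothing to do with Lightroom itself:

    # Poll NVIDIA GPU utilization, graphics clock, and temperature once per
    # second via nvidia-smi. Assumes an NVIDIA card with driver tools installed.
    import subprocess
    import time

    QUERY = "utilization.gpu,clocks.current.graphics,temperature.gpu"

    def sample() -> str:
        result = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    if __name__ == "__main__":
        # Leave this running in a terminal while applying brushes in Develop.
        while True:
            print(sample())
            time.sleep(1)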

Hope this helps everyone!  Hit LIKE below so other people searching can find this helpful.

www.conroytam.com

Keywords: Graphic acceleration lightroom CC, lightroom 6, LR6, LRCC, graphic processor acceleration lightroom, photoshop CS6

Community Beginner, May 15, 2015


Okay, I did some magic. I don't know why, but GPU acceleration does work now, and it's multiple times faster after the first tests (cropping, treatment sliders, etc.).

What I did:

While on Nvidia driver 344.x

- Disabled the Full HD TV and the Full HD monitor, leaving only the UHD monitor. No effect.

- Tried setting "single display performance mode" instead of multiple in the 3D settings. No effect.

- Re-enabled the disabled TV and secondary monitor.

- Updated GeForce Experience and the graphics driver to the latest version (350.12) -> after that, multiple display performance mode was activated automatically. Now Lightroom uses the GPU! I performed similar steps before and it didn't help then; I don't know what exactly helped this time, but it works and I will not touch anything now.

I have a huge performance boost in UHD with the GPU enabled.

Community Beginner, May 15, 2015


P.S.

Graphics Processor Info:

GeForce GTX 970/PCIe/SSE2

Check OpenGL support: Passed

Vendor: NVIDIA Corporation

Version: 3.3.0 NVIDIA 350.12

Renderer: GeForce GTX 970/PCIe/SSE2

LanguageVersion: 3.30 NVIDIA via Cg compiler
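Those strings come straight from the OpenGL driver. For the curious, here is a minimal sketch of how any program can read the same vendor/renderer/version values; it uses the glfw and PyOpenGL packages, which are my own choice for illustration and have nothing to do with Lightroom's internal check:

    # Read the same OpenGL vendor/renderer/version strings that Lightroom
    # reports, using glfw for a context and PyOpenGL to query the strings.
    # Requires: pip install glfw PyOpenGL
    import glfw
    from OpenGL import GL

    if not glfw.init():
        raise RuntimeError("could not initialize GLFW")

    # Create an invisible window just to obtain a current OpenGL context.
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
    window = glfw.create_window(64, 64, "gl-info", None, None)
    glfw.make_context_current(window)

    print("Vendor:  ", GL.glGetString(GL.GL_VENDOR).decode())
    print("Renderer:", GL.glGetString(GL.GL_RENDERER).decode())
    print("Version: ", GL.glGetString(GL.GL_VERSION).decode())
    print("GLSL:    ", GL.glGetString(GL.GL_SHADING_LANGUAGE_VERSION).decode())

    glfw.terminate()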

Engaged, May 15, 2015


Del_Pierro,

I'm glad it worked for you!

New Here, May 18, 2015


It takes more than 20 seconds for Lightroom 6 to finish rendering a 16MP raw file on my relatively new, relatively high-end PC. I've tried 16MP files from three different cameras, and the performance is about the same in all 3 cases. I've tried with GPU enabled and disabled, with different preview sizes selected, with a clean install of the latest drivers for my graphics card, all manner of different system graphics settings and combinations, and nothing seems to help. I've even been in contact with NVidia support. They've had me try a couple of different things (clean driver install, change power management setting), also to no avail.

I'm running a 64 bit Win 8.1 PC with a 6-core Haswell-E i7 (5280K) processor, an NVidia GTX970, 16GB of RAM, OS and applications on an SSD. I have two monitors, one is 4K, the other 2K. I've disabled the 2K monitor, but that doesn't help. At best, after messing around with settings, I may get better performance for a few minutes, but things slow to a crawl again in no time.

As I mentioned, it takes more than 20 seconds (often more than 25 seconds) to finish rendering a 16MP file in the develop mode when the develop window is on my 4K monitor. When I move the development window to my 2K monitor, it still takes around 15 seconds to finish rendering an image. From time to time I have also seen the image load up completely sharp, then go blurry after a couple of seconds and take another 20 or so seconds to finish rendering again.

The performance is also unbearably slow when zooming into an image, as well as when making a number of different edits. After editing or zooming (develop mode on 4K monitor), the application stutters, sometimes goes completely black and sometimes flashes an image I had loaded up previously before displaying the image I was just working on. Various changes take forever to display - adjusting WB, for instance, sometimes takes multiple seconds before the adjustments are shown. When backing out a WB change (ctl+Z), I often see the image cycle back to the previous WB (correct) then cycle forward to the just-undone WB (incorrect), then cycle back to the previous WB again.

Certain adjustments cause parts of an image to appear strangely blurry for a few seconds before sharpening up again - imagine looking at an image of a fir tree that's sharp, then, all of a sudden, the fir tree looks more like deep green algae in a pond for a few seconds before sharpening back up - a bizarre effect I've never seen before.

In short, the application is all but unusable for me. I'd go back to 5.7, but Adobe hasn't added support for the Olympus E-M5 II, which I've started using recently, so I'm stuck either waiting for Adobe to fix this performance issue or shopping around for another solution. I've been using Lightroom since version 4, and I'd like to stick with it, but it's simply unworkably slow on my modern PC at the moment.

New Here, May 18, 2015


I have some good news to report. At least right now, I'm seeing much better performance from LR6 with the same hardware and files as before.

I had been considering overclocking my system, but hadn't gotten around to it. My motherboard and graphics card are both from MSI, and the PC came with a bunch of MSI-specific overclocking software. I had opened it up shortly after first having received my PC in January, but had decided to hold off on overclocking for a bit.

Today I decided to take a look at the overclocking options again. I started out with the GPU. I used the MSI Afterburner application to create two new overclocking profiles - a sort of medium profile and a more extreme one, based off of a tutorial I read online. I tested LR6 with the extreme profile, and files that had taken 25 seconds to render previously were now rendering in 5-8 seconds. Not instantaneous, but a real improvement, and numbers that I can live with.

The interesting part, however, is that I re-enabled the default profile for my GPU (not overclocked) and tested LR6 again. The performance was almost as good as it was with the extreme overclocking profile enabled. Maybe a second or two slower to render an image than when the extreme profile was enabled, but nowhere near as slow as it had been previously.

I went ahead and took the opportunity to turn on the MSI Overclock Genie (a hardware-switch-enabled automatic CPU overclocker on my MSI motherboard) as well. It did boost the system's performance (3.3 GHz to 3.8 GHz on my 6 CPU cores), but I'm not sure whether there was really a noticeable improvement in LR6 performance.

It seems that there was something awry with the configuration of my GTX 970 that was reset when I overclocked the GPU, and it seems that that something remains more or less fixed even after "de-overclocking" my GPU. Something to consider for those who are seeing sluggish performance from LR6 with higher-end graphics cards.

Community Beginner, May 18, 2015


Good for you! I'm using much higher-end cards and computers, and performance is still horrendous.

New Here, May 19, 2015


I think the interesting takeaway from my experience is that the key to a massive improvement in performance on my PC seems to have been to monkey with the overclocking settings.

Again, I haven't measured with super precision, but the performance improvements remained in effect even after I set the overclocking settings back to default. It seems there may have been (and might be for others) some out-of-the-box GPU configuration problem that was somehow reset or fixed when I played with the clock settings on the GPU (core voltage, power limit, core clock, and memory clock, in my case). For anyone with a GTX 970, the aggressive overclocking settings I am currently using (with no negative issues so far) are Core Voltage = +87, Power Limit = 110, Core Clock = +150, and Memory Clock = +500. I also set a more aggressive fan profile to make sure things don't heat up too much, but my fan has remained pretty quiet, and the GPU temperature has stayed in a safe zone (max 60 degrees C) since I overclocked the card.

The only other thing I changed between having poor performance and having improved performance was the "Change the size of all items" slider in Windows 8.1 (found in Control Panel -> Appearance and Personalisation -> Display). I'd had it set one or two ticks to the right of the leftmost position. I pushed the slider all the way to the left and applied the change.

Two things that may be worth looking at for those who are still seeing major performance issues. LR 6 remains comparatively snappy for me today, and I'm still pushing two monitors with my GTX 970 - one 4K monitor with the development module on it and one 2K monitor with the second LR window in grid mode.

tex

Community Beginner, May 19, 2015


One should not have to overclock a PC to get good performance. To start with, it's not for the average user; second, you can badly damage your computer if you don't know what you're doing. LR should work fine by itself right after installation, like any other software.

Sent from the Moon

New Here, May 20, 2015


victorwolansky wrote:

One should not have to overclock a PC to get good performance.

I agree with this. The point I am trying to make, but perhaps not making clearly enough, is that I'm seeing real performance improvements after turning off the overclocking.


Which makes me think maybe it's an NVidia issue: like some config value is not set by default, but if you go in with an overclocking interface, make a few settings changes, apply the changes, then reset the settings (thereby disabling any overclocking), the config value gets set.

It's mostly a tip that others may wish to try. The point for me is that it doesn't seem to have been Adobe's fault in my case. It's very likely not Adobe's fault in your case, either.

Enthusiast, May 19, 2015


Well, it looks like my last response to you was censored; it never showed up... I hope it's just a glitch in the system and not moderators trying to shut me up. What I said was, more or less: good that it worked for you, but it should not be something we have to do. Overclocking a machine is not for everyone; you can seriously damage a machine by doing it if you do not know what you're doing, and LR should work correctly out of the box without a "technician" tuning your machine.

Explorer, May 18, 2015


This is my experience. I have a 4K monitor (Dell UP2414Q) and used to run it with a Mac Mini Late 2014 (the midrange model) and LR 5.7, and everything was fine: as fast as my 4-year-old laptop with a Full HD screen. So there was no particular issue related to 4K (the only thing is that the Mini could only do 30 Hz, not 60 Hz, but that's no big deal for photo editing).

Now I have replaced the Mac Mini with the latest MacBook Pro 13" (Early 2015) and LR 5 with LR 6, same screen.

Overall it is a big step back in terms of speed; I find it ironic that speed is the main selling point for LR 6. Flipping through pictures takes forever, and when you are back from a trip and have 2,000 pics to sort/select/discard, that's many hours added to the workflow.

Many things are slower, while some are just not usable, like the brush for the gradient and the graduated filter. I'm not sure what the best setting is for my hardware; I have to try with the GPU disabled.

It took longer than usual to release this version of LR. Not many features were added, and the workflow is slower. I hope it will get better soon. In the meantime, I cannot use LR 5 now that my catalog has been migrated to LR 6, right?

Community Expert, May 18, 2015


It took longer than usual to release this version of LR. Not many features were added, and the workflow is slower. I hope it will get better soon. In the meantime, I cannot use LR 5 now that my catalog has been migrated to LR 6, right?

Your LR 5 catalog should still be there and you should be able to use LR 5 just fine with it. When you installed LR 6 it created an upgraded copy of your catalog and left the LR 5 version alone. It just won't have any of the imports and changes you did in 6.

Community Beginner, May 18, 2015


New NVidia driver installed today - Lightroom disabled GPU support immediately. I'm giving up on this program and won't buy it after the trial expires until this mess is fixed; life is too short.

Enthusiast, May 18, 2015


> New NVidia driver installed today - Lightroom disabled GPU support immediately. I'm giving up on this program and won't buy it after the trial expires until this mess is fixed; life is too short.

Not sure why you install "Game Ready" drivers. To game? Non-gamers need to use production versions unless directed by NVidia to use a specific later version... For example, for my config the most recent non-"Game Ready" (read: quick fixes catering to gamers) driver is:

GeForce 341.44 Driver WHQL, February 24, 2015


It works and Adobe's code works (slow, yes, but then we know whom to blame, and that's not NVidia in this case).

Community Beginner, May 18, 2015


Well, it's a driver for The Witcher 3, which I plan to start playing tomorrow. I am pretty sure this driver does not disable OpenGL support, so I have no idea why Lightroom is whining again. I tried the last WHQL driver too in the past, and it didn't work. Actually, the reasons why GPU support works or not seem totally random to me.

Community Beginner, May 18, 2015


Del_Pierro,

No offence but you are trying to game and work on the same PC.

And news flash ... Adobe took the gloves off of LR.

Witcher 3 is based on DirectX, and Lightroom is about OpenGL/OpenCL.

These use completely different graphics rendering drivers.

Just don't get why users think gaming graphics cards should have the muscle to do professional graphics rendering.

And no, I am not an Adobe fanboy. Just a working pro providing solid results for my clients.

I do believe Adobe has some issues with installation and minor lag-time optimization... and Victor's issue is crazy for sure.

just my 2 cents....

Good luck

-Mark

Community Beginner, May 18, 2015


Mark,

No offense taken; I appreciate your opinion. I just don't see anything wrong with gaming and working on the same PC. Most PC Lightroom users have consumer graphics cards, I assume. I know that OpenGL and DirectX are different, but the card accelerates both (and many games run on OpenGL, too).

But all of this is not leading us anywhere. Lightroom has significant problems in its current state. The performance is sufficient for me when GPU support is enabled, so I am just complaining about the very poor implementation/detection of the graphics hardware.

Cheers,

Pierre

Community Beginner, May 18, 2015


Pierre,

Completely hear you on the GPU detection.

Good hunting on the Witcher!

Enthusiast, May 18, 2015


mark-FOTO (https://forums.adobe.com/people/mark-FOTO), having a Quadro card makes no difference. In fact, many gaming cards have plenty of muscle, a lot more than some Quadro cards, where you pay a lot more money for a supposedly better driver but the hardware is pretty much the same. I have 3 different Quadro cards at my office, all of the computers are HP Z820s, and Lightroom is incredibly slow on all of the Quadro cards, which are professional cards. So that makes absolutely no difference, and not everyone can buy $1K+ cards.

Community Beginner, May 18, 2015


I am on your side Victor.

What you have going on is crazy and I would be pulling my short hair out TOO!

Lots of issues are happening on what I would call low- to mid-level gaming cards paired with 4K single/multi-display setups.

Not a good match. People buy a nice monitor, and yet they expect these cards to work wonders.

It would be like pairing a sweet Porsche engine with a Ford Pinto transmission.

I hope Adobe gets a handle on things for everybody's sanity.

Even though my modest machine is 95% good to go with LR6.

The 5% being little glitches here and there.

I downloaded Capture One because of the issues with LR.

Not sure if I will change over... interesting software, though.

I am still using Photo Mechanic for culling because Adobe cannot figure out that fast image generation in the Library module.

Here's to the coming week or so... I hope the Adobe LR software crew cleans everything up.

-Mark

New Here, May 18, 2015


mark-FOTO wrote:

Del_Pierro,

Just don't get why users think gaming graphics cards should have the muscle to do professional graphics rendering.

And I quote, from the horse's mouth:

Suggested graphics cards:

...

  • For NVIDIA cards, consider a card from the GeForce GTX 760+ line (760, 770, 780, ...) or from the GeForce GTX 900 series.

What was that you were saying?

* For those who aren't familiar with the listed cards, they're the very "gaming graphics cards" that Mark doesn't think should have the muscle to "do professional graphics rendering". And they happen to be the exact cards Adobe officially recommends for use with Lightroom 6.
