
GPU & Multiprocessor acceleration

LEGEND, Jun 10, 2011

It would be great if Lightroom 4 had GPU support for CUDA-enabled video cards, similar to the Mercury Playback Engine in CS5. That would really speed up performance! Every couple of seconds helps when you are editing thousands of files.

Saving 2 seconds between images comes out to about an hour of saved time when you are editing 2000 images. I have 12 GB of RAM and 12 processor cores at my disposal, and sometimes I wait 4 seconds between images.

Multiprocessor support would be great as well!
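The arithmetic behind that estimate holds up; a quick check (the 2-second and 2000-image figures are the ones from the post above):

```python
# Quick check of the time-savings arithmetic in the post above.
seconds_saved_per_image = 2
images = 2000
total_hours = seconds_saved_per_image * images / 3600
print(f"{total_hours:.2f} hours saved")  # -> 1.11 hours saved
```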

Idea Released
Topics: macOS, Windows
Views: 863

75 Comments
Contributor, Jun 05, 2012

But we're talking about Lightroom here, not Photoshop.

LEGEND, Jun 06, 2012

"GPU Support is not magic fairy dust that would make all aspects of Lightroom faster"
Who teaches you guys how to handle customer recovery efforts?

New Here, Jun 10, 2012

In my experience, Lightroom (LR) 4.1 is still not fully optimized. Increasing the LR 4.1 cache to 50 GB was a big help. No other graphics/video-intensive application I use taxes system resources the way LR 4.1 does.

NEF files converted to DNG in LR 4.1 can take up to twice as long as the original NEF files to load in the Library and Develop modules. The DNG files also take 1.5 to 3 times longer to load in 4.1 than in LR 3.6.

Although LR 4.1 is much better than version 4.0, there is still a delay when using the sliders in the Develop module that is not present in LR 3.6 with the same files, more so with DNG than NEF files.

Intel Core 2 Quad Q6700
8 GB RAM
Win7 x64 OS
Nvidia GeForce 9500 GS
2 × 1.5 TB HDs: OS on one, LR and photos on the other.

New Here, Feb 17, 2013

And the difference is? Both take an image and render it.
If it really is a rendering-engine issue, and Adobe used two different rendering engines, why reinvent the wheel? Yoink the Photoshop engine that has the GPU capability, stuff it into Lightroom, and add an enable/disable option for those who want or must have software rendering.
Lightroom is supposed to be awesome with photos. It's not. It's slow, and has been since day one. It gets better, but I can render a 200-inch, 2 GB Illustrator file in Photoshop in the time it takes to import 1000 photos at 1:1 preview.

Explorer, Mar 22, 2013

I would expect Adobe to keep improving the performance of their products by, for example, making better use of GPU capabilities, so I think this request is rather pointless.

LEGEND, Apr 03, 2013

Lightroom uses less than half of a PC's computing power because it doesn't use the GPU. That is not a solution, it is a PROBLEM! And the result of this problem is slowness.

The GPU, with its very fast RAM, is waiting for Lightroom to use it!

LEGEND, Apr 04, 2013

Why not draw on the experience of the Premiere Pro CS6 and Photoshop CS6 developers? They did successful work using GPU computing power to speed up their software.
The problem with Lightroom is that it could be many times faster, but it is not.

New Here, Apr 04, 2013

As I posted in another thread, if you actually disable modules that you don't use (not just hide them from the menus), you may significantly increase Lightroom's responsiveness. It's no GPU solution like Capture One offers, but it definitely helps.

I just updated to 4.4 and used it for a day without any modifications; it showed zero improvement over 4.3 in performance and was in fact significantly slower than what I had grown used to with my modded 4.3.

Creating a "disabled modules" folder and moving the Book, Layout, Print, Slideshow, and Web .lrmodule files into it let me get back to proper, snappy (for Lightroom) performance.

Note: if you need these features, you're out of luck with this trick. But I don't need to make books, prints, or web galleries from Lightroom. I use it to catalog images and metadata tags, make corrections, and export the images for production in software built for production, whatever the task may be.
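For anyone who wants to script that workaround, here is a minimal sketch. The install path varies by Lightroom version and OS, so the example paths are assumptions; back up the files, and expect to move them back before updating.

```python
# Sketch of the "move the .lrmodule files aside" trick described above.
# The Lightroom install path is an assumption -- adjust for your system.
import shutil
from pathlib import Path

def disable_modules(modules_dir: Path, disabled_dir: Path,
                    names: list[str]) -> list[Path]:
    """Move the named .lrmodule bundles into a holding folder so Lightroom
    skips loading them. Returns the paths that were actually moved."""
    disabled_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for name in names:
        src = modules_dir / f"{name}.lrmodule"
        if src.exists():
            dest = disabled_dir / src.name
            shutil.move(str(src), str(dest))
            moved.append(dest)
    return moved

# Example call (Windows-style path is an assumption):
# disable_modules(
#     Path(r"C:\Program Files\Adobe\Adobe Photoshop Lightroom 4"),
#     Path(r"C:\Program Files\Adobe\disabled modules"),
#     ["Book", "Layout", "Print", "Slideshow", "Web"],
# )
```

Moving the files back to `modules_dir` reverses the change, which matches the "quite reversible" observation later in the thread.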

Explorer, May 05, 2013

As I understand it, Nvidia has applications developed to leverage the power of the GPU to speed up certain specialized tasks. With the progression of camera resolution (the Nikon D800 and high-megapixel medium-format cameras), computer hardware seems to be lagging behind in all but the most cutting-edge consumer-accessible machines. Building Lightroom to take advantage of already-present hardware can lower the hardware barrier to entry for much of the consumer market. My computer was built on a Core i7 platform with 12 GB of RAM and SSDs, and it struggles with 40-50 MP raw files. If progressively higher resolution is the future of multimedia and hardware is the bottleneck, we need alternatives for achieving consistent, reasonable performance with current technology.

LEGEND, Sep 11, 2013

Pathetic that this was opened over 2 years ago... and LR5 still has no GPU acceleration! And it's still slow as &$/*"

New Here, Dec 06, 2013

Today, and even more in the future, photos have to be edited as quickly as possible. That is what worries most professional photographers; delivering work on schedule sometimes becomes a drama. Thanks to technological progress we have very fast computers at our disposal, so hardware is no longer a concern for us: current computer hardware is at a very good level for photo editing. What happens in cases like Adobe's Photoshop Lightroom is that there seems to be no willingness to follow the needs of those who live with client deadlines. Everyone knows that Lightroom has problems with delays and slowness in processing raw files. I hope you consider these needs and deliver a product of excellence, so professionals can take pride in working with Adobe products. Sorry for my English.

LEGEND, Dec 16, 2013

My extreme sympathies to Chris Cox for all the non-listening users posting here and not following his simple instructions. As soon as I have a bonafide performance issue I will post it here and I promise not to demand you take a GPU path to fixing it.

Of course, with the Mac Pro 2013 due out any time, I wonder what steps Adobe will take to utilize it, if any, in Lightroom. It's a heck of a lot of power, and seems it will be a model of pro desktop design going forward.

Thanks Adobe for making awesome products (so glad I don't work there anymore...)

LEGEND, Jan 01, 2014

I'd also like to throw my wish for GPU acceleration into the balance. With APUs and hUMA here, I can't understand why there isn't even experimental, elementary support. I'm still on Lightroom 3 and will buy the first new version that is significantly faster when browsing through images.

I guess you know about the raw editing software darktable. It's coded with very little manpower, and they got a significant speed-up from OpenCL (link: http://www.darktable.org/2012/03/dark... ). I tried it, and it's really fast and stable at the same time (at least for me).

I don't want to only complain... I really like working with Lightroom.

LEGEND, Jan 01, 2014

The difference is that in Photoshop, displaying an image is just compositing and color conversion. In Lightroom or Camera Raw it means reading a file, demosaicing, applying a lot of color conversions, and possibly applying a bunch of filters and corrections. While Photoshop and Lightroom have a lot of math in common, they are used differently, and making Lightroom/ACR efficient on the GPU isn't nearly as easy as it sounds.
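A toy sketch of that difference (hypothetical helper names, nothing from Adobe's actual code): the "baked" path touches the pixels once, while the parametric path repeats the whole develop chain on every redraw.

```python
# Toy model of the two display pipelines described above.

def composite(pixels):
    # Stand-in for the final display step: here, just sum the values.
    return sum(pixels)

def demosaic(sensor_values):
    # Stand-in for color reconstruction: interpolate neighboring samples.
    return [(a + b) / 2 for a, b in zip(sensor_values, sensor_values[1:])]

def show_baked(pixels):
    # Photoshop-style: pixels are already rendered; display is one cheap step.
    return composite(pixels)

def show_parametric(sensor_values, exposure, corrections):
    # Lightroom/ACR-style: every redraw starts again from the sensor data.
    rgb = demosaic(sensor_values)
    rgb = [v * exposure for v in rgb]   # parametric tone adjustment
    for fn in corrections:              # lens, noise, sharpening, ...
        rgb = [fn(v) for v in rgb]
    return composite(rgb)
```

Moving a slider changes `exposure` or `corrections`, so the parametric path cannot simply cache the final pixels the way the baked path can.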

LEGEND, Jan 02, 2014

This could be sped up by exporting several pictures of the batch at once: more threads and more use of the CPU, with no lags while reading from or writing to the disk, etc.
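A sketch of that batch idea, with a hypothetical `export_one` standing in for the real per-image work. Threads are used here on the assumption that the heavy decode/encode steps run in native code that releases the GIL, as image libraries typically do:

```python
# Export a batch by processing several images concurrently.
from concurrent.futures import ThreadPoolExecutor

def export_one(name: str) -> str:
    # Stand-in for the real per-image work: decode, develop, encode, write.
    return name.replace(".nef", ".jpg")

def export_batch(names: list[str], workers: int = 4) -> list[str]:
    # Process several images of the batch at once; map preserves input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(export_one, names))
```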

LEGEND, Jan 02, 2014

But why, then, does comparable software get those massive boosts from OpenCL? It does the same tasks, while being even less multithreaded because less effort can be applied to the code. Another thing: I noticed there are 3 pictures rendered ahead, which I welcome. Could there be an option for how many previews are rendered ahead? Thanks for looking in here 😉

Engaged, Jan 02, 2014

".....and making Lightroom/ACR efficient on the GPU isn't nearly as easy as it sounds."
I can imagine that Chris, but.....:
"We choose to go to the moon in this decade and do the other things. Not because they are easy, but because they are hard" 🙂

LEGEND, Jan 02, 2014

Bad code can easily get a boost from OpenCL (or vectorization, or threading), but code that is already highly tuned will see much less benefit.

Lightroom/ACR has been doing the hard work of optimizing its code, and GPU support doesn't necessarily provide a huge boost at such a low cost as it does for simpler, less optimized applications. Again, the Lightroom/ACR teams are trying to get the best performance available, by whatever means are available. But that task is not nearly as straightforward as you seem to imagine. And when it comes to using GPUs, the task is complicated by all the horrid video card drivers out there.
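The diminishing-returns point can be made concrete with Amdahl's law: if only a fraction p of the runtime can be offloaded, the overall speed-up is capped at 1/(1-p) no matter how fast the GPU is. The fractions below are illustrative numbers, not Lightroom measurements:

```python
def amdahl_speedup(parallel_fraction: float, factor: float) -> float:
    """Overall speed-up when `parallel_fraction` of the runtime is
    accelerated by `factor` and the rest is unchanged (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / factor)

# An unoptimized app with 90% offloadable busywork vs. a tuned app
# where only 40% of the remaining runtime can go to the GPU.
# Even with the same 20x GPU kernel, the tuned app gains far less.
unoptimized = amdahl_speedup(0.9, 20.0)  # ~6.9x
tuned = amdahl_speedup(0.4, 20.0)        # ~1.6x
```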

Engaged, Jan 02, 2014

Chris, if doing the actual raw conversion on the GPU is so complicated, would it be as complicated to do only the image display with the help of the GPU? Then we could get the smooth and fast image display that Photoshop offers, with all the goodies like flick-panning, animated zoom, bird's-eye view, rotation, etc.
IMO the whole image navigation in LR and ACR is really bad and slow compared to Photoshop.

Participant, Jan 14, 2014

Well, do you hear that, Adobe? There are already alternatives. It is about time to do something about it!

Participant, Jan 15, 2014

I wonder if this is still true for LR 5.3; I really don't use those modules either.

And as I understand it, the change is quite reversible if needed.

My workstation is currently due for an upgrade, so I can't check this solution's effectiveness. 😞

Participant, Jan 15, 2014

Hi everyone,

I find every LR feature just great; however, I believe it is crucial for all of us to get performance.

What LR is still missing:
- shortcut-key mapping as in PS (especially an option to map keys to custom presets)
- a performance boost, especially for retouching with LR tools

The latter has been the topic of this discussion since it began 2 years ago...

I found the following post from a dev (I guess):
http://forums.adobe.com/message/4129233 Post #17

"GPU accelerated screen redraw in PS was a significant effort and required many months of development and testing."

Yeah! Two years have passed since that post, and on LR CC we are still awaiting it.

The explanation:
"Even if we did that, it wouldn't really help the user experience because our pipeline is different than in Ps (in Ps the pixels are effectively baked; in ours, we do all our rendering on-the-fly since the editing is parametric -- our bottleneck is NOT the display path!)."

If it is NOT, then please DO optimize the code performance-wise. I find it unacceptable to experience lag with 10 MP raw files on my i7 machine (3770K OC @ 4 GHz, 2×8 GB of fast 1600 MHz RAM, and an SSD). That is 10 MP, not 20 MP; let us not even mention higher resolutions.

Every time I retouch blemishes and skin with the adjustment brush I get that LAG. When I am editing a single shot, PS is my weapon of choice. But for some projects I am paid by the number of shots delivered; then the correct tool is LR (or an LR alternative, if this isn't fixed soon enough).

It is time for a new and improved LR CC with better performance. In an age where 20 MP is the minimum and medium-format and hi-res images are the industry standard, LR brought the right tools. Now it is time to make them stop being sluggish.

Community Beginner, Jan 15, 2014

I agree it's time for some significant improvements in performance.

Personally, I find the explanation from Adobe that "it wouldn't really help the user experience because our pipeline is different than in Ps...our bottleneck is NOT the display path!" completely laughable. It shows, on the part of the person who wrote it, a complete lack of understanding of GPGPU capabilities and of what was done in MPE.

So do GPGPU programming solutions do what we need, and do they do it faster than CPU solutions? Yes and yes. We need on-the-fly rendering; GPGPU could do that (that is what MPE does). We need faster initial renderings for imported image previews; GPGPU could do that. We need fast queries on large databases to get all the parametric information for an image; GPGPU can do that.

Yes, there are some disk-access bottlenecks affecting speed, but there are areas where GPGPU could provide significant benefits.

I believe the real reason LR hasn't gotten any GPGPU implementation is the lack of any well-developed OpenCL or CUDA libraries for Lua (the underlying language of LR). I doubt Adobe will make the effort to write an OpenCL or CUDA library for Lua so they could put more effort into GPGPU features in LR.

Community Beginner, Jan 15, 2014

"...compression methods used in camera raw files do not lend themselves to parallel processing, slowing the pipeline."

Sounds like Camera Raw needs some work to open the pipeline to more parallel processing via CPU and/or GPU. I'm sure figuring out how to make a process that does not lend itself to parallel processing run in parallel will be quite the challenge, but it is definitely needed.
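One standard workaround when the compressed bitstream itself must be decoded serially: decode once, then split the decoded image into independent tiles and parallelize the expensive per-pixel work across them. A toy sketch, with a doubling operation standing in for real demosaic/correction math:

```python
# Serial decode, then tile-parallel per-pixel processing.
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile: list[int]) -> list[int]:
    # Stand-in for demosaic/corrections; each tile is independent.
    return [v * 2 for v in tile]

def process_image(pixels: list[int], tile_size: int = 4) -> list[int]:
    # Split the (already decoded) pixels into tiles and process in parallel.
    tiles = [pixels[i:i + tile_size]
             for i in range(0, len(pixels), tile_size)]
    with ThreadPoolExecutor() as pool:
        processed = pool.map(process_tile, tiles)
    return [v for tile in processed for v in tile]
```

The serial decode stays a bottleneck (Amdahl's law again), but the per-pixel stage, usually the bulk of the work, scales with core count.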

Participant, Jan 15, 2014

In addition to GPGPU features, there is some advice for speeding up LR that I found here:

http://feedback.photoshop.com/photosh...
