P: GPU & Multiprocessor acceleration

Inspiring
Released

  • June 11, 2011
  • 75 replies
  • 2254 views

It would be great if Lightroom 4 had GPU support for CUDA-enabled video cards, similar to the Mercury Playback Engine in CS5. That would really speed up performance! Every couple of seconds helps when you are editing thousands of files.

Saving 2 seconds between images comes out to about an hour of saved time when you are editing 2,000 images. I have 12 GB of RAM and 12 processor cores at my disposal, and sometimes I have 4 seconds between images.

Multiprocessor support would be great as well!
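
(A quick back-of-the-envelope check of the time-savings math above - purely illustrative numbers, nothing Lightroom-specific:)

```python
# Shaving 2 seconds per image across a 2,000-image edit saves roughly an hour.
images = 2000
seconds_saved_per_image = 2

total_saved_minutes = images * seconds_saved_per_image / 60
print(f"{total_saved_minutes:.0f} minutes saved")  # ~67 minutes, i.e. about an hour
```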

75 replies

Sean H [Seattle branch]
Known Participant
September 4, 2014
I don't know; like I said, they've continually added new features, such as perspective/lens correction, which are actually useful. The adjustment brushes and clone tools are quite close to perfect, though I'd rather have PS-like healing than LR's version.

That said, it seems like a PERFECT time for LR to really hit the 'optimize' button and dive deeper into parallelizing and look-ahead rendering, like what you see in FCPX when you idle the mouse (in FCPX my CPU and GPU are rarely idle, but the UX is snappy). I mean, look at a typical 1,000-shot export: run it as one stream and it takes 10 minutes, split it 500/500 in parallel and it takes 5 minutes, split it 333/333/333 and it takes about 3 minutes. Sure, the 3-up split does practically lock up the CPU, but I'm making coffee. This has been exposed and talked about for years... give me the TURBO button!
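
For what it's worth, that 10/5/3-minute split is just batch parallelism. A throwaway sketch of the idea (a toy model with worker processes, nothing to do with Lightroom's internals or SDK; the per-image render time is a made-up stand-in):

```python
# Toy model: splitting a 1,000-shot export across parallel workers cuts wall-clock
# time roughly in proportion to the number of workers. sleep() stands in for the
# per-image render cost, scaled down so the demo finishes quickly.
import time
from concurrent.futures import ProcessPoolExecutor

SECONDS_PER_IMAGE = 0.01  # hypothetical stand-in for real render time

def export_batch(files):
    # Pretend to render each file in the batch serially.
    for _ in files:
        time.sleep(SECONDS_PER_IMAGE)
    return len(files)

def parallel_export(files, workers=3):
    # Interleave the shoot into `workers` batches and export them concurrently.
    batches = [files[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(export_batch, batches))

if __name__ == "__main__":
    shoot = [f"IMG_{i:04d}.dng" for i in range(1000)]
    for workers in (1, 2, 3):
        start = time.time()
        parallel_export(shoot, workers)
        print(f"{workers} worker(s): {time.time() - start:.1f} s")
```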
VeloDramatic
Participating Frequently
September 4, 2014
Sean, to your point about volume: I've shot approx. 350,000 images since 2011, all RAW. I'd have no problem complying with a short list of certified hardware... it's that important.

LR began as a stealth development effort to address the workflow demands of high-volume shooting; that raison d'être seems to have been lost along the way. These threads stand as sorry testimony to the lack of progress in this critical area.
Sean H [Seattle branch]
Known Participant
September 4, 2014
I think one of the devs commented before about GPU support and how hard it would be to make it work across many different systems. So... have certified GPUs for the pro version of LR: I have dual R9 280Xs, which are OpenCL beasts.

BTW... http://www.musemage.com/ <-GPU
Known Participant
September 4, 2014
Right on, Sean and Michael. I don't know what the solutions are, but I'd pay a lot more for an LR Pro as well. Importing, screen redraw, and fast switching between images in the Library and Develop modules without interim previews are pain points for me. I have super-fast cards, card readers, the latest hardware, SSDs, etc., and the bottleneck is LR. Capture One and Photo Mechanic have clear advantages in these areas - not that I want to use them, but they prove the potential for this to work better. Would love to provide further feedback and help test...
VeloDramatic
Participating Frequently
September 4, 2014
Amen, Sean. I haven't commented on these forums for three years, and overall LR performance has waned. I'm routinely generating thousands of RAW files per day on many assignments, and all my ongoing hardware investments can't compensate for LR's sluggish performance. All the wasted seconds add up, and clients want ever faster turnaround (social media, for starters).

Call it Lightroom Pro... charge more and I'll happily pay for it, with no new features, just 3X-5X performance.
Sean H [Seattle branch]
Known Participant
September 3, 2014
Yeah, I'd like Adobe to stop catering to the lowest common denominator. I just put together a massive system that blows away the new Mac Pros, and it's still not that impressive in LR. Make the minimum spec a top-end machine with 32 GB+ of RAM and optimize for the professionals who use this software to chew through literally 100k images a year, not the weekend-warrior crowd that wants to get by with Celeron + 1 GB RAM Dell machines using 5,400 rpm drives and 15" monitors. Sure, it would make the software cost more, as they'd lose a great deal of their install base, but I'd pay thousands for software that was actually intelligent enough to utilize my hardware instead of putting it to waste.
areohbee
Legend
September 3, 2014
Once upon a time, Lr kept the last 4 renderings for Develop in what was called the "negative" cache - selecting any of the last 4 images in the Develop module was instant (no loading - ready in ~0.0 seconds). Now it only keeps the last one. My sense is that it was a bother for them (kinda like concurrent rendering for slideshow), and so they nixed it.

+1 for resurrection (with the ability to do "look-ahead" instead of just "remember-behind"), along with some good scheme for controlling RAM consumption (currently Lr just uses virtual memory (disk as RAM) when out of real memory, nearly always bringing the entire system down when that happens - not a good scheme in my opinion).
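
For what it's worth, the combination being asked for here - a small bounded cache of recent renders plus background look-ahead - is a fairly standard pattern. A minimal sketch, assuming a hypothetical _render() stub in place of the real raw pipeline:

```python
# Keep the last few finished renders in an LRU cache ("remember-behind") and
# pre-render the next few filmstrip images on background threads ("look-ahead").
import threading
from collections import OrderedDict
from concurrent.futures import ThreadPoolExecutor

class PreviewCache:
    def __init__(self, capacity=4, lookahead=2):
        self.capacity = capacity            # e.g. the 4 renders Lr used to keep
        self.lookahead = lookahead          # how far ahead to render while idle
        self._cache = OrderedDict()         # path -> rendered preview, in LRU order
        self._lock = threading.Lock()
        self._pool = ThreadPoolExecutor(max_workers=2)

    def _render(self, path):
        # Stand-in for the expensive raw decode + develop-settings render.
        return f"rendered:{path}"

    def _store(self, path, preview):
        with self._lock:
            self._cache[path] = preview
            self._cache.move_to_end(path)
            while len(self._cache) > self.capacity:
                self._cache.popitem(last=False)   # evict least recently used

    def get(self, path):
        with self._lock:
            if path in self._cache:               # hit: "ready in ~0.0 seconds"
                self._cache.move_to_end(path)
                return self._cache[path]
        preview = self._render(path)              # miss: render on demand
        self._store(path, preview)
        return preview

    def prefetch(self, filmstrip, current_index):
        # Look-ahead: queue background renders for the next few images.
        upcoming = filmstrip[current_index + 1 : current_index + 1 + self.lookahead]
        for path in upcoming:
            self._pool.submit(lambda p=path: self._store(p, self._render(p)))

# Usage sketch:
# cache = PreviewCache()
# cache.prefetch(filmstrip, current_index=10)   # warm upcoming images in the background
# preview = cache.get(filmstrip[11])            # ideally already rendered by now
```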
Sean H [Seattle branch]
Known Participant
September 3, 2014
My basic understanding is that performance isn't improving (or is getting worse) because they are optimizing at the same pace as they are adding great features. Really, these features are *almost* on par with PS. The more adjustments you apply serially to an image in the Develop module, the longer it takes for them to be applied EACH time you land on the image.

With 32-64 GB of RAM, couldn't LR pre-render a ton of raw files in the Develop module using idle CPU cores while I do adjustments on a file? Or could I pre-render a set, then come back and work from RAM?

Or can't you guys give me proxy DNGs, like Smart Previews, where I'm loading and working on small files, then applying the adjustments to the full DNG on export?
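
That last idea is roughly what Smart Previews already do for offline originals: edit against a small proxy, record the settings, and apply them to the full raw only when it's actually needed. A minimal sketch of that workflow (illustrative only; render() and the file names are made up, not Adobe's API):

```python
# Interactive edits touch only a small proxy; the recorded develop settings are
# applied to the full-resolution raw exactly once, at export.
from dataclasses import dataclass, field

@dataclass
class DevelopSettings:
    exposure: float = 0.0
    contrast: float = 0.0

@dataclass
class Photo:
    raw_path: str        # full-resolution original: slow to decode and render
    proxy_path: str      # small proxy (e.g. a reduced-size DNG): fast to work with
    settings: DevelopSettings = field(default_factory=DevelopSettings)

def render(path, settings):
    # Stand-in for decoding `path` and applying `settings` to it.
    return f"{path} rendered with {settings}"

def interactive_edit(photo, **changes):
    # During editing, only the cheap proxy is ever re-rendered.
    for name, value in changes.items():
        setattr(photo.settings, name, value)
    return render(photo.proxy_path, photo.settings)

def export(photo):
    # Export applies the accumulated settings to the full-resolution raw once.
    return render(photo.raw_path, photo.settings)

photo = Photo(raw_path="IMG_0001.dng", proxy_path="IMG_0001_proxy.dng")
interactive_edit(photo, exposure=0.5)       # fast: proxy only
interactive_edit(photo, contrast=10.0)      # fast: proxy only
print(export(photo))                        # the slow step happens once, at export
```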
Inspiring
September 2, 2014
Who are you asking this question to, Eric - me? If so, yes, I have my OS on one SSD, and my Lightroom catalog and all the images in that catalog on another SSD. I even created a 12 GB RAM drive this weekend and loaded my entire catalog into that. Slight speed increase in some functions, but the worst slowdowns are still there and seem to be CPU- and software-bound.

I can only describe it as... latency. It's like it takes the program 1-2 seconds to start processing the development settings on an image before it will complete them and show you the image.
MadManChan2000
Adobe Employee
September 2, 2014
Are you using an SSD on your Windows machine?