Hi everyone,
I wanted to share some additional information regarding GPU support in Lr CC.
Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
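Those pixel counts are easy to verify with a quick back-of-the-envelope calculation (the resolutions below are assumed typical values for each display class, not figures from the post):

```python
# Approximate pixel counts for common display resolutions, in megapixels.
displays = {
    'Full HD (1920x1080)': 1920 * 1080,
    'MacBook Pro Retina 15" (2880x1800)': 2880 * 1800,
    '4K UHD (3840x2160)': 3840 * 2160,
    '5K (5120x2880)': 5120 * 2880,
}
for name, pixels in displays.items():
    # Divide by one million to express the count in megapixels.
    print(f"{name}: {pixels / 1e6:.1f} MP")
```

The output (roughly 2.1, 5.2, 8.3, and 14.7 MP) matches the 2/5/8/15 MP figures above.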
For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
So why doesn't everything feel faster?
Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.
First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).
Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
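As a rough illustration of that transfer overhead, here is a minimal sketch. The bytes-per-pixel and bus-bandwidth figures are assumptions chosen for the example, not Lightroom's actual internals:

```python
def upload_time_ms(megapixels, bytes_per_pixel=8, bandwidth_gb_s=4.0):
    """Estimated time to copy one uncompressed image from CPU to GPU memory."""
    size_bytes = megapixels * 1e6 * bytes_per_pixel
    return size_bytes / (bandwidth_gb_s * 1e9) * 1000  # milliseconds

# A 36 MP image held as 16-bit RGBA (8 bytes/pixel) is ~288 MB;
# even at an effective 4 GB/s, the copy alone takes ~72 ms,
# before the GPU has done any processing at all.
print(round(upload_time_ms(36)))  # -> 72
```

That fixed cost is paid every time a new full-resolution image reaches the GPU, which is why image-to-image switching can feel slower with acceleration on.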
Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work in the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
So let's clear up what's currently GPU accelerated in Lr CC and what's not:
First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether the GPU is enabled or disabled in the preferences).
Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.
While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.
Summary:
1. GPU support is currently available in Develop only.
2. Most (but not all) Develop controls benefit from GPU acceleration.
3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.
4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.
5. Prefer newer GPUs (faster models from within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better.
6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.
Eric Chan
Camera Raw Engineer
Actually nothing feels faster in LR CC. Check my post, which includes videos about it.
I have 7 full CC seats, and I'm really not happy at all with this release. Even on my HP Z820 with 128 GB of RAM and a Quadro 6000, LR CC crawls compared to LR 5.7.1.
It is worth changing the preview size to the new Auto settings, if required.
From the LR menu go to:
Edit >> Catalog Settings >> File Handling (tab) on Windows
Lightroom >> Catalog Settings >> File Handling (tab) on Mac
For Standard Preview Size choose Auto from the drop-down list.
This is the default setting if you are using LR6 for the first time, and it's particularly useful for Retina and HiDPI displays, but it will optimize LR for any monitor. For those using LR5 or previous versions, it is recommended that the preview size be changed to the Auto setting.
I'd second that; the new version seems to be a step backwards.
Dear Eric,
Thank you for your thorough explanation of some very obvious issues here. But simple common sense still makes me wonder:
After opening LR and starting my everyday tasks on my modest 1920x1080 screen with my very modest GTX 680M GPU (I have two of them in SLI, but I guess the second one is left to rest in peace; SLI support already existed back in the LR4 days): why are the very same things SLOWER in LR6 compared with version 5?
Is this the price we have to pay to get better-processed pictures out of this version? I still can't see a visible difference, but maybe my eyes just aren't trained enough to see one.
I haven't seen anybody here complaining about the lack of stellar speeds; it's just about having (at least) the very same speed as the previous version.
Hopefully LR6 will get on the right track after some time.
There's one thing I don't understand: you write that the local brush doesn't use the GPU. But how come, when I switch on GPU support in the settings, the brush is so much slower than without GPU usage? OK, my video card is not a rocket, but I have trouble understanding it...
Oli
I see a reasonable explanation in Eric's post:
Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
Bottom line - if OpenGL doesn't speed things up, just turn it off. If that's still slower than 5.7, there's a legitimate reason for concern. Otherwise you're no worse off.
The way I read this, the main immediate priority was to make 4K and 5K usable. I trust further optimization for the rest of us will come in future releases.
I tested it also with GPU support off and it is still slower than LR 5.7.1. I posted comparison videos in my original post about poor performance in LR 6.
Sent from the Moon
I posted the following information on the Luminous Landscape forum (GPU used in Develop but not Library?) in response to comments you made there.
I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.
OS X (10.9.5)
Hardware configuration:
MacPro (late 2013)
AMD FirePro D300 2048 MB
Apple Cinema Display 1920 x 1200
16 GB RAM
1 TB SSD
Test file: Nikon D800 NEF, 50 MB
(0) open the Develop module
(1) select a different NEF file and zoom to 1:1
(2) clear the ACR cache
(3) select the test file
(4) take 3 screenshots to illustrate the 3 display states (the first one is hard to capture)
(5) select another image
(6) same 3 states are present
(7) return to the test file and the same 3 display states are present
Why isn’t the ACR cache coming into play in step 7?
If I repeat this process with the GPU disabled the image is displayed without the intermediate states.
I have attached the 3 screenshots mentioned in step (4).
Bob_Peters wrote:
I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.
With GPU on, I see a slight color shift instead of the blurry image, then the color becomes properly rendered on-screen. With GPU off, I don't see this at all.
Thanks for the information on GPU acceleration. You state that newer cards with 2GB of memory are recommended. My card is 2 years old and has 2GB but I see no improvement in speed. It would be helpful if there were more specific recommendations for the CUDA cores, bus speeds, and memory that would result in improved performance. I would upgrade if I knew what would make a difference.
I would wait until there is more optimization. I have a Quadro 6000, one of the most powerful cards, and performance on anything other than zooming in and out is faster with the GPU turned off. I think there is a lot to be done yet.
Thanks for the suggestion. Unless I see something more specific from Adobe, I will follow your advice. No sense spending a few hundred dollars until there is something to be gained, particularly since I don't have a 4K or 5K display.
You state that newer cards with 2GB of memory are recommended. My card is 2 years old and has 2GB but I see no improvement in speed.
MadManChan2000 also said "The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win."
> Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet. (have full GPU acceleration)
For me (Windows 8 x64, i7-920, 8 GB RAM, Nvidia GTX 560 Ti 448, Dell U2713HM, 2560 x 1440), I am almost certain that the brush adjustment is impacted by the GPU: if acceleration is activated, the brush adjustment lags in my case; if GPU acceleration is not enabled, the brush adjustment does not lag and is as quick as in LR 5.7.1, or almost. So the local brush adjustment uses GPU acceleration, but it is far from optimized (for me).
So, the local brush adjustment uses the GPU acceleration
Doesn't the evidence you just presented lead to the opposite conclusion?
Hi Dj_paige.
MadManChan2000 said in the first post :
"Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet."
So my conclusion is: contrary to what is written by MadManChan2000, GPU acceleration affects the brush adjustment... but not in a good way.
Salvo
The only speed gain I see with the GPU on is zooming in, zooming out, and panning; other than that, everything lags behind 5.7.1, even with the GPU off.
Sent from the Moon
I'd just like to add that I've also been really disappointed with LR CC speed in the Loupe, which made image culling almost impossible, UNTIL I set the preview size to something other than Auto. Even setting it to the smallest option and medium quality, I see huge speed improvements across the board in Loupe, with little noticeable render time.
I feel like it's slower to use GPU acceleration! Anyone else in the same boat?
Late 2013 Mac Pro
3.5 GHz 6-core Xeon E5
32 GB 1867 MHz DDR3 ECC
Dual AMD FirePro D700 6GB
Using GPU acceleration it takes about 4-5 seconds to load a full res raw from Canon 5d3,
turning GPU acceleration off, I'm averaging 3-4 seconds per image.
(same speed either at 1:1 or Fit )
I thought GPU acceleration was supposed to be faster. Is that not true in all cases?
Thanks!
will yang wrote:
Using GPU acceleration it takes about 4-5 seconds to load a full res raw from Canon 5d3,
Some things are slower, yes. Eric said above "it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another."
Thanks.
Would you recommend keeping GPU off for develop mode to do color adjustment on raw images?
And switch the GPU on when exporting?
will yang wrote:
Would you recommend keeping GPU off for develop mode to do color adjustment on raw images?
Depends on which is the highest priority for you. If loading speed is a high priority (perhaps because you're doing high volume), then turn it off. If interactive speed is important and you find it better with the GPU on, or you're running a very high-res screen, turn it on. Export doesn't use the GPU, so it doesn't matter either way.
Well, leaving it ON for interactivity would make sense only if the interactivity were really faster, but it is not, not on any of the 7 computers I have tried it on, all with powerful Quadro cards. All the operations involving graduated filters and local brush adjustments are much slower, like 10 times slower.
Loading the images onto the GPU has to be badly optimized. The transfer speed from the CPU to the GPU on a PCIe bus like the ones used by the Quadro cards, which use 16 PCIe lanes, is huge; even on other cards where maybe only 4 lanes are used, we are talking about 1 GB per second. That is enough to move several 50 MP photos in a second.
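As a sanity check on that bandwidth arithmetic, here is a small sketch. The pixel format is an assumption: a 50 MP image is about 150 MB at 3 bytes/pixel (8-bit RGB) but about 400 MB at 8 bytes/pixel (16-bit RGBA), so the claimed margin depends heavily on it:

```python
def images_per_second(bandwidth_gb_s, megapixels, bytes_per_pixel):
    """Uncompressed images of a given size that fit through the bus per second."""
    image_bytes = megapixels * 1e6 * bytes_per_pixel
    return bandwidth_gb_s * 1e9 / image_bytes

# 50 MP at 8-bit RGB (3 bytes/pixel) over a 1 GB/s link:
print(round(images_per_second(1.0, 50, 3), 1))  # -> 6.7
# 50 MP at 16-bit RGBA (8 bytes/pixel): the margin shrinks considerably.
print(round(images_per_second(1.0, 50, 8), 1))  # -> 2.5
```

Either way, raw transfer bandwidth supports a few full images per second, so the bus alone does not explain multi-second load times; decoding and processing dominate.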
I'm using other software from Autodesk that is fully GPU accelerated, and loading images into the GPU's memory buffer is almost instantaneous. Let me add that all my computers are connected to storage that provides a sustained 1000 MB/s, my laptop runs off a stripe of 3 Samsung SSDs that gives me about 1450 MB/s, and the GPU has 4 GB of RAM.
So it is pretty obvious that this version of LR should have been released either without GPU acceleration or with it presented as an experimental feature. Add to this that turning the feature OFF does not make LR CC as fast as LR 5.7.1. LR CC lags behind on pretty much everything I tried, especially the local adjustment brushes.
Sent from the Moon
Victor, that's not been my experience. I also have an NVIDIA Quadro card for testing (in my case, a K5000), and Develop is several times faster in the areas I mentioned above using this card with GPU acceleration on, compared to without. Note that there are many generations of Quadro cards, and just because the cards you've tried work well in other applications does not mean that they will necessarily work well with Lightroom.