Howdy!
I have a fairly high-powered machine: AMD 3950X (16 cores, 3.5GHz), 64GB RAM, Nvidia 2080 Super (8GB VRAM), and dripping in Samsung 990 Pro M.2s. (Admittedly it was more of a beast in 2020 when I built it. 😉) It screams when I edit in Premiere and simply tears through ProRes 4K video; both CPU and GPU ramp up to 90-100% utilization and run for hours.
The problem occurs when I use Lightroom Classic...
I have an archive of 250k images on a Synology NAS (my archive for old projects), and active projects live on a 2TB Samsung 990 Pro M.2 (connected to the X570 Master motherboard) while I'm editing them. When I import and render previews (say, 4k images from a shoot), it crawls... but not because the CPU and GPU are maxed out; they sit at 6-10% CPU and 1-4% GPU. Lightroom just doesn't seem to be trying. It's like it doesn't fully utilize my CPU and GPU. The only time I can get LR to use 100% CPU is when I'm scrolling through the grid and images that don't have previews come into view; then it starts working hard, even going up to 100% CPU. But if I select all of those images and force it to render previews, LR takes 5 minutes to queue the task, then around 9 hours to render 700 images at Standard Preview.
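For a rough sense of how slow that is, here's a quick back-of-the-envelope calculation. The 700 images / 9 hours figures are from my render above; the "working hard" per-image time is just an assumed ballpark for comparison, not a measurement:

```python
# Throughput math for the Standard Preview render described above.
# 700 images / 9 hours are the observed figures from the post; the
# "assumed_fast" per-image time is a hypothetical ballpark, not a benchmark.

images = 700
hours = 9

seconds_per_image = hours * 3600 / images
print(f"Observed: ~{seconds_per_image:.0f} seconds per Standard Preview")  # ~46 s/image

assumed_fast_seconds_per_image = 2  # hypothetical figure for a machine running near full CPU
expected_hours = images * assumed_fast_seconds_per_image / 3600
print(f"Assumed 'working hard' estimate: ~{expected_hours:.1f} hours")     # ~0.4 hours
```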
Things I've tried...
One of the big clues that something was wrong came with the purchase of a 2021 MacBook Pro M1 Max. I started editing on the road more, and I noticed that the MacBook would utilize 90-100% CPU when using LR. Imports, renders, exports, you name it... if it needed to work, it worked hard. Granted, I was using small catalogues for each shoot (4-12k images).
Any help would be much appreciated.
System Info:
I don't know the reason why your CPU isn't being used more heavily on Import. However, Lightroom Classic does not use the GPU at import, and so what you are seeing for the GPU is expected.
Good point. I was thinking more of import immediately followed by preview rendering. Not that it would be maxing out the GPU, but just engaging it more.
Library Path: G:\Dropbox\SAFE\Lightroom Files\Catalogue Files Segmented Years\KWP LCat 2020-
Adobe does not support LrC catalogs on a network share, be that a local server, a NAS, or the Cloud. The Cloud includes Dropbox.
Some make this work, and some have better luck with Dropbox than with other Cloud services, but eventually it fails, especially on performance.
If that Library path just indicates that the catalog is being synced to Dropbox, then stop that: each and every edit will cause communication to Dropbox, slowing down performance. Remember that LrC is a parametric editor, and as such each and every edit gets compounded on what preceded it.
P.S. Adobe does apparently support photos on a Server Share, a NAS, or the Cloud. But many find they have issues later.
Great points. That was weighing on me as a culprit for a few months, so I switched the LrCat files to a non-synced drive... the low CPU utilization persisted. Also, I'd think that theory would mean more CPU usage, e.g. the processor running at 100% while LR runs slow. Whereas my issue is LR running slow and the CPU not spooling up.
As for LR not cooperating with a NAS... I keep my LrCat on an internal M.2 drive, whilst storing old projects on the NAS. Active projects are on another internal M.2 drive. Both show similarly low CPU utilization. I've not ruled this out as an issue, though. I just don't think I can go back to less redundant external hard drives. But if that solved the issue... maybe I would. The problem is that I have 15TB+ of photos, so testing that theory is not cheap or quick.
Great input. Thank you for your time. I appreciate it.
-K
Howdy! I added an update to this post. I'd love some feedback, when you have time. I believe it's all about syncing collections with Adobe cloud.
One thing to check: Is Generate Previews in Parallel enabled? It’s in Preferences at the bottom of the Performance tab. I’ve always assumed that if that option was disabled, Lightroom Classic would limit how many cores it uses for preview generation.
For example, your CPU has 16 cores. If Lightroom Classic is processing previews serially instead of in parallel, meaning it's restricting itself to one or maybe two cores, then 100% divided by 16 cores and multiplied by 1 or 2 would be around 6-12%, similar to what you reported.
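To make that arithmetic concrete, here's a tiny sketch of the estimate. The only inputs are the core count from your system and an assumed number of busy cores; Lightroom Classic's actual scheduling isn't documented here:

```python
# Back-of-the-envelope estimate of overall CPU utilization when a task
# keeps only a few cores busy. Purely illustrative.

def expected_cpu_percent(active_cores: int, total_cores: int) -> float:
    """Overall CPU % if `active_cores` run flat out and the rest sit idle."""
    return 100.0 * active_cores / total_cores

total = 16  # AMD 3950X core count from the post

for active in (1, 2, 16):
    print(f"{active:>2} busy core(s) -> ~{expected_cpu_percent(active, total):.1f}% overall CPU")

# Output:
#  1 busy core(s) -> ~6.2% overall CPU
#  2 busy core(s) -> ~12.5% overall CPU
# 16 busy core(s) -> ~100.0% overall CPU
```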
If you find that Generate Previews in Parallel is actually enabled on your system, then I guess you can throw out this theory and look at something else…
If you find that Generate Previews in Parallel is disabled on the Windows PC and enabled on the Mac, that could explain the difference.
Overall, this is what I would expect in Lightroom Classic:
For both video editing and still image photography, the more a feature is GPU-accelerated, the less I expect the CPU to be needed, so for example during exporting in Premiere Pro or Lightroom Classic, I’m not surprised at all if GPU use hits the ceiling while CPU use drops.
This also applies to hardware accelerators. For video rendering, if the codec being used is supported by a codec-specific hardware-based accelerator in the CPU/GPU, the CPU cores can end up with not much to do because they won’t be able to process the data as fast as the hardware-based accelerator. This is also very true for the computationally expensive AI Denoise on the Mac, which can now also be accelerated by the Apple Neural Engine. With AI Denoise being processed by the GPU primarily and the ANE helping out, there is really not much left that the CPU can constructively help with, so the CPU is typically largely idle during AI Denoise.
The Apple Neural Engine is an AI coprocessor/accelerator in hardware. This used to be a weird edge case, but not any more: With the rise of AI features, the new Microsoft Copilot+ PC hardware standard specifies that a Copilot+ PC must have a Neural Processing Unit or NPU and it needs to meet a specific performance threshold.
So where we used to think CPU/GPU, and now think CPU/GPU/other accelerators, soon we will think of hardware usage percentages in terms of the mix of CPU, GPU, NPU, and other accelerators. And the more efficiently a specific workload is processed by those other processing units, the less likely it is that all CPU cores will be used for that particular workload.
Great suggestion. I checked that too; it has been running in parallel. I looked at that setting a while back and even wondered if I should turn it off, but your explanation makes sense as to why leaving it on is better.
UPDATE:
I decided to throw more time at a wild idea and started a new catalogue on my MacBook Pro 2021 M1 Max, 64GB. I thought it would take days to rebuild a new catalogue with the 250k images (like it does on my PC)... but my MacBook looks to be making quick work of the task. I'd estimate it will take no more than 9 hours. I'll report back tomorrow on whether it performs as quickly once it's finished with the LrCat build.
UPDATE:
So I rebuilt the catalogue on my MacBook Pro and everything was running great. CPU and GPU utilization was excellent, nearly maxed during any task. I was nearly sold on the Mac being better at handling the LrCat. THEN things turned. I love having photos synced to the cloud, and I love being able to access photos across my devices. BUT... this seems to be the culprit. As soon as I switched the cloud sync to the new MacBook catalogue, CPU and GPU utilization dropped on the MacBook. Suddenly all the tasks took forever: renders, reading metadata, converting to DNG, you name it... it all slowed to a crawl.
To make certain of this, I jumped back onto my PC and the old catalogue (now disconnected from cloud sync). AND it now utilizes 100% of the CPU! Clearly something with the cloud sync is kneecapping LR's ability to utilize the CPU.
Why oh why can't I have LR sync to cloud?
PS. I have erased my entire cloud storage a couple of times during this process, as I've heard gremlins can occur when switching synced catalogues. Erasing my cloud storage did not affect the issue I'm having.