I am running Lightroom Classic 14.2 on a PC with a Ryzen 9950X processor, Radeon 6800 XT GPU, PCIe Gen 5 NVMe SSD, and 64GB RAM. I have noticed that during 1:1 preview generation, CPU utilization stays mostly below 50% after several minutes of runtime (in the run I'm doing now, this started after about 5 minutes, and the generation is about 10 minutes in at this point). I am generating 1:1 previews for a batch of 4750 Canon R5 and R5 II raw files as a performance test. I previously deleted the Lightroom preview cache and optimized the catalog.
The CPU temperature seems within limits (around 80-85C, sometimes lower), and the core clock shows 100% of maximum frequency in Windows Resource Monitor. There is plenty of free disk space (over 2.2 TB).
Is this drop in core utilization expected? I just moved from an older PC with a Ryzen 5600X that exhibited a similar issue; I was hoping to see more of an improvement with the new PC. System info below.
Thanks!
------
"Init State: GPU for Export supported by default"
------
The OP is building 1:1 previews, not exporting the images. Also, preview building is currently handled by the CPU.
------
@Ian Lyons is correct - I am building 1:1 previews. My understanding is that the GPU is not involved. The GPU was also almost completely idle during preview generation.
------
Right, thanks for the correction. I don't know why I thought you were doing exports.
Make sure the option Preferences > Performance > Generate Previews In Parallel is selected. On my Mac, building 1:1 previews of raws, CPU utilization is about 50% with that option off and close to 100% with it on. In the screenshot below, you can see precisely when I turned it on while building previews:
------
"Generate Previews in Parallel" was already selected, so unfortunately that is not the cause. I should also mention that these 1:1 previews were of "finished" images, each with many edits and noise reduction.
------
"DirectX: AMD Radeon RX 6800 XT (32.0.13031.3015)"
It likely isn't related to this issue, but there is a slightly newer graphics driver available from 3/20/25 (32.0.13031.4034):
------
Thanks to @Ian Lyons and @johnrellis for your replies so far. My current theory after further experiments is that there may be a memory leak.
Based on John's last comment, I reconfirmed that parallel preview generation was enabled. I disabled it and ran a small batch of 50 photos; CPU utilization was around 20%. I then re-enabled parallel preview generation and saw a small batch of 50 photos generate previews at 100% CPU utilization. Next I tried a batch of 1000 photos. Those ran at 100% CPU utilization until about 656 previews in, at which point utilization went down (and process memory dropped by about 11GB). Memory usage then crept back up. It took 14-15 minutes total to generate these previews.
Finally, I tried a second (different) batch of 1000 photos. That batch also started at 100% utilization but began slowing down around 4 minutes in. System memory usage climbed to 44.4GB (out of 64GB total), then performance dropped and memory usage fell. System memory kept bouncing between 42GB and 44GB, and CPU utilization dropped each time that happened. I eventually noticed the Lightroom.exe process reaching 32GB, dropping 2-3GB, then climbing back to 32GB. Each time memory usage dropped, CPU utilization dropped temporarily, and because this happened frequently it hurt overall performance. After the second batch of 1000 photos finished (about 16m30s), Lightroom.exe was using 28,881,236K (~28GB) of memory.
I can rerun my initial experiment with the full 4750 previews, but I suspect this memory behavior was also occurring during that run, made worse by the larger number of photos being processed. These photos have many edits, AI selections, etc., and most have noise reduction applied.
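To capture that sawtooth pattern (climb toward ~32GB, drop 2-3GB, repeat) for a bug report, one option is to log the process working set over time. Below is a small stdlib-only Python sketch; it is Windows-only, shells out to the built-in `tasklist` command, and assumes the process is named Lightroom.exe:

```python
# Periodically sample Lightroom's memory usage so the sawtooth pattern
# can be captured for a bug report. Windows-only sketch using the
# built-in `tasklist` command; the process name and CSV column layout
# are assumptions based on a default install.
import csv
import io
import subprocess
import time

def tasklist_mem_kb(csv_output: str) -> int:
    """Parse one line of `tasklist /FO CSV` output; return memory in KB."""
    row = next(csv.reader(io.StringIO(csv_output.splitlines()[-1])))
    # "Mem Usage" is the last column, e.g. "28,881,236 K"
    return int(row[-1].replace(",", "").replace("K", "").strip())

def sample_lightroom(interval_s: float = 5.0) -> None:
    """Print a timestamped memory sample every interval_s seconds."""
    while True:
        out = subprocess.run(
            ["tasklist", "/FI", "IMAGENAME eq Lightroom.exe",
             "/FO", "CSV", "/NH"],
            capture_output=True, text=True).stdout
        print(time.strftime("%H:%M:%S"), tasklist_mem_kb(out), "KB")
        time.sleep(interval_s)
```

Running this in a console during a preview build would give a time series that shows exactly when each memory drop coincides with a CPU-utilization dip.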
------
"System memory usage pushed up to 44.4GB"
That is unusually high. I ran a test on my Mac with LR 14.2, building 1:1 previews for 4245 raws (proprietary and DNGs) with lots of masking, AI Remove, and Denoise. CPU utilization was consistently near 100%, and memory bounced between 3 GB and 5 GB:
If you can identify a small set of photos that repeatedly causes this high memory usage, it would be worthwhile to file a bug report and make the photos available to Adobe.
------
I believe Windows and macOS manage memory differently. How much memory would preview generation of a large batch of files be expected to use on Windows?
------
I did a quick test on my Windows 11 test machine with 784 random raws. Task Manager reported LR using up to 12 GB of memory, far less than the 40+ GB you're seeing, and CPU utilization was consistently above 90%.
------
What type of raw files, what CPU, and how much system memory were used? I suspect the size of the raw files impacts memory usage significantly, and the total amount of system memory likely affects how much memory LrC chooses to use as a maximum. I ask about the CPU in case there is some difference between Intel and AMD processors. My Canon R5 and R5 II raw files are 45MP, and my AMD Ryzen 9950X system has 64GB of RAM.
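For a rough sense of scale, here is a back-of-envelope estimate (my own assumptions about buffer format and worker count, not documented Lightroom internals) of the working memory needed just to hold decoded 45MP images across all cores:

```python
# Back-of-envelope estimate of decoded-image working memory during
# parallel 1:1 preview builds. The per-pixel cost and worker count
# are illustrative assumptions, not documented Lightroom behavior.

MEGAPIXELS = 45_000_000   # Canon R5 / R5 II sensor resolution
BYTES_PER_PIXEL = 3 * 2   # 16-bit RGB working buffer (assumed)
WORKERS = 32              # Ryzen 9950X logical cores (assumed one buffer each)

per_image_gb = MEGAPIXELS * BYTES_PER_PIXEL / 1024**3
total_gb = per_image_gb * WORKERS

print(f"~{per_image_gb:.2f} GB per decoded image")
print(f"~{total_gb:.1f} GB if all {WORKERS} workers hold one buffer each")
```

Even allowing for generous intermediate buffers on top of this, the estimate lands around 8-10 GB, far below the 40+ GB observed, which is more consistent with a leak or unbounded cache growth than with expected working-set size.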