
RAW Editing Performance/Input Delay: Lightroom Classic vs. Camera Raw

New Here, Apr 30, 2021


I normally use Lightroom for my workflow and it works very well in general, but I was messing with Camera Raw today and noticed that its input delay is noticeably lower than in Lightroom Classic.

If I move sliders such as color temperature, exposure, or color grading quickly in Lc (just as examples), the changes are applied only once I stop moving the mouse for about a tenth of a second. That delay isn't there in Camera Raw; the image updates as I drag the slider, essentially in real time.
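
To describe the difference another way: Lightroom Classic behaves as if slider events were debounced, while Camera Raw behaves as if it renders on every event. Here is a rough sketch of what I mean in TypeScript (a hypothetical illustration only, not Adobe's actual code; the 100 ms window is just my guess from the observed pause):

```typescript
// Hypothetical illustration of the two behaviors I'm seeing -- not Adobe's code.

// "Live" handling: re-render on every slider event.
// This is how Camera Raw feels: the image tracks the slider in real time.
function onSliderChangeLive(value: number, render: (v: number) => void): void {
  render(value);
}

// Debounced handling: re-render only after the slider has been idle for ~100 ms.
// This is how Lc feels: the edit applies once I stop moving the mouse.
let pendingRender: ReturnType<typeof setTimeout> | null = null;

function onSliderChangeDebounced(value: number, render: (v: number) => void): void {
  if (pendingRender !== null) {
    clearTimeout(pendingRender); // drop the previously scheduled render
  }
  pendingRender = setTimeout(() => render(value), 100); // render after input goes quiet
}
```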

The interesting part is that I measured system load during these edits. While moving the sliders rapidly, Lc uses around 40% CPU and about 13% GPU. In Camera Raw, the GPU sees almost no usage and the CPU goes up to around 70-80%. Disabling GPU acceleration in Lc doesn't seem to change anything in this scenario.

Does Lc work differently from Camera Raw in a way that could explain this difference in behavior, or am I doing something wrong in Lc?

Editing in Lc on my PC is by no means bad; performance is great. But it would be excellent to have the same input responsiveness as in Camera Raw.

{Moved from Lightroom Cloud to Lightroom Classic Forum by Moderator}
