How long should AI Denoise take on one photo?

New Here,
Sep 07, 2025

Hi,

I am looking to upgrade my Windows PC and was hoping someone might be able to tell me how long it should take to Denoise one 45MB photo in Lightroom on a PC that meets the recommended specs for Lightroom Classic. My current PC takes about 14 minutes per photo.

TOPICS: Windows
Engaged,
Sep 07, 2025

14 minutes!!! Wow, that is unusable IMO. More typical performance would be 14 seconds or less. On my Intel/RTX 4090 laptop I see about 10 seconds, and my AMD/RTX 3090 desktop takes about 6-8 seconds.

New Here,
Sep 08, 2025

Thank you very much for your reply. You have given me a benchmark of what I can achieve and look forward to when I buy my new beefed-up PC.

Community Expert,
Sep 07, 2025

How long it takes depends on the power of the GPU and the number of megapixels (not megabytes) per image. Megabytes don't help for comparison because 45MB can represent a variety of image sizes, depending on variables such as the compression type used, if any, and the bit depth, as the sketch below illustrates.
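Here's a rough back-of-the-envelope Python sketch of that point. It assumes a simple packed single-sample-per-photosite (Bayer) raw with an optional compression ratio; real raw formats differ, so treat the numbers as illustrative only, and the function name is my own:

# Approximate raw file size from pixel count and bit depth.
# Illustrative assumption: one packed sample per photosite, optionally compressed.
def raw_size_mb(megapixels, bit_depth=14, compression_ratio=1.0):
    bytes_uncompressed = megapixels * 1e6 * bit_depth / 8
    return bytes_uncompressed / compression_ratio / 1e6

print(round(raw_size_mb(26)))             # ~46MB: a 26MP uncompressed 14-bit raw
print(round(raw_size_mb(45, 14, 1.75)))   # ~45MB: a 45MP raw at ~1.75:1 lossless compression

So a "45MB" file tells you almost nothing about the pixel count Denoise actually has to process.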

 

There have been some tests around the web where multiple people have downloaded the same 60MP sample raw file and run Denoise on the systems they own. In general, what was found is:

Older/weaker GPUs can take 5 to 45 minutes per image.

Recent low/midrange GPUs can take 1 to 3 minutes per image.

The fastest current GPUs in PCs and Macs can take 10-15 seconds or less per image.

 

The speed for your images depends on how many megapixels your raw files have compared to the 60MP test file. For example, if your files are 20MP (1/3 of 60), then you can assume Denoise times for your 20MP images would be about 1/3 of the time estimates above, as the sketch below shows.
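A minimal Python sketch of that proportional rule of thumb (this is just the scaling described above, not an Adobe formula; the function and parameter names are my own):

# Scale a benchmark Denoise time linearly by megapixel count.
def estimate_denoise_seconds(your_megapixels, benchmark_seconds, benchmark_megapixels=60):
    return benchmark_seconds * (your_megapixels / benchmark_megapixels)

# If a 60MP test file takes 90 seconds on a given GPU,
# a 20MP file should take roughly a third of that:
print(estimate_denoise_seconds(20, 90))   # 30.0 seconds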

 

For Windows PCs, those who need maximum Denoise performance have been using the upper-end Nvidia discrete GPUs, such as the 30x0/40x0/50x0 series.

 

For Denoise, CPU performance and RAM amount don’t make much difference. Only the GPU does.

New Here,
Sep 08, 2025

Really appreciate your in-depth reply. The GPU specs will guide me when I buy my new Windows 11 desktop to replace my Windows 10 desktop, which I am told cannot be upgraded to Windows 11.

Community Expert,
Sep 08, 2025

Please post a copy of the LrC System Info file. You can obtain this file via the LrC Help > System Info menu item. There's a Copy button at the top right corner of the System Info dialog. Use this button to capture the info, then paste it into a forum post.

New Here,
Sep 08, 2025

Hi Ian,

I have left out the second half, which contained the GL_ARB settings, as these don't appear to be relevant:

 

Lightroom Classic version: 14.5.1 [ 202508231203-c2638d01 ]

License: Creative Cloud

Language setting: en

Operating system: Windows 10 - Home Premium Edition

Version: 10.0.19045

Application architecture: x64

System architecture: x64

Logical processor count: 6

Processor speed: 2.8GHz

SqLite Version: 3.36.0

CPU Utilisation: 5.0%

Power Source: Plugged In

Built-in memory: 16236.6 MB

Dedicated GPU memory used by Lightroom: 769.4MB / 128.0MB (601%)

Real memory available to Lightroom: 16236.6 MB

Real memory used by Lightroom: 1492.7 MB (9.1%)

Virtual memory used by Lightroom: 7292.8 MB

GDI objects count: 1058

USER objects count: 4021

Process handles count: 2005

Memory cache size: 161.6MB

Internal Camera Raw version: 17.5 [ 2318 ]

Maximum thread count used by Camera Raw: 4

Camera Raw SIMD optimization: SSE2,AVX,AVX2

Camera Raw virtual memory: 2258MB / 8118MB (27%)

Camera Raw real memory: 2675MB / 16236MB (16%)

 

Cache1:

Final1- RAM:352.0MB, VRAM:0.0MB, _D9A4756.CR3

Final2- RAM:468.0MB, VRAM:0.0MB, _D9A4746.CR3

NT- RAM:820.0MB, VRAM:0.0MB, Combined:820.0MB

 

Cache2:

m:161.6MB, n:704.7MB

 

U-main: 95.0MB

 

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Standard Preview Size: 1440 pixels

Displays: 1) 1920x1080

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No

 

Graphics Processor Info:

DirectX: Intel(R) UHD Graphics 630 (31.0.101.2112)

Init State: GPU for Display supported by default

User Preference: Auto

Enable HDR in Library: OFF

GPU for Preview Generation: Auto (S1_0)

 

Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic

Library Path: D:\Caroline\Pictures\Lightroom\Lightroom Catalog.lrcat

Settings Folder: C:\Users\Caroline\AppData\Roaming\Adobe\Lightroom

 

Installed Plugins:

1) AdobeStock

2) Flickr

3) Topaz Photo AI

 

Config.lua flags:

 

Adapter #1: Vendor : 8086

                Device : 3e92

                Subsystem : 86941043

                Revision : 0

                Video Memory : 128

Adapter #2: Vendor : 1414

                Device : 8c

                Subsystem : 0

                Revision : 0

                Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: System Default - Speakers (Realtek High Definition Audio)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 48000

Build: LR5x26

Direct2DEnabled: false

GL_ACCUM_ALPHA_BITS: 16

GL_ACCUM_BLUE_BITS: 16

GL_ACCUM_GREEN_BITS: 16

GL_ACCUM_RED_BITS: 16

GL_ALPHA_BITS: 8

GL_BLUE_BITS: 8

GL_DEPTH_BITS: 24

GL_GREEN_BITS: 8

GL_MAX_3D_TEXTURE_SIZE: 2048

GL_MAX_TEXTURE_SIZE: 16384

GL_MAX_TEXTURE_UNITS: 8

GL_MAX_VIEWPORT_DIMS: 16384,16384

GL_RED_BITS: 8

GL_RENDERER: Intel(R) UHD Graphics 630

GL_SHADING_LANGUAGE_VERSION: 4.60 - Build 31.0.101.2112

GL_STENCIL_BITS: 8

GL_VENDOR: Intel

GL_VERSION: 4.6.0 - Build 31.0.101.2112

GPUDeviceEnabled: false

OGLEnabled: true

Community Expert,
Sep 09, 2025

Graphics Processor Info:

DirectX: Intel(R) UHD Graphics 630 (31.0.101.2112)

Init State: GPU for Display supported by default

User Preference: Auto

 

 

Thanks for the information. I've highlighted two lines from your System Info.

 

The first line suggests that you should check whether a more recent driver for your GPU is available. The one you're using is over a year old.

 

The second highlighted line is really the reason for your issue. When the GPU is set to Auto, the tests carried out by LrC when it launches have determined that your GPU is only capable of accelerating the Display. It doesn't have the power or minimum VRAM to support accelerated 'Image Processing', 'Exporting', or any of the AI-based features such as 'Denoise'.

 

New Here,
Sep 17, 2025

I have the fastest-specced M2 Max MacBook Pro 14in (12-core CPU & 38-core GPU) and it averages 15 seconds for 45MP raw photos with Auto settings, HDR, and optics corrections when I denoise. I imagine the M4 Max is twice that speed.

Community Expert,
Sep 18, 2025

If the M2 and M4 being compared are both Max-level processors, they have roughly equal numbers of GPU cores, so the more significant difference is the per-core GPU performance between the generations. From some quick web searches, it looks like the M4 GPU might be around 15% to 40% faster than the M2 GPU, but not twice as fast.

Community Expert,
Sep 18, 2025

As @Conrad_C has mentioned, Denoise on an M2 Max is not going to take double the time of an M4 Max. Actually, the difference is marginal. By way of example, my 14-inch M4 Max indicates that it should take around 9 seconds to denoise a 45MP Canon EOS R5 file; it actually takes 14 seconds. My M3 Ultra, with double the CPU and GPU cores of the M4 Max, takes 11 seconds.

 

It's worth noting that Denoise was originally developed to utilise the Apple Neural Engine. It did so for a relatively short period, and performance was noticeably better. Unfortunately, quality issues meant that the Neural Engine is no longer used.

Community Expert,
Sep 18, 2025

Just FYI for anyone comparing Mac and Windows, although the Apple Neural Engine NPU is no longer used by Adobe Denoise, at least they tried it for a while before backing out of it. On Windows, I don’t think Adobe Denoise has ever used the NPU on the Copilot+ PCs.

 

On both Macs and Windows PCs, Denoise performance is driven by the GPU only. No other components contribute significantly to Denoise performance. Buying the maximum spec CPU available won’t improve Denoise performance, and neither will maxing out RAM. But buying a better GPU will.
