
Adding GPU to assist Lightroom speed

Community Beginner, May 24, 2024

Hello

I have just upgraded my PC in order to work faster with Lightroom.

Specs:

Intel Core i5 (14th gen)

32 GB RAM

Gigabyte B760M motherboard

I thought Lightroom was going to "fly", but it isn't. It definitely works faster than my old PC, but not as much as I expected.

As I read through the web, I get more confused:

Should I add a graphics card (many sites say it won't help at all, since all Lightroom work is done on the CPU), or do I need to upgrade the CPU to an i7?

I'd much appreciate help with the best way to get Lightroom to work faster.

TOPICS: Windows
Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Community Expert, May 24, 2024

Lightroom Classic has been using the GPU to improve performance in the Develop module for many years now. Almost every edit tool in the Develop module, especially those that use AI, will benefit from the most recent generation of dedicated GPUs, such as the NVIDIA 3000 and 4000 series. Exporting images also makes use of the GPU. Currently, preview building is CPU-only, but it is probably an area where Adobe will seek to make greater use of fast GPUs at some point in the future.

 

Community Expert, May 25, 2024

I forgot to ask you to provide a copy of your 'System Info'. This is much more helpful to us than a typical user's description of their hardware, especially as it includes driver information, etc. You can obtain the info from the LrC Help > System Info menu item. There's a 'Copy' button in the dialog that appears, which can be used to capture the info. You can then just paste it into your next post.
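(As an aside: the System Info dump is plain text, so if you end up reading a lot of them, the GPU-related lines can be pulled out with a few lines of Python. This is only a sketch; the field names below are taken from what LrC 13.x happens to print and may change between versions.)

```python
# Sketch: pull the GPU-related lines out of a pasted LrC "System Info" dump.
# The field names are assumptions based on the LrC 13.x output format.

GPU_FIELDS = ("Graphics Processor Info", "DirectX:", "Init State:", "User Preference:")

def gpu_lines(system_info: str) -> list[str]:
    """Return the lines of a System Info dump that describe the GPU."""
    return [line.strip() for line in system_info.splitlines()
            if line.strip().startswith(GPU_FIELDS)]

sample = """\
Logical processor count: 16
Graphics Processor Info:
DirectX: Intel(R) UHD Graphics 730 (31.0.101.4577)
User Preference: Auto
"""
print(gpu_lines(sample))
```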

Community Beginner, May 25, 2024

Thank you for your answers

Attaching system info as requested

 

Lightroom Classic version: 13.3 [ 202405092057-40441e28 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 11 - Home Premium Edition
Version: 11.0.22631
Application architecture: x64
System architecture: x64
Logical processor count: 16
Processor speed: 2.4GHz
SqLite Version: 3.36.0
CPU Utilisation: 2.0%
Power Source: Plugged In, 255% 
Built-in memory: 32535.9 MB
Dedicated GPU memory used by Lightroom: 85.2MB / 128.0MB (66%)
Real memory available to Lightroom: 32535.9 MB
Real memory used by Lightroom: 1833.9 MB (5.6%)
Virtual memory used by Lightroom: 2036.8 MB
GDI objects count: 700
USER objects count: 2180
Process handles count: 2009
Memory cache size: 2427.6MB
Internal Camera Raw version: 16.3 [ 1863 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 738MB / 16267MB (4%)
Camera Raw real memory: 739MB / 32535MB (2%)
 
Cache1: 
Final1- RAM:163.0MB, VRAM:0.0MB, YR6_3937.JPG
Final2- RAM:349.0MB, VRAM:0.0MB, YR5_1329.JPG
NT- RAM:512.0MB, VRAM:0.0MB, Combined:512.0MB
 
Cache2: 
m:2427.6MB, n:512.6MB
 
U-main: 76.0MB
 
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Standard Preview Size: 1920 pixels
Displays: 1) 1920x1200
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
 
Graphics Processor Info: 
DirectX: Intel(R) UHD Graphics 730 (31.0.101.4577)
Init State: GPU for Display supported by default with image processing and export supported in the custom mode
User Preference: Auto
 
Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic
Library Path: C:\Users\mintz\OneDrive\Pictures\Lightroom\Lightroom Catalog-v13-3.lrcat
Settings Folder: C:\Users\mintz\AppData\Roaming\Adobe\Lightroom
 
Installed Plugins: 
1) AdobeStock
2) Flickr
3) Nikon Tether Plugin
 
Config.lua flags: 
 
Adapter #1: Vendor : 8086
Device : 4682
Subsystem : d0001458
Revision : c
Video Memory : 128
Adapter #2: Vendor : 1414
Device : 8c
Subsystem : 0
Revision : 0
Video Memory : 0
AudioDeviceIOBlockSize: 1024
AudioDeviceName: $$$/dvaaudiodevice/SystemDefaultAndEffectiveDeviceName=System Default - Speakers (Realtek(R) Audio)#{comment}DVAAU-4201250: Open the audio hardware preferences page.
AudioDeviceNumberOfChannels: 2
AudioDeviceSampleRate: 48000
Build: LR5x120
Direct2DEnabled: false
GL_ACCUM_ALPHA_BITS: 16
GL_ACCUM_BLUE_BITS: 16
GL_ACCUM_GREEN_BITS: 16
GL_ACCUM_RED_BITS: 16
GL_ALPHA_BITS: 8
GL_BLUE_BITS: 8
GL_DEPTH_BITS: 24
GL_GREEN_BITS: 8
GL_MAX_3D_TEXTURE_SIZE: 2048
GL_MAX_TEXTURE_SIZE: 16384
GL_MAX_TEXTURE_UNITS: 8
GL_MAX_VIEWPORT_DIMS: 16384,16384
GL_RED_BITS: 8
GL_RENDERER: Intel(R) UHD Graphics 730
GL_SHADING_LANGUAGE_VERSION: 4.60 - Build 31.0.101.4577
GL_STENCIL_BITS: 8
GL_VENDOR: Intel
GL_VERSION: 4.6.0 - Build 31.0.101.4577
GPUDeviceEnabled: false
OGLEnabled: true
GL_EXTENSIONS: GL_3DFX_texture_compression_FXT1 GL_AMD_depth_clamp_separate GL_AMD_vertex_shader_layer GL_AMD_vertex_shader_viewport_index GL_ARB_ES2_compatibility GL_ARB_ES3_1_compatibility GL_ARB_ES3_compatibility GL_ARB_arrays_of_arrays GL_ARB_base_instance GL_ARB_bindless_texture GL_ARB_blend_func_extended GL_ARB_buffer_storage GL_ARB_cl_event GL_ARB_clear_buffer_object GL_ARB_clear_texture GL_ARB_clip_control GL_ARB_color_buffer_float GL_ARB_compatibility GL_ARB_compressed_texture_pixel_storage GL_ARB_compute_shader GL_ARB_conditional_render_inverted GL_ARB_conservative_depth GL_ARB_copy_buffer GL_ARB_copy_image GL_ARB_cull_distance GL_ARB_debug_output GL_ARB_depth_buffer_float GL_ARB_depth_clamp GL_ARB_depth_texture GL_ARB_derivative_control GL_ARB_direct_state_access GL_ARB_draw_buffers GL_ARB_draw_buffers_blend GL_ARB_draw_elements_base_vertex GL_ARB_draw_indirect GL_ARB_draw_instanced GL_ARB_enhanced_layouts GL_ARB_explicit_attrib_location GL_ARB_explicit_uniform_location GL_ARB_fragment_coord_conventions GL_ARB_fragment_layer_viewport GL_ARB_fragment_program GL_ARB_fragment_program_shadow GL_ARB_fragment_shader GL_ARB_fragment_shader_interlock GL_ARB_framebuffer_no_attachments GL_ARB_framebuffer_object GL_ARB_framebuffer_sRGB GL_ARB_geometry_shader4 GL_ARB_get_program_binary GL_ARB_get_texture_sub_image GL_ARB_gl_spirv GL_ARB_gpu_shader5 GL_ARB_gpu_shader_fp64 GL_ARB_half_float_pixel GL_ARB_half_float_vertex GL_ARB_indirect_parameters GL_ARB_instanced_arrays GL_ARB_internalformat_query GL_ARB_internalformat_query2 GL_ARB_invalidate_subdata GL_ARB_map_buffer_alignment GL_ARB_map_buffer_range GL_ARB_multi_bind GL_ARB_multi_draw_indirect GL_ARB_multisample GL_ARB_multitexture GL_ARB_occlusion_query GL_ARB_occlusion_query2 GL_ARB_pipeline_statistics_query GL_ARB_pixel_buffer_object GL_ARB_point_parameters GL_ARB_point_sprite GL_ARB_polygon_offset_clamp GL_ARB_post_depth_coverage GL_ARB_program_interface_query GL_ARB_provoking_vertex GL_ARB_query_buffer_object 
GL_ARB_robust_buffer_access_behavior GL_ARB_robustness GL_ARB_robustness_isolation GL_ARB_sample_shading GL_ARB_sampler_objects GL_ARB_seamless_cube_map GL_ARB_seamless_cubemap_per_texture GL_ARB_separate_shader_objects GL_ARB_shader_atomic_counter_ops GL_ARB_shader_atomic_counters GL_ARB_shader_bit_encoding GL_ARB_shader_draw_parameters GL_ARB_shader_group_vote GL_ARB_shader_image_load_store GL_ARB_shader_image_size GL_ARB_shader_objects GL_ARB_shader_precision GL_ARB_shader_stencil_export GL_ARB_shader_storage_buffer_object GL_ARB_shader_subroutine GL_ARB_shader_texture_image_samples GL_ARB_shading_language_100 GL_ARB_shading_language_420pack GL_ARB_shading_language_packing GL_ARB_shadow GL_ARB_spirv_extensions GL_ARB_stencil_texturing GL_ARB_sync GL_ARB_tessellation_shader GL_ARB_texture_barrier GL_ARB_texture_border_clamp GL_ARB_texture_buffer_object GL_ARB_texture_buffer_object_rgb32 GL_ARB_texture_buffer_range GL_ARB_texture_compression GL_ARB_texture_compression_bptc GL_ARB_texture_compression_rgtc GL_ARB_texture_cube_map GL_ARB_texture_cube_map_array GL_ARB_texture_env_add GL_ARB_texture_env_combine GL_ARB_texture_env_crossbar GL_ARB_texture_env_dot3 GL_ARB_texture_filter_anisotropic GL_ARB_texture_float GL_ARB_texture_gather GL_ARB_texture_mirror_clamp_to_edge GL_ARB_texture_mirrored_repeat GL_ARB_texture_multisample GL_ARB_texture_non_power_of_two GL_ARB_texture_query_levels GL_ARB_texture_query_lod GL_ARB_texture_rectangle GL_ARB_texture_rg GL_ARB_texture_rgb10_a2ui GL_ARB_texture_stencil8 GL_ARB_texture_storage GL_ARB_texture_storage_multisample GL_ARB_texture_swizzle GL_ARB_texture_view GL_ARB_timer_query GL_ARB_transform_feedback2 GL_ARB_transform_feedback3 GL_ARB_transform_feedback_instanced GL_ARB_transform_feedback_overflow_query GL_ARB_transpose_matrix GL_ARB_uniform_buffer_object GL_ARB_vertex_array_bgra GL_ARB_vertex_array_object GL_ARB_vertex_attrib_64bit GL_ARB_vertex_attrib_binding GL_ARB_vertex_buffer_object GL_ARB_vertex_program 
GL_ARB_vertex_shader GL_ARB_vertex_type_10f_11f_11f_rev GL_ARB_vertex_type_2_10_10_10_rev GL_ARB_viewport_array GL_ARB_window_pos GL_ATI_separate_stencil GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_equation_separate GL_EXT_blend_func_separate GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_clip_volume_hint GL_EXT_compiled_vertex_array GL_EXT_depth_bounds_test GL_EXT_direct_state_access GL_EXT_draw_buffers2 GL_EXT_draw_range_elements GL_EXT_fog_coord GL_EXT_framebuffer_blit GL_EXT_framebuffer_multisample GL_EXT_framebuffer_object GL_EXT_geometry_shader4 GL_EXT_gpu_program_parameters GL_EXT_gpu_shader4 GL_EXT_memory_object GL_EXT_memory_object_win32 GL_EXT_multi_draw_arrays GL_EXT_packed_depth_stencil GL_EXT_packed_float GL_EXT_packed_pixels GL_EXT_polygon_offset_clamp GL_EXT_rescale_normal GL_EXT_secondary_color GL_EXT_semaphore GL_EXT_semaphore_win32 GL_EXT_separate_specular_color GL_EXT_shader_framebuffer_fetch GL_EXT_shader_integer_mix GL_EXT_shadow_funcs GL_EXT_stencil_two_side GL_EXT_stencil_wrap GL_EXT_texture3D GL_EXT_texture_array GL_EXT_texture_compression_s3tc GL_EXT_texture_edge_clamp GL_EXT_texture_env_add GL_EXT_texture_env_combine GL_EXT_texture_filter_anisotropic GL_EXT_texture_integer GL_EXT_texture_lod_bias GL_EXT_texture_rectangle GL_EXT_texture_sRGB GL_EXT_texture_sRGB_decode GL_EXT_texture_shared_exponent GL_EXT_texture_snorm GL_EXT_texture_storage GL_EXT_texture_swizzle GL_EXT_timer_query GL_EXT_transform_feedback GL_IBM_texture_mirrored_repeat GL_INTEL_conservative_rasterization GL_INTEL_fragment_shader_ordering GL_INTEL_framebuffer_CMAA GL_INTEL_map_texture GL_INTEL_multi_rate_fragment_shader GL_INTEL_performance_query GL_KHR_blend_equation_advanced GL_KHR_blend_equation_advanced_coherent GL_KHR_context_flush_control GL_KHR_debug GL_KHR_no_error GL_KHR_shader_subgroup GL_KHR_shader_subgroup_arithmetic GL_KHR_shader_subgroup_ballot GL_KHR_shader_subgroup_basic GL_KHR_shader_subgroup_clustered GL_KHR_shader_subgroup_quad 
GL_KHR_shader_subgroup_shuffle GL_KHR_shader_subgroup_shuffle_relative GL_KHR_shader_subgroup_vote GL_KHR_texture_compression_astc_ldr GL_NV_blend_square GL_NV_conditional_render GL_NV_primitive_restart GL_NV_texgen_reflection GL_OVR_multiview GL_SGIS_generate_mipmap GL_SGIS_texture_edge_clamp GL_SGIS_texture_lod GL_SUN_multi_draw_arrays GL_WIN_swap_hint WGL_EXT_swap_control
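(A side note on the Adapter entries near the end of the dump: the Vendor values are hexadecimal PCI vendor IDs. Here 8086 is Intel, and 1414 is Microsoft, whose "Basic Render Driver" shows up as a second adapter with no video memory. A minimal lookup, covering only the vendors most likely to appear; it is not exhaustive.)

```python
# Sketch: decode the "Adapter #n: Vendor" hex IDs from the System Info dump.
# The mapping covers common GPU vendors only.

PCI_VENDORS = {
    0x8086: "Intel",
    0x10DE: "NVIDIA",
    0x1002: "AMD",
    0x1414: "Microsoft (software render adapter)",
}

def vendor_name(hex_id: str) -> str:
    """Map a hex vendor ID string (as printed by LrC) to a vendor name."""
    value = int(hex_id, 16)
    return PCI_VENDORS.get(value, f"unknown (0x{value:04x})")

# The two adapters from the dump above: the integrated Intel GPU and
# Microsoft's Basic Render Driver (0 MB video memory).
print(vendor_name("8086"))  # Intel
print(vendor_name("1414"))  # Microsoft (software render adapter)
```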
Community Expert, May 25, 2024

Your computer is using an integrated Intel UHD 730 GPU, which isn't a good option for Lightroom Classic, as the performance it delivers isn't particularly good.

 

I also note below in your System Info 

 

Library Path: C:\Users\mintz\OneDrive\Pictures\Lightroom\Lightroom Catalog-v13-3.lrcat

 

Placing the Lightroom Classic catalog folder anywhere within the OneDrive path can be problematic in terms of performance and stability. Someone more familiar with Windows should be able to suggest a better location for the Lightroom catalog folder.
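(A quick way to check whether a catalog path is affected is to test whether any component of the path is a OneDrive folder. A minimal sketch; the catalog file names are illustrative, and a plain local folder such as C:\Lightroom is just one example of a safer location.)

```python
# Sketch: flag a catalog path that sits inside a OneDrive folder, so it can
# be moved to a plain local folder instead. Paths are illustrative.
from pathlib import PureWindowsPath

def inside_onedrive(catalog_path: str) -> bool:
    """True if any component of the path is a OneDrive folder."""
    return any(part.lower().startswith("onedrive")
               for part in PureWindowsPath(catalog_path).parts)

print(inside_onedrive(r"C:\Users\mintz\OneDrive\Pictures\Lightroom\catalog.lrcat"))  # True
print(inside_onedrive(r"C:\Lightroom\catalog.lrcat"))                                # False
```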

 

Community Beginner, May 25, 2024

So, according to what you are saying, should I add a GPU, or is it better to upgrade the CPU?

In general, is my CPU good enough for Lightroom?

Regarding the "OneDrive" path, it is not really connected to Microsoft OneDrive in any way; OneDrive is not synced to my PC at all.

For some reason, in Windows 11, all the basic directories appear like that, even the "Desktop".

Community Expert, May 25, 2024
quote

In general, is my CPU good enough for Lightroom?


By @j6ck3lo61949312

 

Short answer - no.

 

You can't "upgrade" an integrated GPU. You should consider a recent NVIDIA RTX 3000- or 4000-series card. The sweet spot in terms of price-to-performance seems to be the 3060 or the newer 4060.

Community Beginner, May 25, 2024

What I meant by upgrading the CPU is, of course, replacing it.

Which of the two options will improve my workflow more:

Adding a GPU (keeping the current i5 CPU) or replacing the CPU with an i7 (with no GPU)?

Community Expert, May 25, 2024

Well, then you're basically talking about a new computer. While it's technically possible to replace the CPU on a motherboard, it's a bit tricky and usually not worthwhile.

 

The i5 should probably be fine. If I were planning a new computer I'd go for an i7, but the difference between them isn't dramatic.

Community Beginner, May 25, 2024

That's exactly the sad part of this whole thing: it is a new computer; I bought it two months ago. That's the reason for my question.

Based on the specs I chose, I expected Lightroom to "fly", which isn't the case right now.

At this point, and because it's a brand-new PC, I'm looking for the most reasonable upgrade to Lightroom's working speed, hoping not to be disappointed again:

Either a CPU replacement or adding a GPU to the current PC.

Community Expert, May 25, 2024

I thought that both @D Fosse and I were pretty clear. The GPU is the component you need to upgrade for improved performance. Changing the CPU won't improve LrC's performance much, if at all, as your GPU is the problem component.

 

Community Expert, May 25, 2024

Right. There was a little GPU/CPU misreading here on my part. Sorry about that.

 

So yes, Ian is right. Don't worry about the i5. It'll do fine.

 

Look into an RTX 3060/4060 or similar. It's easily retrofitted into the PCI-e slot on your motherboard. Plug your monitor into it, install the driver, and you're good to go.

 

Community Beginner, May 25, 2024

OK

Thank you both very much

Community Expert, May 25, 2024

Actually, there is one thing you should check before purchasing an RTX 3000/4000-series GPU.

 

These GPUs require additional 12 V power directly from your computer's power supply: a separate cable from the power supply with an 8-pin connector clearly marked "PCI-e". All standard power supplies on the market now should have this output - but check to be sure. Without it, the card won't run.

 

The connector may be split into two parts, but put together it only fits one way, so you can't get it wrong.

Community Beginner, May 25, 2024

Will check

Thank you

Community Expert, May 25, 2024
quote

What I meant by upgrading the CPU is, of course, replacing it.

Which of the two options will improve my workflow more:

Adding a GPU (keeping the current i5 CPU) or replacing the CPU with an i7 (with no GPU)?

By @j6ck3lo61949312

 

The answer to this requires an understanding of which hardware accelerates which features in Lightroom Classic. Some features only use the CPU; some features are GPU-accelerated. Any GPU-accelerated feature runs much faster on the GPU than on the CPU.

 

For your computer, the weakest link is the GPU: the bottleneck is that integrated graphics can only provide limited GPU acceleration. No CPU upgrade can achieve performance equivalent to GPU acceleration, so the better option is to add a good GPU.

 

You said in your original question that "many sites say it won't help at all since all Lightroom work is on the CPU." I wonder how old that information is. It was good advice in 2015 or so, but around six or seven years ago Adobe started adding GPU acceleration to more features, and they are still expanding it. So any good Lightroom advice today will provide at least a basic list of which features are accelerated by a good GPU.

 

Understanding components beyond the CPU is only getting more important. In the beginning we only worried about having enough CPU cores and storage speed. Now we spec Lightroom computers by balancing CPU, GPU, and storage speed relative to the application’s capabilities, with the GPU becoming increasingly important as Adobe expands the features that are GPU-accelerated.

 

In the next few years, that system balancing will have to include the new NPU (neural processing unit) that is starting to appear on Windows PCs. An NPU accelerates AI features, and Adobe apps are using more AI features every year. But you probably don’t need to worry about that for your current build.

Community Beginner, May 25, 2024

Thank you for your detailed answer.

I'm definitely going to look into a good GPU to add to my system.

Do you think an RTX 2000-series card will be sufficient?

Community Expert, May 26, 2024

I think I would go straight to the 3000 series, even if you're looking to cut the budget. The 2000 series is the first-generation RTX, and there's still not a lot to be saved on the price.

 

The RTX 3060 performs very well with Lightroom and Photoshop. It is still in stock and sells for around $300 at B&H: https://www.bhphotovideo.com/c/buy/rtx-3060/ci/50665 You simply get more value for your money. A 4060 isn't much more.

 

 

Community Beginner, May 26, 2024

OK

Thanks for all the help
