Busy Loading/updating a preview (I think)
Please post your System Information as Lightroom Classic (LrC) reports it. In LrC, click Help, then System Info, then Copy, and paste that information into a reply. Please include everything from the first line down to and including the Plug-in Info section. The info after Plug-in Info can be cut out; it is just dead space to us non-techs, and it takes up so much scroll space that the reply becomes less readable and others are less likely to bother with your post.
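If you would rather trim that dump with a quick script than by hand, here is a minimal sketch. It assumes the copied text was saved to a file called system_info.txt and that the section header literally reads "Plug-in Info:"; both are assumptions, so adjust them to whatever your own copy shows.

```python
# Minimal sketch: trim a pasted LrC System Info dump so it ends with the
# Plug-in Info block. Assumes the dump was saved to "system_info.txt" and
# that the section header reads "Plug-in Info:" (check your own copy).
from pathlib import Path

lines = Path("system_info.txt").read_text(encoding="utf-8").splitlines()

trimmed = []
in_plugin_block = False
for line in lines:
    if line.strip().lower().startswith("plug-in info"):
        in_plugin_block = True
    elif in_plugin_block and not line.strip():
        break  # first blank line after the Plug-in Info block: stop here
    trimmed.append(line)

print("\n".join(trimmed))  # paste this shorter version into your reply
```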
I am using a Sony a7R5, which has 60 MB files. Maybe it is taking LR longer than usual to read the files?
Also, the new Lens Blur is not working. I get "Something went wrong." Thanks
Lightroom Classic version: 13.1 [ 202312111226-41a494e8 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 11 - Home Premium Edition
Version: 11.0.22631
Application architecture: x64
System architecture: x64
Logical processor count: 8
Processor speed: 2.3GHz
SqLite Version: 3.36.0
CPU Utilisation: 2.0%
Built-in memory: 16133.8 MB
Dedicated GPU memory used by Lightroom: 2302.9MB / 1982.4MB (116%)
Real memory available to Lightroom: 16133.8 MB
Real memory used by Lightroom: 6557.5 MB (40.6%)
Virtual memory used by Lightroom: 7633.1 MB
GDI objects count: 987
USER objects count: 3055
Process handles count: 5883
Memory cache size: 807.3MB
Internal Camera Raw version: 16.1 [ 1728 ]
Maximum thread count used by Camera Raw: 5
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 2097MB / 8066MB (26%)
Camera Raw real memory: 1053MB / 16133MB (6%)
System DPI setting: 120 DPI
Desktop composition enabled: Yes
Standard Preview Size: 1920 pixels
Displays: 1) 1920x1080
Input types: Multitouch: Yes, Integrated touch: Yes, Integrated pen: No, External touch: No, External pen: No, Keyboard: Yes
Graphics Processor Info:
DirectX: NVIDIA GeForce MX250 (31.0.15.4633)
Init State: GPU disconnected
User Preference: Auto
Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic
Library Path: D:\MASTER M-Drive Photos\Lightroom\Master Catalog-2-v13.lrcat
Settings Folder: C:\Users\jm345\AppData\Roaming\Adobe\Lightroom
Installed Plugins:
1) AdobeStock
Config.lua flags: None
Graphics Processor Info:
DirectX: NVIDIA GeForce MX250 (31.0.15.4633)
Init State: GPU disconnected
User Preference: Auto
Also the new Lens Blur is not working. I get "Something went wrong." Thanks
Usually that error message indicates an issue with the GPU, specifically with the driver, and the usual advice would be to update the GPU driver to the latest version. But according to NVIDIA, the driver you have is already the latest. It is not a Studio driver but the more common Game Ready driver; the Studio driver is normally recommended for creative apps like LrC, but NVIDIA does not appear to offer one for your GPU.
It looks like in /preferences/performance/ you have the Use Graphics Processor option set to Auto, but it also looks like the initial state is off. I suspect LrC is not finding your GPU to be of much help. You might want to post a screenshot of that preference tab in a reply.
You might try turning that feature Off and seeing what happens. Mind you, much of the AI functionality might not work.
Init State: GPU disconnected
Ah, I suspect that when you reply back with what you see in /preferences/performance, an error will be displayed, something like the one at the top of:
https://helpx.adobe.com/lightroom-classic/kb/troubleshoot-gpu.html
Dedicated GPU memory used by Lightroom: 2302.9MB / 1982.4MB (116%)
Only 2 GB of VRAM is the bare minimum in the LrC System Requirements: barely sufficient for image processing, possibly sufficient for GPU-accelerated export (though apparently not in your case), and insufficient for full acceleration. Some systems can manage AI masks with just 2 GB, some cannot.
https://helpx.adobe.com/lightroom-classic/system-requirements.html
https://helpx.adobe.com/lightroom-classic/kb/lightroom-gpu-faq.html
Dedicated GPU memory used by Lightroom: 2302.9MB / 1982.4MB (116%)
Now that is disconcerting, and I am wondering about that one. Any other members have thoughts? Could this indicate a lack of cache?
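For what it is worth, the 116% is simply the two figures in that line divided, and a quick arithmetic check (using the exact numbers quoted above) suggests LrC is asking for roughly 320 MB more than the card's dedicated VRAM, which would mean it is spilling over into shared system memory:

```python
# Sanity check of the "116%" line from the System Info above.
used_mb = 2302.9       # "Dedicated GPU memory used by Lightroom"
dedicated_mb = 1982.4  # dedicated VRAM reported for the MX250

print(f"{used_mb / dedicated_mb:.0%}")                    # ~116%, matching the report
print(f"{used_mb - dedicated_mb:.1f} MB over the limit")  # ~320 MB beyond dedicated VRAM
```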
For the author: was this system info gathered while in the Develop module, and while the previews were loading?
Library Path: D:\MASTER M-Drive Photos\Lightroom\Master Catalog-2-v13.lrcat
How much free space, in percent, is on the D drive? Typically LrC needs at least 20% free; some say at least 25%.
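If you want an exact figure rather than eyeballing it in Explorer, here is a minimal sketch; it assumes the drive letter is D:, matching the catalog path above.

```python
# Minimal sketch: report free space on the catalog drive as a percentage.
import shutil

usage = shutil.disk_usage("D:\\")
percent_free = usage.free / usage.total * 100
print(f"D: {percent_free:.1f}% free "
      f"({usage.free / 1024**3:.0f} GB of {usage.total / 1024**3:.0f} GB)")
```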
I am using a Sony a7r5 which has 60mb files. Maybe it is taking LR longer than usual to read the files
Maybe, but only initially. I do not think the size of the RAW files will affect the size of the Library previews (perhaps the Develop previews, but that is a separate question below). I think the Library preview size depends more on your monitor size, if you base the standard previews on that.
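As a rough illustration of why the raw file size should not matter much for the Library previews: the standard preview is capped at the Standard Preview Size (1920 px in your System Info), so even a 60 MP file gets scaled down to the same small image. The a7R5 pixel dimensions below are an assumption based on published specs.

```python
# Rough illustration: a standard preview is capped at the long-edge setting,
# regardless of how large the raw file is.
raw_w, raw_h = 9504, 6336      # assumed Sony a7R5 pixel dimensions
preview_long_edge = 1920       # Standard Preview Size from the System Info

scale = preview_long_edge / max(raw_w, raw_h)
preview_w, preview_h = round(raw_w * scale), round(raw_h * scale)
print(f"standard preview is about {preview_w} x {preview_h} px "
      f"({preview_w * preview_h / (raw_w * raw_h):.1%} of the raw pixel count)")
```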
Follow-up inquiry: your setting in /preferences/performance/ for the Camera RAW cache, what have you set that to? And on which hard drive is it?
So, yes, that GPU is not up to snuff.
Your Camera RAW cache is at the default. The default value of 5 GB is pathetic (blame Adobe). Try increasing it to at least 20 GB; the Camera RAW cache affects Develop module edits.
Increasing it might change how LrC treats that GPU (upon restart after increasing), but I would not count on that.
It is also on your C drive, the same drive as the Windows paging file (this is a Windows OS issue, not macOS), so the two cache files compete for reads and writes. If at all possible, consider placing it on a different hard drive, perhaps the D drive. Note that the LrC catalog and the Camera RAW cache can take advantage of being on your fastest hard drive, but if the fastest drive is getting full, then move them; and for the Camera RAW cache, even if the C drive is faster than the D drive, consider moving it off C.
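If you want to see how much space the cache actually occupies before and after the change, here is a minimal sketch; the path below is only the usual Windows default, so confirm the real location shown in /preferences/performance/ before trusting it.

```python
# Minimal sketch: measure the on-disk size of the Camera Raw cache.
# The path is the usual Windows default; confirm the actual location in
# LrC under /preferences/performance/ and adjust if it differs.
import os
from pathlib import Path

cache_dir = Path(os.environ["LOCALAPPDATA"]) / "Adobe" / "CameraRaw" / "Cache2"
total_bytes = sum(f.stat().st_size for f in cache_dir.rglob("*") if f.is_file())
print(f"{cache_dir}: {total_bytes / 1024**3:.2f} GB in use")
```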
One follow-up about that NVIDIA GPU: I suspect you have an NVIDIA app called GeForce Experience. If you do, and if it is up to date, you may find when running it that application-specific options are presented, which set the GPU up in specific ways to improve performance for those apps. Mostly games, but LrC should be included. Is it? And is it selected when you look at it? Fixing that can change how LrC treats the GPU.
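Independently of GeForce Experience, you can also confirm the GPU model, driver version, and total VRAM with the driver's own nvidia-smi tool. The sketch below assumes nvidia-smi is on your PATH, which it normally is after a GeForce driver install on Windows.

```python
# Minimal sketch: query the GPU via nvidia-smi (installed with the NVIDIA driver).
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "NVIDIA GeForce MX250, 546.33, 2048 MiB"
```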
I have 266 GB free on my C: drive. I increased the cache to 20 GB. I then used GeForce Experience to optimize for LR. I launched LR, and the attached file shows the active settings. But when I click on Lens Blur, it starts analyzing and then the usual error message pops up. When I go back into LR Preferences > Performance, it has turned the Graphics Processor off.
1. Do you still have the original loading previews issue?
2. 266gb free on my C: drive, ahh, but the catalog is on your D drive, correct? How much free space in % on the D drive?
3. In /preferences/performance, where you have the GPU in Custom, can you select Auto? What happens? And then again what happens upon restart of LrC?
4. What type of drive is your D drive? Internal/External, Spinning disk/ Solid state? How large?
1. Do you still have the original loading previews issue? No, the preview loading is very quick when the GP is active. But after running Lens Blur, it deactivates the GP and preview loading takes much longer.
2. 266gb free on my C: drive, ahh, but the catalog is on your D drive, correct? How much free space in % on the D drive? The LR catalog is on the D drive with 1.15 TB of free space.
3. In /preferences/performance, where you have the GPU in Custom, can you select Auto? What happens? And then again what happens upon restart of LrC? I can select Auto, but I get the same error message with Lens Blur and it deactivates the GPU. After restarting LR, the GPU is set to Auto and loading previews is fine. But again, running Lens Blur deactivates the GPU. I never had any issues with LR on this computer before trying Lens Blur, which has been unsuccessful so far, and trying it causes the issue of disabling the GPU.
4. What type of drive is your D drive? Internal/External, Spinning disk/Solid state? How large? The D drive is an SSD; the attached screenshot shows its properties. I also have a catalog on my computer's C: drive, and I have the same issues running Lens Blur there. Thanks.
Ok, one issue is sort of solved: the wait time for the previews. But that Lens Blur oddity. Ugggh.
I would recommend two things:
1. If Lens Blur works (but fouls up the GPU use), then use it sparingly, when you can put up with the annoyance of having to reset the use of the GPU.
2. Create a new post in Discussions.
Not in Bugs; let Adobe decide to move it to Bugs if they get involved and decide a bug is occurring. This discussion would be on the subject "Using Lens Blur Disables Option to Use Graphics Processor" (or something like that).
Describe the steps you take, something like the following (correct anything that is wrong: Custom not Auto, drive D not C, etc.):
1. Initially have the option in /preferences/performance/Use Graphics Processor/ to Auto.
(insert a screen capture of that here)
2. Restart LrC just in case
3. Inspect the option /preferences/performance/Use Graphics Processor/ and I find it is still set to Auto
4. Working on images in the Develop module is OK, with the exception of Lens Blur
5. When I try using Lens Blur, LrC disables the option /preferences/performance/Use Graphics Processor
(insert a screen capture of that here)
6. Is there more? If so, add it to the issue
7. My System Info BEFORE using Lens Blur is as follows:
(insert that info as you have done before)
8. The part of my System Information showing my Graphics Processor Info after trying to use Lens Blur is as follows:
(insert just that bit of info, the latest version of that, something like:
Graphics Processor Info:
DirectX: NVIDIA GeForce MX250 (31.0.15.4633)
Init State: GPU disconnected
User Preference: Auto)
9. My catalog is on a SSD with 31 % Free space
10. My Camera RAW cache is set to 20 GB. It is on drive C
11. As you can see, my GPU driver is v546.33, and per NVIDIA that is the latest as of 1 January 2024. I have only 2 GB of VRAM
12. My camera is a Sony A7R5, which has 60 MB RAW files.
Inquiry: Why is LrC disabling the Use Graphics Processor option? Is there another discussion, or perhaps a Bug posting, on this? Is there an Adobe document on this? Help.
Thank you!! Lens Blur has never worked, so there must be a glitch somewhere between that feature and my Dell computer. Otherwise LR has always run fine on this computer. As you suggest, I will start a new discussion using your outline. Thanks again for your help. Happy New Year.
Lens Blur is still very new and is treated as a feature preview, or more specifically Early Access (you will actually see that on the tool), so there are probably lots of issues.
Hmm, I overlooked that last bit in the Lens Blur panel; you might want to click on it and leave feedback. You might want to do that after creating your new discussion, as you could then include a link to it in the feedback.
Good idea. Thanks!
I started a new discussion here:
The Share Feedback link in LrC just goes to this discussion.