anthonynowack
Participating Frequently
April 7, 2023

P: (Windows - A750 & A770 GPU) New Denoise turning images black and white


Using the new AI Denoise feature seems to darken my images and strip almost all the color out of them. Files are from a Nikon D500, processed on a Ryzen 5 3600 with 32GB RAM and an Intel A750 video card on Windows 11. I tried creating a virtual copy of the file and resetting all other adjustments, but it makes no difference: the result is still dark and almost completely black and white.

94 replies

Inspiring
May 10, 2023
Not sure how to remove it. I would like to. I didn't realize that my email suffix was added to the post.
TheDigitalDog
Inspiring
May 10, 2023
Quoting Patrick Garrett:

The workaround seems a poor idea. I'll continue to use Topaz Photo AI, which works well and quickly with no modification to my drivers, until LrC is fixed.

A really poor idea is exposing your phone and email address to the web spiders. You might consider editing your post.

Author, “Color Management for Photographers” & “Photoshop CC Color Management” (Pluralsight)
Inspiring
May 10, 2023

The workaround seems a poor idea. I'll continue to use Topaz Photo AI, which works well and quickly with no modification to my drivers, until LrC is fixed.

--
Patrick Garrett

Rick O
Participant
May 10, 2023

Thanks for the response - the screenshot really helped. You were right - it was easy! Got Denoise working.

Legend
May 10, 2023

Disabling the Arc GPU won't be a good idea for those who do not have an alternative integrated GPU on their motherboard.

 

For those who do have an alternative GPU, disabling the Arc GPU in Device Manager may not be necessary. You can use Windows to set the GPU that LrC alone will use:

 

1. Open Settings > System > Display.

2. Click Graphics settings.

3. Under Graphics performance preference, ensure Desktop app is selected in the Choose an app to set preference selector, then click Browse.

4. Navigate to C:\Program Files\Adobe\Adobe Lightroom Classic, select Lightroom.exe and click Add.

5. A panel will appear for Adobe Lightroom Classic with Options and Remove buttons.

6. Click Options, choose the integrated GPU (likely the Power saving option), then click Save.

7. Close Settings. No reboot is needed.
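
The steps above map to a documented per-user registry value, so the same preference can be set or cleared without the Settings UI. This is just a sketch of the equivalent, not something verified in this thread (the path below assumes a default Lightroom Classic install location):

```
Key:   HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences
Name:  C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe
Type:  REG_SZ
Data:  GpuPreference=1;
```

Here a data value of 1 selects the Power saving GPU and 2 the High performance one; deleting the value is the equivalent of clicking Remove.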

 

Now LrC will use your integrated GPU while everything else will use the Arc GPU.

 

When this issue is fixed, go back to the Graphics performance preference and Remove the Adobe Lightroom Classic preference.

 

My only concern is that LrC has some very serious problems using Intel Integrated GPUs - my own always fails.

Participating Frequently
May 9, 2023

Hi Rick,

Here is a screenshot of how I disabled the Arc GPU on my machine. It is actually very easy: just click "Disable device" and reboot the system. I totally understand your reluctance to modify settings; for me, I needed to finish a project and this got me back up and running.

If you do this, my understanding is that the Arc GPU will no longer be active. You will be using the video capabilities built into your system; for me this is the Intel Iris Xe integrated graphics controller.

For Lightroom, I am actually seeing the same, or slightly better, performance with the Arc GPU turned off. I think this is because the integrated graphics controller is fairly decent and works properly with LR, while the more powerful Arc GPU is not functioning properly, so it is not accelerating anything that I can see in LR; it is just getting in the way. I have not tried it in any other Adobe programs, so I can't speak to how it works with those.

For other GPU-intensive programs where the Arc GPU is working properly (video editing, gaming, ...), I am sure disabling it is a big hit to performance. For everyday computing, though, the integrated graphics controller is all most people have, so it should be fine.

Also, I am not sure the restart is actually needed after disabling/enabling the driver; it's just a habit I have from a different century :)

I hope this is of some help.

Joe
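
For anyone who prefers the command line, the Device Manager clicks Joe describes can also be done with Windows' built-in pnputil tool from an elevated prompt. This is a hedged sketch rather than something tested in this thread, and the instance ID shown is a placeholder you would copy from the enum output:

```
rem List display adapters and note the Arc GPU's instance ID (Windows 10 2004 or later)
pnputil /enum-devices /class Display

rem Disable the Arc GPU (replace the placeholder ID with the one from the listing)
pnputil /disable-device "PCI\VEN_8086&DEV_xxxx\..."

rem Re-enable it once fixed drivers ship
pnputil /enable-device "PCI\VEN_8086&DEV_xxxx\..."
```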


Rick O
Participant
May 9, 2023

Thanks for the workaround suggestion, but I'm reluctant to muck around in Device Manager. What will disabling the GPU driver do to overall LR performance? Or to the rest of my laptop, for that matter? I suspect it won't be good. Also, disabling the driver in Device Manager is not straightforward. Is the procedure: Device Manager > System Devices > Intel Graphics System Controller Firmware Interface > Driver tab > Disable, then reboot? What about the Intel Graphics System Controller Auxiliary Firmware Interface? So I do this, use Denoise, then go back to Device Manager, re-enable the driver, reboot, etc.? Hello Adobe, do you have enough info to fix this now?

Participating Frequently
May 9, 2023

Hi All, 

I have a poor but workable way to get Denoise running and giving amazing-looking, if somewhat slow, results. This is a stopgap method that I used to get my system working much better until the drivers get updated. It may or may not help with yours... try at your own risk...

On my i7 with the Arc A370M GPU, I was able to get proper colors by disabling the driver for the A370M in Device Manager. I also set the GPU to "Auto" in the Performance tab of Lightroom's Preferences. Interestingly, the GPU is still listed, and you can still see it taking load when processing pictures.

After a system restart, Denoise is now working properly and showing a processing time estimate of ~3 minutes per picture; it was 30+ minutes with the GPU driver enabled (and the output was all BLUE!).

Export processing times are also not too bad: ~30 seconds to output 50 edited pictures at 100% JPG quality. This is similar to times with the driver enabled.

I hope this is helpful and I really hope we have proper drivers very soon!

Lightroom ROCKS!

Participant
May 5, 2023

 

Participating Frequently
May 5, 2023

Same issue with Denoise here :(

 

Windows 11 10.0.22621 Build 22621

i7-12700H, 2300 MHz, 14 Core(s), 20 Logical Processor(s)

Arc(TM) A370M Graphics, Driver Version 31.0.101.4335

Thank you for sorting this out for us!