I originally wanted to post about the slow speed of LR's new DeNoise; however, on analysis, I see that my image size is VERY large. Z7 images, when imported to LR, are showing as an 85 MB file size. Any thoughts on why LR is "boosting" the size on import?
I read online:
In its default settings, the Nikon Z7’s maximum resolution output is 8256 x 5504. Each image, depending on the subject, camera settings, and so forth, is in the neighborhood of 17-31MB.
After opening the image in a program like Photoshop or Lightroom, the file size measures a whopping 130MB.
This has turned into a FAQ that Adobe should document.
Raw files contain a single greyscale channel, 16 bits per pixel, representing the color filter array data from the camera sensor. Adobe calls that channel the "mosaic data".
The Enhance command generates a DNG in "linear raw" format, similar to a TIFF or PSD, in which there are three channels, red, green, and blue, each channel 16 bits per pixel. The DNG also includes the original mosaic data, 16 bits per pixel.
So Enhance DNGs will be roughly four times the size of the original raw.
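If it helps to see where the factor of four comes from, here is a rough back-of-envelope sketch in Python. It assumes uncompressed data throughout (real NEF and DNG files are compressed, so sizes on disk will differ), but the 4:1 ratio falls straight out of the channel counts:

```python
# Back-of-envelope size math for a Nikon Z7 frame, assuming uncompressed
# 16-bit data (real NEF/DNG files are compressed, so disk sizes differ).

WIDTH, HEIGHT = 8256, 5504      # Z7 full-resolution output
BYTES_PER_SAMPLE = 2            # 16 bits per pixel per channel

pixels = WIDTH * HEIGHT

# Original raw: one greyscale channel of mosaic (color filter array) data.
mosaic = pixels * 1 * BYTES_PER_SAMPLE

# Enhance DNG: three demosaiced channels (R, G, B), plus the original
# mosaic data embedded alongside them.
linear_dng = pixels * 3 * BYTES_PER_SAMPLE + mosaic

print(f"mosaic data: {mosaic / 1e6:6.1f} MB")      # ~ 90.9 MB
print(f"linear DNG:  {linear_dng / 1e6:6.1f} MB")  # ~363.5 MB
print(f"ratio:       {linear_dng / mosaic:.1f}x")  # 4.0x
```

Incidentally, one 16-bit channel at the Z7's pixel count works out to roughly 86 MiB, which is in the same ballpark as the 85 MB figure in the original post.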
The added issue is that Adobe's new DeNoise is essentially useless with Z7 raw files as processing time exceeds 16 minutes per image.
That processing time is highly dependent on the graphics card, which LR uses to execute the denoise AI models. Newer graphics cards, including less expensive ones, can go much, much faster, e.g. 10-30 seconds. My brand-new Apple Silicon M2 Max MacBook Pro takes about 10 seconds.
But with an 85MB file?
The processing time is dependent almost entirely on the speed of the graphics hardware (GPU). Even inexpensive external disks can write data at 100 MB per second.
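For what it's worth, the arithmetic bears that out (a trivial sketch using the 85 MB figure from the original post):

```python
# Even a modest disk disposes of an 85 MB file in under a second,
# so file size cannot explain processing times measured in minutes.

file_size_mb = 85       # the OP's reported Z7 file size in Lightroom
disk_speed_mb_s = 100   # an inexpensive external disk

print(f"write time: ~{file_size_mb / disk_speed_mb_s:.2f} s")  # ~0.85 s
```

Under a second of I/O can't account for minutes of processing; the remainder is GPU compute.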
Bob, this has all been covered multiple times in other threads - maybe search the forum before starting up again on the subject?
Sorry Keith and thanks for the response.
I shoot a Z7 mainly, and files process in 15 to 20 seconds on my machine, which is a previous-generation M1 Max. On my 2018 work-supplied Intel i5 MacBook Pro with an extremely weak integrated Intel GPU (it actually has less than 2 GB of memory but still does the denoise!), the same images take 2 minutes.
John is correct. With a newish GPU from the last two to three years or so, your 45 megapixel Z7 files should denoise in about 20 seconds.
Since you're on Windows, I would highly recommend an RTX 3060. This is a moderately priced midrange card, but handles Denoise very well. It seems to hit the sweet spot for price/performance.
Good to hear, as I'm strongly considering upgrading my GPU to make DeNoise go faster. There are several different varieties of the RTX 3060, 8GB or 12GB, and other variations as well. Would you recommend the 8GB or the 12GB? Since the 12GB seems to be only about $20 extra, I'm leaning towards the 12GB model.
When I bought mine, it was 12GB all around. I didn't even know 8 was offered by some vendors. I suppose this is all down to how the individual vendors configure their cards.
But with that negligible price difference, I believe this is what they call a "no-brainer" 😉 Go 12.
I am currently running a GTX 1050, and my question is: will the upgrade to the RTX 3060 be a giant upgrade in performance?
Well, if Denoise is the measuring stick, whatever time it takes now above ca 20 seconds - that's your performance upgrade.
But I suspect Denoise is just the beginning. More and more functions will be moved to the GPU, relying on new GPU technology, and sooner or later you need to jump on the bus.
@D Fosse said
But I suspect Denoise is just the beginning. More and more functions will be moved to the GPU, relying on new GPU technology, and sooner or later you need to jump on the bus.
<applause>
I hope everyone reads this and is aware of the implications for the hardware needed.
Yesterday, before my new NVIDIA RTX 3070TI card arrived, I did a carefully timed test on a 10,000 ISO image with lots of noise. The rig is an i7-8700K, 64 GB RAM and Win 10 Pro. I had the noise reduction setting at 59. The first test was with the GTX 1660S in the computer, the second with the new RTX 3070TI:
1660S - Expected time 1 min, Actual time 1:10
3070TI - Expected time 10 sec, Actual time 11 sec.
I had previously run the test on my backup rig (i7-3930, 32 GB, GTX 970 GPU) and the time was more than 15 minutes.
The exact same image was used for the comparison.
The 3070TI times are very similar to what I'm getting on the same image on my MBP 16" 64GB M1 Max that I principally use for travel. The image file was from my Z9, same size in pixels as Z7.
That's almost twice as fast as my 3060 (for the same MP count) - so now we're waiting for numbers from folks with 3080s and 3090s 😉
But seriously, this is of course not a race about seconds. What it again shows, however, is that there is a whole paradigm shift between "old" and "new" GPUs. This draws on technology that came on the market a couple of years ago, and older GPUs are left behind. Your timings for the 1660 correspond very closely with what I get on my other machine with a Quadro P2200 from about the same generation.
So the best advice we can give is get a new GPU if it's older than, say, three years or so.
Yes, of course a much more powerful GPU will result in an improvement in performance. DeNoise is the most computationally intensive task in Lightroom Classic and relies heavily (entirely?) on the GPU. As for benchmarks for these two graphics cards (not sure if these benchmarks have a relationship to DeNoise, but...): the GTX 1050 scores 5138 and the RTX 3060 scores 17204 (both according to videocardbenchmark.net), roughly a 3.3x difference.
What you asked about was a GIANT upgrade in performance, and that's too vague to give an answer to.
OK...I am convinced that a new card is in order.
In addition to the very correct answer about size (the Enhance command generates a DNG in "linear raw**" format, similar to a TIFF or PSD, in which there are three channels, red, green, and blue, each channel 16 bits per pixel), the original raw is then converted to lossy DNG and also embedded into that new DNG. Of course, this new, bigger size takes place after Denoise, so the 'size' isn't, per se, the reason it is so slow for you; the main reason is that you have an underpowered GPU!
On my 2022 MacBook Pro* an original 24MP raw takes a mere 12 seconds, start to finish (which, again, includes stuffing the original into the new DNG and writing it to disk).
*MBP Pro 16" 2022 32-Core GPU | 16-Core Neural Engine/64GB
** See: http://www.barrypearson.co.uk/articles/dng/linear.htm
Any thoughts on why LR is "boosting" the size on import?
I read online:
“…17-31MB. After opening the image in a program like Photoshop or Lightroom, the file size measures a whopping 130MB…”
By @Heirloom Bob
To add to John’s correct answer, you should know that this file size increase from raw to RGB is nothing new, and nothing even to do with Adobe software. This happens in any software that converts raw to RGB, from any developer, because of pure math.
The “larger” file size is actually the real, natural file size for an (uncompressed) RGB image that you can easily share, use in other applications, etc. That 130MB file is what you get after multiplying number of pixels * number of bits per pixel * number of channels. And for example if the software converts from raw to CMYK (which Camera Raw can do), the expansion to four channels instead of three increases the file size that much more.
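Spelled out for the Z7's 8256 x 5504 pixels, that multiplication looks like this (a sketch assuming 8 bits per channel, which is what the quoted 130MB figure appears to correspond to; at 16 bits per channel, everything doubles):

```python
# pixels * channels * bytes-per-channel, for an uncompressed image.
# Assumes 8 bits (1 byte) per channel; a 16-bit image would be twice as big.

WIDTH, HEIGHT = 8256, 5504   # Nikon Z7
pixels = WIDTH * HEIGHT      # ~45.4 megapixels

rgb = pixels * 3 * 1         # bytes: 3 channels (R, G, B)
cmyk = pixels * 4 * 1        # bytes: 4 channels (C, M, Y, K)

MiB = 1024 * 1024
print(f"RGB,  8-bit: {rgb / MiB:5.0f} MB")   # ~130 MB
print(f"CMYK, 8-bit: {cmyk / MiB:5.0f} MB")  # ~173 MB
```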
Raw files are small, but that’s because their not-yet-interpreted single channel raw data stream is not recognized as a usable image by most other applications (can’t use raw in most software like a video editor, Adobe Illustrator or InDesign, can’t display on a web page, etc.).