Exposure & Brightness on final photo edit.

New Here, Sep 16, 2022

Hi there, I have a question for you which I hope someone will be able to answer!


When I’m editing an image in Lightroom, I turn my monitor’s brightness up to full (or nearly full) on my MacBook Pro so that I can see and adjust the shadows etc.

I generally do my photo editing in a room with ‘average’ lighting conditions i.e. not near a sunny window nor at night with hardly any ambient light.

For simplicity’s sake let’s leave color balance & HSL out of the equation and concentrate purely on brightness, contrast and exposure.

I adjust the brightness, contrast and exposure of my image so that it looks correct on my MacBook screen.

So here’s my question:

The image brightness and exposure levels look perfect on my screen and the image looks great.

If I were to save that image as a high-quality JPEG and then give it to my sister, for example, it may appear too dark or too bright because she’ll be viewing it on a different monitor. And if she decides to have it printed, it’s anyone’s guess how the print will turn out!

If you were editing an image, what brightness level would you set your monitor to so as to represent the ’true’ exposure level you have applied to the image?

Any suggestions you can provide would be really appreciated!

P.S. I regularly calibrate my monitor using Datacolor SpyderX Pro.

TOPICS: macOS


1 Correct answer: Community Expert, Sep 17, 2022 (quoted in full below)
Community Expert, Sep 17, 2022

When you calibrate your monitor, you should set brightness to 100 - 120 candela per square meter in the calibration software.

Then leave the monitor alone, don't change brightness, contrast, or anything else! Doing so will invalidate the calibration, and defeats the purpose of calibrating.

 

Turning up the brightness when editing will make the image look much brighter than it actually is, and if you send it to someone else, it will most likely appear very dark on their monitor, especially if the monitor is calibrated.

It's always anyone's guess how an image will look on other people's monitors, but your workflow makes it much more likely that it will appear too dark.
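The size of that mismatch can be put in photographic terms. As a rough sketch (the luminance figures below are assumptions for illustration, not measurements from this thread; actual MacBook Pro peak brightness varies by model):

```python
import math

# Assumed luminance figures in cd/m^2 (hypothetical values for illustration).
calibrated_white = 110   # midpoint of the recommended 100-120 cd/m^2 target
full_brightness = 500    # rough full-brightness SDR figure for a recent MacBook Pro

# Each photographic stop doubles luminance, so the gap in stops is a log2 ratio.
stops = math.log2(full_brightness / calibrated_white)
print(f"Editing at full brightness is ~{stops:.1f} stops above the calibrated target")
```

Under those assumed numbers the editing display sits a bit over two stops brighter than the calibrated target, which is why images edited that way tend to come out dark everywhere else.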

 

If your images are too dark, you have to make them brighter by editing, not by changing monitor brightness.

Also make sure that you're not underexposing in the camera. Underexposure will lead to lost shadow detail and increased noise when you brighten the image during editing.
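The noise penalty of brightening in post can be sketched with a toy sensor model (an assumption for illustration, not Lightroom's actual pipeline): noise is amplified by exactly the same factor as the signal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (hypothetical): a weak captured signal of 10 units plus
# Gaussian read noise with standard deviation 2.
signal = 10.0
captured = signal + rng.normal(0.0, 2.0, size=100_000)

# Brightening two stops in post multiplies signal AND noise by 4,
# so the noise standard deviation quadruples along with the brightness.
brightened = captured * 4.0

print(f"noise std before: {captured.std():.2f}, after: {brightened.std():.2f}")
```

Exposing properly in camera raises the signal without scaling up the read noise, which is why it beats rescuing a dark frame in the editor.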

See Exposing a digital image

 

Community Expert, Sep 17, 2022

In addition to Per's very good and extensive answer: also make sure that any monitor feature offering some kind of automatic/dynamic adaptation to ambient brightness, dynamic contrast, or the like is turned off. That often gets forgotten, and as a result, even on a calibrated monitor, the results are not as expected.

Community Expert, Sep 17, 2022

Luckily, there is a visual reference we all share: a white sheet of paper.

 

You should calibrate your monitor so that monitor white is a visual match to paper white. For most "normal" conditions, that corresponds to a monitor white point at approximately 100 - 120 cd/m², as Per indicates.

 

This takes into consideration ambient light and general working environment, as well as intended print viewing light. That's what makes it a good common reference. You can make it as generic or specific as you like. You can have different calibration targets for different types of paper.

 

It's a very common mistake to crank the monitor up too bright, and most monitors out of the box are much too bright. There might be a case for that for video/HDR, but for photographic use you can't go wrong by using paper white as visual reference. That means somewhere around 100 - 120 cd/m².

 

 

New Here, Sep 24, 2022
Thank you to everyone who replied to my post. I'm trying to keep things as uncomplicated as possible. The visual match with white paper has worked well for me. I have also taken on board the other helpful suggestions in this post.

Thank you all.

Community Expert, Sep 17, 2022

First, nothing in any Adobe raw converter shows you anything about actual exposure, which takes place only at capture. That requires a raw histogram, à la RawDigger (see https://www.rawdigger.com).

Your current histogram shows you information about the brightness of the image based on the current rendering edits applied to that raw data, which may look fine, too dark, too light, etc. How it looks visually depends, of course, on how you calibrate and profile that display.

Now the sticky part: you cannot control how others see your images! You can't control whether they have calibrated their displays, whether their displays are anything like yours, whether they are using color management, etc.

You can attempt a decent overall compromise of ideal settings. This is a good article to start with on setting up a digital darkroom for that task:

http://lumita.com/site_media/work/whitepapers/files/calibrating_digital_darkroom.pdf

The histogram isn't a bad way to at least view the current renderings, with a few caveats* in deciphering what it shows:

 

Everything you thought you wanted to know about Histograms
Another exhaustive 40-minute video examining:
What are histograms? In Photoshop, ACR, Lightroom.
Histograms: clipping color and tones, color spaces, and color gamut.
Histogram and Photoshop’s Level’s command.
Histograms don’t tell us our images are good (examples).
*Misconceptions about histograms. How they lie.
Histograms and Expose To The Right (ETTR).
Are histograms useful and if so, how?

Low rez (YouTube): http://www.youtube.com/watch?v=EjPsP4HhHhE
High rez: http://digitaldog.net/files/Histogram_Video.mov

Author, “Color Management for Photographers” and “Photoshop CC Color Management” (Pluralsight)
