Exposure difference between devices

New Here, Mar 16, 2021

Hello, 

I've been having this problem for ages and have put up with it, but it's time to sort it out so I can progress.

I edit my photos in Lightroom Classic on my laptop. When I sync them to mobile, they look dark and need re-editing; viewed again on the laptop, they are overexposed. When importing into Lightroom Classic, my photos automatically go darker, so maybe that's something to look at. But,

I have the same issue with videos. I had to keep 'guess' editing in Premiere Pro, exporting to a memory stick and checking on the TV until it looked good there, as the video was being shown on a local TV station. After the edits for TV it also looked OK on my mobile, but still bright on the laptop. Whatever I look at on both screens, it's the same issue. I know there are differences between screens. My friend has all Apple products and they are all fine, of course, and they also look fine on his TVs. I'm using a Dell gaming laptop, not a cheap one. My screen brightness is only on 20; I've seen in other posts that that may be the issue, so I tried lowering it to check, and my colour profile is sRGB. I'm no wizard, unfortunately, so please bear with me.

Many thanks in advance for reading and any suggestions.

Richard 

Community Expert, Mar 18, 2021

For the brightness to match across devices, they all have to literally be set to the same luminance. But in the real world they never are, because you can never control other people’s computers, phones, and TVs. And that doesn’t even account for how many devices can now sense ambient light and automatically change display brightness and color to compensate, so it’s always changing.

What industries do is set standards so that if different people involved in the same project use the standard, at least they are viewing under the same conditions. For commercial printing, jobs are done on displays calibrated to specific luminance and color specs, and prints are evaluated in viewing booths that are also using very specific lighting. That way everybody sees them the same way for approvals, even if they’re in different cities. They agreed on a viewing standard, sometimes under contract. Of course, once sent into the world, the prints look different under all the different lighting conditions in all the newsstands and living rooms, but that’s out of their control so they don’t worry about that.

Same with TV. Working video pros will use a “reference monitor” tightly calibrated to a specific broadcast or production standard, which includes a specific luminance and color gamut. You edit to that, then you let it go and know it will vary on different screens. But if you are editing on a reference monitor, at least you know that you are doing it under the same viewing conditions as any other pro working according to the same standard, like the people at the TV station.

So the short version is to decide or find out what standard you want to (or are required to) work under, profile and if possible calibrate your production display to that, and let it go.

For general work (family, social media, TV), sRGB at 120 to 150 cd/m² is an OK standard. You would verify that on your display using a USB display profiler/calibrator. You cannot verify it using the controls on most affordable displays; for example, the brightness scale is not consistent across computer and mobile device displays. I say it's an OK standard because it isn't any industry's specific standard, but it is close enough for most general consumer uses except color-critical printing.
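
If it helps to see what the sRGB part of that standard pins down beyond luminance, here is a small Python sketch (purely illustrative; not anything Lightroom actually runs) of the sRGB transfer curve that a correct monitor profile describes:

```python
# Illustrative only: the sRGB transfer curve per IEC 61966-2-1.
# A monitor profile describes a curve like this; if the profile's curve is
# wrong, every color-managed app renders tones too dark or too bright.

def srgb_to_linear(v: float) -> float:
    """sRGB-encoded value (0..1) -> linear light (0..1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# Mid-gray in sRGB encoding (0.5) is only about 21% linear light, which is
# why small errors in the curve are so visible in the midtones.
print(round(srgb_to_linear(0.5), 3))  # -> 0.214
```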

Community Expert, Mar 18, 2021

I seem to read something between the lines here: There is a substantial difference between Lightroom on one hand, and "outside" Lightroom on the other - but on the same machine. Do I get that right? "Outside" Lightroom then probably means the Windows "Photos" app? Still right?

In that case it's a bad/incorrect monitor profile. The monitor profile is used by Lightroom and other color-managed applications, but not by Windows "Photos", which does not support color management at all. It ignores the monitor profile completely, so it will be entirely unaffected by a bad profile.

Dell is notorious for shipping bad monitor profiles with their laptops and monitors (along with some other manufacturers). The problem is that these profiles are distributed through Windows Update. Ideally you will use a calibrator to make your profiles, but if not, you may get these bad profiles without knowing what hit you.

A bad monitor profile can give all kinds of weird symptoms. Mostly it affects color, but sometimes the tone mapping is completely off too.
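
To make that concrete, here is a hypothetical worked example in Python (the gamma numbers are invented for illustration, not taken from any real Dell profile). If the profile claims a lighter tone curve than the panel really has, a color-managed app like Lightroom renders midtones darker, while a non-managed app is untouched:

```python
# Hypothetical numbers, for illustration only.
PROFILE_GAMMA = 1.8   # what a bad profile might claim the panel does
PANEL_GAMMA   = 2.2   # what the panel actually does

target = 0.5                              # intended linear light (a midtone)
encoded = target ** (1 / PROFILE_GAMMA)   # color-managed app encodes per profile
displayed = encoded ** PANEL_GAMMA        # panel decodes with its real curve

print(f"intended {target:.3f} -> displayed {displayed:.3f}")  # ~0.429: darker
# A non-color-managed app sends pixel values straight to the panel, so it is
# unaffected, which is exactly why the two apps no longer match.
```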

To test that, replace your current profile with sRGB IEC61966-2.1. Relaunch Lightroom when done; it loads the profile at application startup:

[attached screenshot: Displayprofile_20_3.png]
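
If you want to see which profiles are installed before you swap, here is a small Python sketch (Windows-only; the path is the standard system color directory). The swap itself is done in the Color Management control panel (run colorcpl.exe): Devices tab, select your display, tick "Use my settings for this device", Add..., choose sRGB IEC61966-2.1, then Set as Default Profile.

```python
# List the ICC/ICM profiles installed in Windows' system color directory.
import os
from pathlib import Path

color_dir = Path(os.environ["WINDIR"]) / "System32" / "spool" / "drivers" / "color"
for profile in sorted(color_dir.glob("*.ic[cm]")):
    print(profile.name)
```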
