Inspiring
January 29, 2018
Question

Lightroom on MacBook Pro with 4K HDR monitor

  • January 29, 2018
  • 8 replies
  • 8799 views

I recently purchased a 2017 MacBook Pro and I am now considering monitors. There are not many that are Thunderbolt 3/USB-C compatible. However, the few choices include not only 4K but also 4K with HDR. I know Lightroom and Photoshop support 4K monitors, but what is the feedback on 4K HDR monitors? Is there any benefit to HDR? Are the applications compatible with HDR? Curious if anyone out there uses a 4K HDR monitor and whether they believe it's worth it.

Thanks

    8 replies

    Darcnes
    Participant
    April 24, 2018

    I think there might be a few here missing the point of HDR. It's not simply more brightness, though brightness is part of what's needed to present the benefits of HDR over SDR.

    There's really no difference in the long run, philosophically speaking, between the print vs screen debate and the SDR screen vs HDR screen reality. It's effectively the same image, and SDR simply doesn't get to show all of that image. Much like high quality sound in low end headphones: you simply can't hear the nuance, even though it exists in the source.

    I can't say for sure how this plays out in the realm of practical photography, but don't forget that every single video out there is nothing but a series of still frames, so it stands to reason this has an impact here as well.

    As for what that difference is, seeing is believing is kind of the nature of the beast. Trying to show off or explain to SDR users how vastly improved an HDR image is over its lesser brethren is futile until you can see it for yourself on an HDR screen in front of your face.

    D Fosse
    Community Expert
    April 24, 2018

    You can increase a monitor's dynamic range either by deepening the blacks, or by brightening the whites. If you google HDR monitors, you'll see that they all go for the latter. Some boast max brightness up to 550 cd/m². This is all fine if you work in a closed loop and no one else ever needs to see these images on standard monitors - and they never need to be printed.
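
    To put rough numbers on that trade-off, here is a minimal contrast-ratio sketch in Python; the black-level figures are illustrative assumptions, not measurements of any particular monitor.

        # Contrast ratio = white luminance / black luminance, both in cd/m².
        def contrast_ratio(white_cdm2: float, black_cdm2: float) -> float:
            return white_cdm2 / black_cdm2

        # A print-matched SDR setup: ~120 cd/m² white over an assumed 0.12 cd/m² black.
        print(f"SDR-ish: {contrast_ratio(120, 0.12):.0f}:1")   # 1000:1

        # A 550 cd/m² HDR panel only gains real dynamic range if the black point
        # also stays low (0.05 cd/m² assumed here purely for illustration).
        print(f"HDR-ish: {contrast_ratio(550, 0.05):.0f}:1")   # 11000:1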

    What has to be clear is that you cannot prepare an image on an HDR monitor and expect it to look perfectly normal on a standard monitor. It won't; it will look unbearably dull and muted. It's not as if you "unlock" the hidden extra quality just by looking at it on an extended dynamic range device.

    There's a reason the recommended white point luminance is in the 100-120 cd/m² range: it happens to correspond with how you see white paper in good, artificial light. That means it translates to print the way you see it on screen.

    Yes, if you're working outside in bright sunshine it might make sense to have white point up in the 550 cd/m² range. Then you'd see white paper the same way, and again they would translate nicely. But most of us are indoors.

    An image file has a certain tone response curve that corresponds to "normal" viewing conditions. It's possible that HDR displays could work if the standard TRC was also dramatically altered, thus preserving the "normal" range as it is, while adding the extra highlight capacity on top. I suppose that could be done if new standards were adopted, but at present that's not how it works.
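
    As a rough illustration of that "normal range plus extra headroom" idea, below is a small sketch comparing an SDR-style gamma curve with the PQ (SMPTE ST 2084) curve used by common HDR video standards. The PQ constants are the published ones; the 100 cd/m² SDR peak and the gamma of 2.2 are assumptions chosen purely for illustration.

        # Map a normalized code value (0..1) to luminance in cd/m² under two
        # transfer curves: a simple SDR gamma and the HDR PQ (ST 2084) curve.

        def sdr_luminance(v: float, peak: float = 100.0, gamma: float = 2.2) -> float:
            # Assumed SDR display: 100 cd/m² peak white, pure gamma 2.2.
            return peak * (v ** gamma)

        def pq_luminance(v: float) -> float:
            # Published SMPTE ST 2084 constants; absolute output up to 10,000 cd/m².
            m1, m2 = 0.1593017578125, 78.84375
            c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
            p = v ** (1.0 / m2)
            return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

        for code in (0.25, 0.5, 0.75, 1.0):
            print(f"code {code:.2f}: SDR {sdr_luminance(code):7.2f} cd/m²   "
                  f"PQ {pq_luminance(code):8.2f} cd/m²")
        # Roughly the lower half of the PQ code range covers the whole SDR
        # luminance range; the upper half is reserved for highlights far
        # beyond SDR white.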

    Darcnes
    Participant
    April 24, 2018

    You've mentioned the elevated luminance several times now as if that's the point of the technology, whereas it is but a means to an end.

    It must be taken into consideration that the entire point of HDR is to create and present a more realistic image. As several have mentioned already, the transition from paper to panel is well under way. Magazines, newspapers, et al., are considered a dead and dying medium (no tree pun intended). As such, HDR standards have been developing for years, and though it's a rather competitive space right now, the advent and progression of open standards will likely settle this space down before too long. Don't be taken aback by the (video) bit in the title there; a bit of reading will quickly reveal there are two sides to the coin, both display AND capture, and that's where photographers need to take heed.

    So yes, while there is and has definitely been a focus on a specific luminance level to recreate a realistic representation for the user, the definition of the desired level of realism has shifted dramatically.

    As for the elevated luminance, a brighter image, a "brighter white on paper indoors", is not the point. In order to create a more realistic image, even indoors, we needed to more accurately replicate the eye's ability to perceive multiple levels of brightness at the same time, so that one part of the image does not blow out another. Thus we see standards with nearly triple the range of conventional displays, accommodating both the brighter highs AND darker lows simultaneously, with many more steps in between.

    While the previous target might have been a white sheet of paper in a lit indoor room, the new target is seeing that room on a screen, as realistically as possible.

    As for an HDR image being muted on an SDR display, that seems to be entirely up to the point of capture, and likewise how those captures are translated before hitting the display. It's not a simple matter of fact that it will look washed out; we just happen to have a couple of good examples of bad implementations.

    Community Expert
    March 29, 2018

    I am pretty sure you need 10-bit color support in the apps to support HDR. Photoshop and Premiere Pro (for obvious reasons as you need to be able to edit HDR video) can do this but I am pretty sure Lightroom has no support whatsoever for this.
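
    The bit-depth point is easy to put numbers on. The sketch below is just the tonal-step arithmetic, not a statement about what any specific application actually supports.

        # Tonal levels per channel at a given bit depth: 2 ** bits.
        for bits in (8, 10, 12):
            print(f"{bits}-bit: {2 ** bits} levels per channel")
        # 8-bit: 256, 10-bit: 1024, 12-bit: 4096. Spreading a much wider
        # luminance range over only 256 steps makes banding more likely,
        # hence the usual pairing of HDR with 10-bit (or deeper) pipelines.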

    D Fosse
    Community Expert
    March 30, 2018

    Video path bit depth isn't the same as data bit depth.

    What the OP refers to is HDR monitors, which just means extreme brightness - in excess of 500 cd/m²! - while maintaining a reasonable black point. Nothing stops that in a standard 8-bit video path.

    The problem, of course, is that images prepared under these conditions will make no sense to anyone else.

    Participating Frequently
    March 29, 2018

    I have just found LG and ViewSonic 4K monitors for the MacBook Pro with a USB-C port. The ViewSonic is somewhat costlier than the LG.

    D Fosse
    Community Expert
    January 29, 2018

    Well, be my guest, if you want to work in your own private bubble. But don't try to publish your photos anywhere. They won't make any sense to anyone else - not in print, and not on anyone else's screens either.

    What this means in practice is that you will have a monitor that blasts out 500 cd/m² or more. That's where the "high dynamic range" comes from. You won't exactly be on the same page as anyone else.

    Don't say you weren't warned.

    If you want to be on the cutting edge, pushing new technology forward and into the paper-less future, that's fine. But it's a very  l o n g  way to go. In the meantime, I prefer to be in the same world as everyone else.

    ejg1890 (Author)
    Inspiring
    January 29, 2018

    D Fosse

    I don't disagree with you on the HDR monitor to print issue. I am saying there is a paradigm shift toward viewing photos on a screen, not in print. This is the case for many (not all) professional photographers as well.

    elie_dinur
    Participating Frequently
    January 30, 2018

    But what D Fosse has also said - and you haven't addressed it - is that very few of your web viewers will be using calibrated, profiled, and color managed HDR monitors. Only a minority of today's viewers have a wide gamut monitor (although those have been around for much longer), which is why the current advice is to reduce the gamut of your web postings to sRGB. Will you upload "as is" photos that look great on your end to Flickr, so that on my six year old monitor I see blown whites and blocked up shadows? Being in the tech vanguard may sound attractive; nevertheless you will sometimes need to consider exactly where that state-of-the-art "paradigm shift of viewing photos on a screen" really is (to say nothing of the limitations of human vision until the day when we all have USB slots on our foreheads).
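
    For the sRGB-for-web point, here is a minimal sketch of that conversion step using Pillow's ImageCms module; the file names are placeholders, and it assumes the source image carries an embedded wide-gamut ICC profile.

        import io
        from PIL import Image, ImageCms

        im = Image.open("wide_gamut_photo.tif")      # placeholder input name
        embedded_icc = im.info.get("icc_profile")    # profile embedded in the file, if any
        if embedded_icc:
            src_profile = ImageCms.ImageCmsProfile(io.BytesIO(embedded_icc))
            srgb_profile = ImageCms.createProfile("sRGB")
            im = ImageCms.profileToProfile(im, src_profile, srgb_profile, outputMode="RGB")
        im.save("web_ready_photo.jpg", quality=90)   # placeholder output name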

    D Fosse
    Community Expert
    January 29, 2018

    The very concept of HDR monitors for photography is, pardon my French, ridiculous. The whole paradigm of looking at a photo is based on a printed photo on a piece of paper. There's no reason to assume that will change anytime soon - at least not until the day our brains are all online and there's no longer any need for physical media.

    Until then, what you need from a monitor is the ability to accurately preview a print. That's what the word monitor means.

    A good inkjet print on top grade glossy paper has a contrast range of about 300:1. So why do you need a contrast range of 1000:1 or 2000:1 or above in a monitor?
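
    Expressed in the stops photographers usually think in, those contrast ratios work out as below; the conversion is simply log2 of the ratio.

        import math

        # Photographic stops spanned by a contrast ratio: stops = log2(ratio).
        for label, ratio in [("glossy inkjet print", 300),
                             ("typical monitor", 1000),
                             ("HDR-class monitor", 2000)]:
            print(f"{label}: {ratio}:1 ≈ {math.log2(ratio):.1f} stops")
        # Roughly 8.2 stops for the print vs about 10 to 11 for the monitors.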

    Any monitor out of the box is already too bright, already with too high contrast. It's already too much HDR. The references we all share for a photograph are paper white and maximum ink. Those are the black and white output endpoints.

    For an existing monitor to reproduce that, you usually have to dial brightness way down and raise the black point (if adjustable) almost as much. For most normal scenarios, a white point of around 120 cd/m² is about right. I just saw an HDR monitor advertise a white point of 550 cd/m²!!! Have your UV protection sunglasses ready, gentlemen...

    Gamers may have some fun with this. In the real world...well, not all technology advances serve any sensible purpose. This one certainly doesn't.

    dj_paige
    Legend
    January 29, 2018

    https://forums.adobe.com/people/D+Fosse  wrote

    The very concept of HDR monitors for photography is, pardon my french, ridiculous. The whole paradigm of looking at a photo is based on a printed photo on a piece of paper. There's no reason to assume that will change anytime soon - at least not until the day our brains are all online and there's no longer any need for physical media.

    Until then, what you need from a monitor is the ability to accurately preview a print.

    In my mind, there is a different paradigm. My paradigm is that many (nearly all) of my photos are not destined to be printed, that we view them only on a computer monitor, and so the monitor doesn't have to accurately preview a print. Could there not be monitors that are built to this paradigm?

    D Fosse
    Community Expert
    January 29, 2018

    The point is that it's the only reference we all share. If we all stick to that as a reference, we have common ground and can share files and we all see the same.

    Even if you only look at your images on screen - you do want to be able to share those images with others, right?

    Hal P Anderson
    Inspiring
    January 29, 2018

    Paige is right. You definitely don't need and don't want 1000 cd/m² for photography. That's about an order of magnitude above what's recommended. Colour isn't likely to be accurate, either.

    Hal

    ejg1890 (Author)
    Inspiring
    January 29, 2018

    I did do a little bit of research over lunch, and it does appear several companies have recently released, or are about to release, 4K HDR monitors directed towards photographers and video editors, including Dell, LG, ViewSonic and BenQ. The brightest these go is 350 cd/m². (I have no idea on print vs screen.) The MSRP varies greatly, from the Dell up to the BenQ. The most interesting to investigate is the ViewSonic, which includes USB-C with power delivery. The BenQ does not deliver power, while the Dell does not have USB-C, but it's also by far the cheapest of the four I found.

    Jim

    dj_paige
    Legend
    January 29, 2018

    ejg1890

    Since these monitors are new, and we're all learning, can you provide a link where it says that these new 4K HDR monitors are for photography? My own use of Google finds only vague, unsupported, and indirect mentions of using an HDR monitor for photography, usually by some website reviewer rather than the manufacturer. If these HDR monitors were truly useful in photography, I would expect Google to turn up more than I found.

    Just Shoot Me
    Legend
    January 29, 2018

    You'll need to buy one of the Apple dongles that convert from USB-C to whatever input the monitor has.

    ejg1890 (Author)
    Inspiring
    January 29, 2018

    Thanks for the reply.

    Correct, unless I get a monitor with Thunderbolt 3 or USB-C. I just want to determine if there is a benefit to HDR for photos and whether Lightroom/Photoshop supports the HDR capability. I know there are benefits for video - gaming, video editing, movies, etc. Is there a benefit for photos, and does Lightroom/Photoshop support the capabilities?

    Thanks again.

    dj_paige
    Legend
    January 29, 2018

    As I understand things, HDR monitors are for gaming. I don't think they help at all with photography.