
HDR 203 vs HDR 300

New Here, Oct 23, 2025

I want to make the most of my HDR workflow and would like some guidance on choosing the right color space for my projects. My HDR workflow is a Sony ZV-E1 shooting XAVC HS 4K @ 100M, 4:2:2 10-bit color -> Premiere Pro 2025, editing on a Windows 11 workstation with HDR turned on (in Windows and on the monitor). I'm primarily editing for mobile/YouTube.

 

I ask because my shots often aim for soft afternoon lighting, like the shot below, and having as much dynamic range as possible makes that effect better. Would HDR 300 make those sunlit highlights pop brighter on an HDR-capable mobile device? Would HDR 300 give me more dynamic range? Is there a downside to selecting HDR 300 in a project's color settings?

 

[attached image: potsked_0-1761236413235.png]

 

TOPICS: Editing, Formats
New Here, Oct 23, 2025

Forgot to mention: I record in Log and use four different adjustment layers with tuned LUTs that give me fine-grained color grading control. The last of the LUTs is a Rec.709 conversion. Typing this out makes me realize that I'm probably not exporting in HDR, so YouTube wouldn't get the extra dynamic range I'm recording and editing in...

LEGEND, Oct 23, 2025

Re-reading your post: yes, with a LUT that converts to Rec.709, you are no longer in HDR space.

 

That is an ... intriguing ... workflow. LUTs are fascinating and at times useful, but very limited tools.

 

When colorists can use algorithm-based conversions they tend to do so, as the algorithms are genuinely sophisticated mathematical computations with conditional (if-then) steps built in.

 

LUTs are rather simple data look-up tables: an RGB triplet FGH gets changed to PWR (those being stand-ins for numerical values in my example), with typically 17, 33, or 65 sample points per channel. The in-between values are usually a straight-line (linear) interpolation between the nearest table entries.

 

No conditional logic whatsoever is involved in applying a LUT. So if the file you are using wasn't produced under the same circumstances as the file the LUT was designed to work with, it can and will crush blacks, clip whites, or do odd things to saturation. Careful application of specific LUTs to specific field-produced media is needed.
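To illustrate the point about table lookup plus straight-line interpolation, here is a minimal 1D LUT sketch with hypothetical table values (real grading LUTs are usually 3D cubes, but the principle is the same). Note how any input outside the range the table was designed for simply hard-clips, which is exactly the crushed-blacks/clipped-whites behavior described above:

```python
import numpy as np

# Hypothetical 1D LUT: 5 sample points mapping input [0, 1] to output values.
lut = np.array([0.0, 0.1, 0.4, 0.8, 1.0])

def apply_lut(x, lut):
    """Look x up in the table. Values between table entries are linearly
    interpolated; values outside [0, 1] are hard-clipped first, which is
    where crushed blacks and clipped whites come from."""
    positions = np.linspace(0.0, 1.0, len(lut))
    return np.interp(np.clip(x, 0.0, 1.0), positions, lut)

# -0.2 and 1.3 are out of range: they clip to the table's end values.
pixels = np.array([-0.2, 0.25, 0.5, 1.3])
print(apply_lut(pixels, lut))
```

There is no decision-making anywhere in that function: every input goes through the same table regardless of what footage it came from.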

New Here, Oct 23, 2025

Thank you for the detailed reply. I agree that my workflow is unique. It's a bit of a hack to avoid learning Resolve. I worked with a colorist on my own footage to develop a set of four adjustment layers (three LUTs plus a Rec.709 conversion) so I can try to balance the creative looks and keep footage looking consistent even when I don't have great light. Each adjustment layer acts kind of like a node.
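The stacked-layers-as-nodes idea is essentially function composition: the output of each LUT becomes the input of the next. A toy sketch with hypothetical table values (not the actual LUTs in this workflow):

```python
import numpy as np

def apply_chain(pixels, luts):
    """Apply a stack of 1D LUTs in order, like stacked adjustment layers:
    each layer's output feeds the next layer's input."""
    out = np.asarray(pixels, dtype=float)
    for lut in luts:
        positions = np.linspace(0.0, 1.0, len(lut))
        out = np.interp(np.clip(out, 0.0, 1.0), positions, lut)
    return out

# Hypothetical stack: a lift/contrast "look", then a Rec.709-style rolloff.
look = np.array([0.05, 0.3, 0.6, 0.95])
to_709 = np.array([0.0, 0.45, 0.75, 1.0])
print(apply_chain([0.0, 0.5, 1.0], [look, to_709]))
```

This also shows why the interactions get complex: the second table only ever sees values the first table has already reshaped, so swapping or retuning one layer changes what every later layer receives.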

 

Digging deeper, I am definitely clipping my whites, but on review, that is consistent with the soft afternoon look I'm going for.

I am curious: what is the workflow or setting where you would define the brightness in nits of a particular highlight? Or do you export within a certain color space and let the playback device interpret that for its own display? For example, I know that many high-end TVs can hit 1000 nits of peak brightness, which largely goes to waste since most recent movies are mastered for 300 nits max, keeping in line with the "theater experience." Is Premiere able to manage these kinds of color grading dynamics, or is it too primitive a tool for that kind of thing?

LEGEND, Oct 23, 2025

Good questions to ask. First, colorists are taught to grade to The Standard, as you can never outguess the screens your content will be seen on. By adhering to the tightest calibrated/profiled grading setup you can, you do your best due diligence to ensure your media looks within normal professional media appearances on every screen out there.

 

And that's all you can do. No one will ever ... under any circumstances ... see exactly what you see while doing your color correction. That can never be the point, as it is physically impossible for your image to appear exactly the same on another screen.

 

Most TVs do a full remapping of content to their physical capabilities. Of course, they probably also throw in some crud in their attempt to "enhance the viewing experience." Some parts of image management they do quite well, though. For instance, upscaling is normally far less noticeable when done by a TV displaying 720p (1280-pixel-wide) content on a UHD screen than by typical upscaling within an NLE from 1280 to 3840 pixels.

 

And the better screens at this time, for handling HDR, do a pretty good job of managing dynamic range within the screen's capabilities. But then, the majority of screens still don't actually do HDR, though that will change over the next couple years. And there is simply no way to out-guess what "most" screens will do with your content.

 

Realistically, most TVs can't produce much above 600 nits, if they can actually hit even that. And they tend not to be able to display full blacks while being tasked with much high-brightness pixel data. So, often, we can't have our cake and eat it too, to borrow an old saying.
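That remapping of content brightness down to a panel's real capability is what display engineers call tone mapping. Here is a toy soft-knee roll-off sketch ... not any particular TV's algorithm, just an illustration of the idea that content above the panel's comfort zone gets compressed into the remaining headroom rather than hard-clipped:

```python
def tone_map(nits_in: float, display_peak: float = 600.0, knee: float = 0.75) -> float:
    """Toy highlight roll-off: pass brightness through unchanged below the
    knee, then smoothly compress everything above it so output never
    exceeds the display's peak. Illustration only, not a real TV's curve."""
    knee_nits = knee * display_peak           # start rolling off at 450 nits
    if nits_in <= knee_nits:
        return nits_in
    headroom = display_peak - knee_nits       # 150 nits of output headroom
    excess = nits_in - knee_nits
    # Asymptotically approach display_peak as input brightness grows.
    return knee_nits + headroom * excess / (excess + headroom)

for level in (100, 450, 1000, 4000):
    print(f"{level:5d} nits mastered -> {tone_map(level):6.1f} nits displayed")
```

A 1,000-nit mastered spike lands somewhere in the 500s on this hypothetical 600-nit panel, while everything at normal picture levels passes through untouched.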

 

A large part of the aisle-way discussions "in the background" among colorists is what the proper top-end data level actually is at this time. And of course, how capable a high-DR screen you really need in order to grade properly. If the practical upper limit on peak brightness for the vast majority of high-end consumer devices is 600 nits, and your clients are specifying only getting to that, why do you need a 2,000 to 4,000 nit screen for grading? Or even 1,000 nits? Great question, that!

 

The last conversations I've been through were about how some Netflix-type productions are still aiming for peak brightness up to 1,000 nits ... but only for absolute image spikes on some tent-pole shows ... and in general keeping most image content below 600 nits. Some of the folks doing broadcast, which is finally getting into HDR, say they have to keep everything under 600 nits at this time.

 

Corporate web specs tend to be 600 or less.

 

And again, the data between 200 and 600 nits is, for the most part, simply brightly colored specks of light. A general comment about HDR is that it isn't the highlights that are so wonderful ... it's expanding the shadows from 0-20 nits in SDR to 0-50 nits in HDR, as that allows much finer gradations of shadow tones than are possible in SDR.
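You can see why the shadows benefit so much by looking at the PQ (SMPTE ST 2084) inverse EOTF, the curve HDR10/Dolby Vision signals are encoded with. A short sketch using the published spec constants shows how much of the signal range sits below the nit levels being discussed here ... roughly 44% of the PQ range is devoted to 0-50 nits, and graphics white at 203 nits lands near 58%:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> signal 0..1.
# Constants as published in the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Encode an absolute luminance (0..10000 nits) to a PQ signal value."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for level in (20, 50, 100, 203, 1000):
    print(f"{level:5d} nits -> PQ signal {pq_encode(level):.3f}")
```

The curve spends its code values where the eye is most sensitive, which is exactly the finer shadow gradation described above.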

 

Multiple LUTs used essentially as nodes ... that is a most intriguing setup! And while I know one could build a process to do that, as it has been done of course, I'm glad a colorist built it for you. That process requires much testing and deconstruction of the effects not only of each LUT, but when used in specific processes against another LUT. The interactions would be amazingly complex. Even with a decade working both in Premiere and Resolve, and as much time as I've had listening to, and working with, major colorists, I wouldn't even attempt to build that workflow. (And yes, I'd rather some of my colorist acquaintances built that for me over others ... )

 

Color management and processes are incredibly complex, and as frustrating as they are complex. One thing to keep in mind is this: no camera or monitor ever captures or sees color. They only capture and display brightnesses. Period. And that applies to both film capture and digital.

 

Complex processes in capture, encoding, internal mathematical calculations and processes, and similar processes at the end, the display side, create brightnesses our eye/brain visual system interprets as color. Hues.

 

But every human has different color/hue/brightness sensitivities and capabilities. One person can look at a screen and say this wall is tinting toward magenta, and the equally trained person next to them says no, it's toward green. Both are right for their own brain's functioning, but when a colorist has them as two clients sitting side by side in-suite during a grading session, this ... is a massive pain point.

 

 

LEGEND, Oct 23, 2025

No. And look at the option, what's it called?

 

"Graphics White" ... right?

 

The choice of 203 versus 300 has no effect whatsoever on the video ... only on graphics items like text. And the standard for text in nearly all broadcast/streaming HDR work is right at 200-205 nits.

 

Why? Because that is where the upper limit of "detailed video" sits. Above that, in the expectation of those making the standards, you have mostly speculars ... bright bits that may have color to them, which SDR cannot show at the higher IRE levels ... but do not have detail in them.

 

Ergo the oft-given example: think of a sheet of white typing paper in sunlight. That's your "graphics white."

 

Realistically, even grading in Dolby Vision in Resolve or Baselight, nearly all colorists keep that detail limit at or slightly below 200 nits.

 

What's above? Oh, sun-speckled things reflecting the sun. Car taillights and headlights. Reflections off water, glass, or metal. Fire. Candle flames, though some of the flame will probably be under 205 nits. Even skies at times.

 

And note, I work for/with/teach pro colorists. I'm around discussions on grading this by the top teaching colorists on the planet. Graphics white is a very misunderstood part of HDR, and only part of the reason HDR is still pretty much the Wild Wild West of pro video. Most screens still don't do actual HDR; those that supposedly do have limits as to which HDR media they work with; nearly all of those do really dastardly things on-screen "to enhance the viewing experience," which ticks off colorists; and different apps do wildly different things anyway.

 

Which is frustrating because when it does get both created and displayed correctly, it's really, really nice.
