
Poor man's workflow for preserving HLG for online platforms that support it

Explorer, Dec 01, 2021

Let's say I'm a video enthusiast who wants to use Vimeo as my primary online distribution channel, AND I have a shiny new iPhone 13 Pro that shoots Rec.2100 HLG for lots of things. And I want to edit this content in Premiere and preserve the HLG in my AME-rendered output. Oh, and of course I will want to mix this HDR content with SDR content in the same project, and still have the HDR content look its best on Vimeo.

 

But let's also assume (correctly), that I haven't been discovered yet, so can't afford the hardware required to properly view and grade in HDR. What are the recommended best practices to accomplish this goal?

 

Vimeo supports HDR if you follow their rules (https://vimeo.zendesk.com/hc/en-us/articles/115015382768-Uploading-HDR-and-Dolby-Vision-videos). So if you upload properly constructed HDR content, a viewer with an HDR-capable device (e.g. an iPhone 12 or 13) will see my work in all its intended glory, and other viewers will see an SDR version, which Vimeo automatically transcodes from the uploaded original.

 

So, if I understand Premiere's new support for HDR, I think the workflow I need to get at least most of the way there goes like this:

1) Make sure my source footage is tagged properly with the appropriate color space. iPhone HLG footage is correctly tagged on import, as documented.

2) Use Rec.2100 HLG as my sequence's working color space. (To avoid HLG conversion to Rec.709.)

3) Make sure Display Color Management is enabled in prefs.

4) Make sure my project's "HDR Graphics White" setting is 203.

5) Use Lumetri corrections on the HLG content very sparingly, only for minor color changes, such as temperature or tint adjustments to match other SDR footage in the project.

6) Export for rendering to AME, specifying HEVC(h.265), since it's an HDR format supported by Vimeo, and it's good enough quality for my purposes. Also choose the "HEVC - Match Source - HLG" preset, and appropriate encoding settings to achieve Rec.2100 HLG as the export color space.
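Since step 1 (and much of what follows) hinges on the clips being tagged correctly, it can help to verify the tags outside of Premiere too. Below is a rough, hypothetical sketch (not any real library's API) that scans a QuickTime/MP4 file's raw bytes for an 'nclx'-type 'colr' box; per the H.273 coding-independent code points, Rec.2100 HLG should report primaries=9 (BT.2020), transfer=18 (ARIB STD-B67, i.e. HLG), and matrix=9. A naive byte scan like this can false-positive on arbitrary data, so treat it as a sanity check only.

```python
import struct

def find_colr_nclx(data: bytes):
    """Scan raw MP4/MOV bytes for a 'colr' box of type 'nclx' and
    return its color tags, or None if no such box is found."""
    i = data.find(b"colr")
    while i != -1:
        # An nclx colr box: 'colr' fourcc, 'nclx' type, then three
        # big-endian uint16s (primaries, transfer, matrix).
        if data[i + 4:i + 8] == b"nclx" and len(data) >= i + 14:
            prim, trc, mtx = struct.unpack(">HHH", data[i + 8:i + 14])
            return {"primaries": prim, "transfer": trc, "matrix": mtx}
        i = data.find(b"colr", i + 1)
    return None

# Synthetic example: the colr payload an HLG iPhone clip would carry.
sample = b"\x00\x00\x00\x13colrnclx\x00\x09\x00\x12\x00\x09\x80"
print(find_colr_nclx(sample))  # {'primaries': 9, 'transfer': 18, 'matrix': 9}
```

A tool like ffprobe reports the same information more robustly (as `color_primaries`, `color_transfer`, and `color_space`), which is the easier route in practice.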

 

What I've proven so far is that the simplest possible version of this workflow is successful, for a sequence with one iPhone HLG clip, and no Lumetri changes. The clip of course looks too bright in Premiere, but after rendering with AME the following is true:

1) It appears identical to the original clip when imported and viewed in Premiere.

2) Its metadata indicates it's properly HLG.

3) It displays correctly on Vimeo after uploading, compared to the original content on my iPhone.

4) It also displays correctly back on my Mac after downloading from Vimeo, viewed side-by-side with the original.

 

So before I go further down this rabbit hole, it would be nice if someone could confirm that what I've described is the expected workflow to achieve my goals. If so, great, and of course, I realize it's impractical as soon as I would want to do any significant color editing/grading of my HLG content. But I'm thinking it would at least let me work with such content that doesn't need much color editing, and preserve it for platforms like Vimeo that are designed to support HDR.

TOPICS: Editing, Export, How to, Import

1 Correct answer — quoted in full in the Explorer's Dec 06, 2021 reply below.
Community Expert, Dec 02, 2021

Might want to read this pdf.

 

Explorer, Dec 02, 2021

Thanx much for this doc Ann. I had seen the 2020 version, via Larry Jordan's site, but it's good to see this updated version too. For future reference, is this a white paper that Adobe has distributed apart from the PP user documentation? If there's someplace I can bookmark for future white papers, I'd love to do that. The closest user doc I can find seems to be the "HDR for Broadcasters" page.

 

I'll comb through this paper in detail to make sure I'm following its recommendations, and in particular, it answers some questions I also had about the recommended AE HDR workflow, so that's great too.

 

The paper seems to confirm much of what I found in my experimentation, but a key point I'm still confused about is the current support for iPhone HEVC HDR footage, which is Rec.2100 HLG. This is the key para:

 

"Premiere Pro allows you to work natively in Rec. 2100 HLG and Rec. 2100 PQ thanks to a new sequence working color space option. Apple ProRes and Sony XAVC Intra are both fully color managed and GPU accelerated throughout the HDR pipeline. With the new native HDR workflow you can import, edit, color grade, and export HDR content in Premiere Pro. The first implementation of this workflow addresses the needs of professional broadcasters and in upcoming releases, we plan to add support for other HDR working spaces and additional format support, like H.264, and HEVC."

 

This implies to me that Apple's HEVC/Rec.2100 HLG content might not work at all, but that's not what I found. As I mentioned, I was able to use it in my simple (i.e. no significant color editing) workflow, and the AME-exported result, uploaded to Vimeo, appeared the same on Vimeo as the original footage resident on the same iPhone.

 

So I wonder if this para really means that for now only ProRes and XAVC Intra are GPU accelerated, and that such acceleration will be added for other codecs, like HEVC, eventually. That would jibe with my observation that my AME export settings to achieve HEVC HLG required software encoding (which is of course slower).

 

This might ultimately all be academic for many users, of course, since any quality HDR color editing/grading still requires such expensive additional hardware. But what about the use case where you want to use iPhone footage mostly "as is" color-wise, and just preserve its HDR goodness in a project that includes SDR footage too? In that case, the working space could be HLG, SDR footage would automatically work too (per the paper), and the HLG output would be optimized for those environments that could take advantage of it.

 

Separately, the paper's discussion of input LUTs was interesting too, and also aligned with what I suspected would be very useful. An example is SDR sequences that want to include HDR footage too. I'm kind of surprised I haven't found anyone marketing these yet (beyond the BBC ones mentioned for broadcasters). If anyone is aware of these, I'd be interested in learning more about them. Also, I happen to be a customer of previous LUTs from Paul Leeming, and he recently told me he's working on this very problem - iPhone-specific LUTs for color space conversion - but he doesn't have an ETA yet. So maybe it's just a hard problem, and/or 3rd parties need time to catch up to Adobe and Apple advances...

 

So I'm also wondering what Adobe's position is on this workflow of just supporting iPhone HLG footage for SDR destinations. Without input LUTs as a good starting point, it seems like there's a fair amount of Lumetri editing required to make HLG look good in 709. I can't be the only one wanting to use iPhone HLG footage in PP, and it's also possible I'm still missing something basic that helps optimize this process.

 

Thanx again for the paper reference, and any additional comments this may spark.

Community Expert, Dec 03, 2021

The doc will soon be updated for 2022.

LEGEND, Dec 02, 2021

While much of the information in that pdf is still valid, there are a few things already outdated.

 

Such as the entire export section. They now have presets for HDR in HLG and PQ that are good starting places for work, and I would strongly suggest working with those as the basis for creating your own HDR export presets.

 

Yea, things are changing that fast.

 

As to working with HLG clips from your phone, that is easy to do accurately within Pr2022.

 

First, make sure the clips are HLG ... check the clip properties.

 

I have found that for some reason, even when using an HLG clip that is fully tagged and recognized by Pr2022 as HLG, if I use the "new sequence from clip" it will still make an SDR/Rec.709 sequence!

 

In that case, the HLG clip looks terribly blown out, and the Vectorscope may be blown to bits (metaphorically speaking ... ).

 

Simply go to the Sequence settings, and change the Sequence color space to HLG. I've found I need to manually shift the scopes to the correct color space also, and set the lower-right option for scope scale to HDR with HDR clips.

 

But I have quite successfully worked with HLG clips through export and re-import. It's pretty straightforward once you've done it.

 

Neil

Explorer, Dec 03, 2021

Thanx Neil, for your thoughtful reply. Based on everything you said, maybe I'm doing everything right then, and my expectations just need to be reset regarding how much Lumetri work is required. Perhaps you can comment on that too, based on your experience?

 

I had found the AME presets for HLG export - a variant of one of these seems to work properly for export.

 

And I think I've got my source and working space correct, and my Lumetri scopes seem to behave properly. And so far, all my iPhone HLG footage is being automatically and correctly recognized by PP.

 

So am I correct that with all these prerequisites, I should expect to see my footage overexposed initially? If I simply use Lumetri to reduce the exposure, I can quickly make the footage look close to matching what my iPhone displays for the same content (modulo the expected difference in brightness and dynamic range, since it's an actual HDR device). But then I find I need to spend quite a bit of additional time in Lumetri to get the footage to come close to matching what both QuickTime Player and the TV app display for the same content. (Yes, I know about the historic gamma issue for QT, and possibly for TV, but I claim their display should be a reasonable reference point for simply trying to use Lumetri to get close.)

 

And if you follow all this logic, this is why I was thinking the workflow might be ripe for an input LUT to help automate this process when you only want SDR output. I'm imagining an iPhone-specific HLG-SDR input LUT that would result in less additional Lumetri fiddling. And when I mentioned this to Paul Leeming, he enthusiastically agreed.
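For what it's worth, the input-LUT idea can be sketched in a few lines. The following is a toy, assumption-laden example: the inverse OETF constants are the HLG ones from BT.2100, but the per-channel approach, the gamma-2.4 encode, and the exposure gain of 3.0 are simplifications chosen purely for illustration. A production-grade conversion LUT (like the BBC ones) is a 3D LUT that also remaps the BT.2020 gamut to BT.709.

```python
import math

# HLG inverse OETF constants per BT.2100.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_inverse_oetf(v: float) -> float:
    """HLG signal (0..1) -> normalized linear scene light (0..1)."""
    if v <= 0.5:
        return (v * v) / 3.0
    return (math.exp((v - C) / A) + B) / 12.0

def hlg_to_sdr(v: float, exposure: float = 3.0) -> float:
    """Scene light, scaled by a made-up exposure gain you'd tune by
    eye, then gamma-2.4 encoded for an SDR display (clipped at 1.0)."""
    linear = min(1.0, hlg_inverse_oetf(v) * exposure)
    return linear ** (1 / 2.4)

def cube_text(size: int = 33) -> str:
    """Build the text of a 1D .cube LUT applying hlg_to_sdr per channel."""
    lines = [f"LUT_1D_SIZE {size}"]
    for i in range(size):
        y = hlg_to_sdr(i / (size - 1))
        lines.append(f"{y:.6f} {y:.6f} {y:.6f}")
    return "\n".join(lines)

print(cube_text().splitlines()[0])  # LUT_1D_SIZE 33
```

Saved as a .cube file, something like this could be applied as an input LUT in Lumetri, though per-channel math alone will shift saturated colors, which is exactly why the real problem is harder than it looks.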

 

Thanx for any additional wisdom, and keep up your good advice work. I really appreciate it!

LEGEND, Dec 03, 2021

Good questions.

 

First, whether you're working an HLG timeline or a Rec.709 timeline, all clips must have correct CM settings in the bin. Period.

 

So an HLG clip on a Rec.709 sequence must be 'overridden' to Rec.709 with the clip properties in the bin.

 

Which is a ROYAL pain at the moment, as ... what if you want both a Rec.709 and HLG export?

 

Well ... say you start with the clip overridden to Rec.709 for a Rec.709 sequence. Get everything looking good.

 

Now ... you have to create a DIFFERENT sequence for the HLG, and while working there, all clips need their CM set for HLG.

 

To export ... you have to make sure all the Rec.709 sequence clips are set to Rec.709, export.

 

Then ... reset everything to HLG, export the HLG sequence.

 

Supposedly you shouldn't need to do this, but from what I'm seeing "here" and testing myself, it is NOT handling CM correctly unless the clip and sequence and export all match perfectly.

 

You cannot currently take an HLG sequence and export to Rec.709 with any confidence.

 

So ... there are some problems to constantly test.

 

Bringing in any clip ... whether it's been modded in CM to a different space, or is simply, say, a Rec.709 clip on a Rec.709 sequence ... is going to need some adjustment. That's a given. There isn't any perfect camera-through-post exposure/contrast/saturation option out there.

 

Neil

Explorer, Dec 03, 2021

"So an HLG clip on a Rec.709 sequence must be 'overridden' to Rec.709 with the clip properties in the bin."

 

Really? Well if that's true, maybe I don't feel so stupid after all. 😉

 

But are you sure? And if so, I claim that's just a bug, and a significant one. Do you know if it's being tracked as such?

 

And it would seem to directly contradict this, from the Adobe white paper:

 

"Premiere Pro automatically applies color space conversions on color managed media, when necessary, ensuring that colors are mapped from the source color space to the sequence working color space."

 

This description is exactly what I logically expect, and is consistent with what I recall other Adobe apps doing for a long time. (I know the video apps had a legacy of only worrying about RGB, so when they introduced CM support it ended up being necessarily different vs. what Photoshop, InDesign and Illustrator had to contend with - namely, color spaces other than RGB, like CMYK.)

 

But if you're right, and the paper is wrong, then how could the paper ever have been published? Conversely if the paper's right, then was this bug introduced recently, perhaps after an initial HDR implementation that worked like the paper describes? That would seem like a show stopper for touting the product as having HDR support.

 

Wait - or is the paper a vision of the ideal, and it just hasn't been fully implemented yet? That would make more sense, especially if Adobe has pushed this as a draft, but if so, the paper seems awfully detailed for a draft.


So don't get me wrong - I'm not doubting what you're saying. I'm just not understanding it yet in the context of this white paper.

 

Separately, I realize this is complicated to try to follow by now, so let me summarize the two workflows I'd like to know the best practice workflows for:

 

1) Edit a mix of HDR and SDR footage in a single sequence, with a non-HDR machine (I have a 2019 5K Retina iMac, the last Intel version prior to the M1s), and preserve the HDR footage for platforms that support HDR, like Vimeo and YouTube.

 

2) Edit the same footage/sequence combo, on the same machine, but target SDR output, for destinations that don't support HDR.

 

Naively, it seems like both of these should be common workflows. And if so, is it really as hard to accomplish as you've described?

 

Thanx for your patience Neil!

 

Tim

LEGEND, Dec 04, 2021

That paper was written for the 15.x builds of the 2021 release ... which had major CM changes from the previous long-standing pattern.

 

But the 22.x builds of the 2022 release have radically changed setting options and behaviors.

 

And currently, it's very difficult to do many 'standard' things. Like use an HLG clip on a Rec.709 timeline - yes, a very common need. Per that doc, and within now-ancient behavior, Pr would auto-transform the clip to Rec.709 on a Rec.709 timeline.

 

Proxies made from that clip should also appear correctly.

 

But now, it will not conform an HLG clip to Rec.709, per my own tests and reports from a slew of other users. And if you simply use Lumetri to make it look fitting, when you export it may well not apply the Lumetri as you expect ... very dicey.

 

So to successfully use an HLG clip on a Rec.709 timeline, you need to conform it via the Override to Rec.709 option in the bin. Which works fine both for use on the sequence and export.

 

But then, make and use a proxy for that clip. The proxy is in the original HLG color space, and of course appears blown-out on the timeline. Without any way for the user to massage that clip back to proper CM. As we can't set any properties for proxy files.

 

And I'm struggling with the attempt to both make a Rec.709 and HLG output from the same sequence. It seems like we have to completely conform clips with the override to Rec.709 to make the Rec.709 output, but ... then manually conform all clips to HLG and redo color corrections for an HLG export.

 

Duping the sequence before making the HLG work, at least saves the color corrections done on the Rec.709 sequence. Though you need to redo the color on the HLG duped sequence of course.

 

But ... this is a nightmare, as you have to remember to constantly change conforms on clips depending on whether you're working in HLG or Rec.709 at the moment.

 

Not ideal shall we say. And I would LOVE any responses by others especially if they get different behaviors. As it may do one thing for certain gear, kit, or media files, and something else with others.

 

Neil

Explorer, Dec 04, 2021

"Not ideal shall we say" - you are a master of diplomacy. 😉

 

I'm glad it at least sounds like these problems are known behavior then. Is an appropriate bug being tracked? If not, I'll be happy to figure out how to do so and create one...

 

Thanx for the great history and details! Since we've come this far, I'll do some additional testing, try to corroborate what you've described, and update this. (Prolly early next week.) Then maybe the unfortunate others who read this far might benefit from knowing they aren't the only ones questioning their sanity. 😉

LEGEND, Dec 04, 2021

I've posted this thoroughly on their public beta forum, including tagging a couple of the devs I've met at NAB and MAX. And there are comments on the UserVoice also. Yes, they've 'heard' about it.

 

What they're doing  ... and when we'll see that rolled out ... is never shared with us "outsiders" due to their legal department's insistence that as a publicly traded company they cannot attempt to influence future perceptions or whatever that is. Right.

 

So additional comments on the public beta, people going to the UserVoice and posting, plus definitely searching and 'upvoting' posts ... are all good things to do.

 

They work from metrics at the upper decision level, up above the actual teams themselves. And those are the metrics we users can affect.

 

Neil

Explorer, Dec 06, 2021

Ok, so I've done a bit more testing, starting with the smallest possible test case. And I don't think I can repro one of your key claims - that you have to conform HLG content to 709, in a 709 sequence. Here are my details:

 

I imported one 5 sec. clip of iPhone 13 Pro HEVC (H.265) HLG footage, and verified PP recognizes it as HLG via Properties and Interpret Footage.

 

I created a sequence from this footage and set its working space to 709. This makes sense in any realistic workflow on my machine, since I don't have the requisite HDR hardware, so any color editing I do will necessarily be a compromise. And while that's expected, I hope to also prove that even with this constraint, my edited and exported HLG' version is at least reasonably close to looking the same as the original iPhone HLG content on platforms that support HDR, like Vimeo.

 

I made sure color management was enabled as a preference (and my iMac 5K Retina monitor has been recently profiled via my calibration software).

 

I used Lumetri to edit the footage, getting a reasonable-enough visual match to what QuickTime Player shows for the same content. (Yes, this isn't an apples-to-apples test, but it's reasonable enough for testing this PP workflow.) And the Lumetri RGB Parade in HDR mode seemed to make sense while editing too.

 

I then exported the sequence to AME for rendering, to an HEVC format, with a modified version of the "Match Source - HLG" preset. These mods included making sure to use Software Encoding, to enable Main 10 as the profile, and HLG as the export color space. HDR graphics white was the default of 203. Logically, the color transformation pipeline should be HLG->709->HLG.

 

I imported the result and verified PP saw it as HLG. I then added it to the original sequence, and initially it looked blown out in comparison to the original clip. After applying the same Lumetri effect to this new clip, it appeared identical to the original. And QTP's view of this result was also pretty close to matching its view of the original, so whatever it's doing to convert HLG to my display space seems to work correctly with AME's HLG result too.

 

So far, so good. I next simply duplicated the finished queue item in AME and modified it to export another version in 709 (color transformation pipeline should be just HLG->709), imported it to PP, verified PP sees it as 709, and added it to the same sequence. And it also looks visually identical to the HLG clips, which seems right to me. And this 709 result also looks correct in QTP.

 

So if you've followed this, my simple sequence now looks like this:

 

1) Original HLG clip, with Lumetri effect

2) Followed by AME-exported HLG version, with same Lumetri effect

3) Followed by AME-exported 709 version, with no Lumetri effect

 

And the Program Monitor seems to present identical results for each. The Lumetri scopes display minor differences, which I assume are likely due to compression/decompression artifacts, and/or other minor color conversion artifacts, after the round trip through AME.
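One likely contributor to those minor scope differences, even apart from HEVC compression: every encode pass quantizes the signal to 10-bit code values, so values rarely survive a round trip bit-exactly. A toy sketch of just the quantization step (pure arithmetic for illustration, not what AME literally does):

```python
def quantize_10bit(v: float) -> int:
    """Normalized signal (0..1) -> limited-range 10-bit code value
    (64..940, the conventional video legal range)."""
    return round(64 + v * (940 - 64))

def dequantize_10bit(code: int) -> float:
    """Limited-range 10-bit code value -> normalized signal (0..1)."""
    return (code - 64) / (940 - 64)

# Worst-case error over a sweep of input levels: at most half a code
# step, i.e. well under one part in a thousand of full scale.
worst = max(
    abs(v / 1000 - dequantize_10bit(quantize_10bit(v / 1000)))
    for v in range(1001)
)
print(f"worst-case round-trip error: {worst:.6f}")
```

Errors of that size are invisible to the eye but large enough to register on the Lumetri scopes, which is consistent with what the clips show after the AME round trip.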

 

Lastly, I uploaded both the original and AME-exported HLG clips to Vimeo, and viewed the results in both Safari on my Mac, and the Vimeo app on my phone.

 

Vimeo recognized both as HDR, but as expected, it recognized the original clip as "Dolby Vision" (via the logo in the upper left corner of its viewer), and the AME-exported clip as just "HDR". (I'm pretty sure this would be due to Apple's licensing of Dolby Vision and Adobe not doing so in AME.) Visually, both versions are reasonably close to what PP shows wrt color, which means, modulo whatever Vimeo and Safari do to process HDR, my original goal has been achieved.

 

I then also downloaded the AME-modified result from Vimeo to my iPhone and surprisingly, it looks visually very close to the original. (I'm not aware of a way to put them side-by-side at once, so I have to bounce back and forth between playing each individually in the Photos app, so this isn't a definitive test.)

 

And as a sanity check, I exported/rendered the sequence with all 3 clips again from AME, once with HLG and once with 709 as the working space, and the imported results in PP also looked identical, as I had hoped/expected.

 

This is all good news, and matches what I originally wanted to achieve. Yet it seems to not match your experience with using HLG in a 709 sequence, so please digest this and comment next on how what I've described may or may not be different than what you'd expect for this workflow.

 

You know, in your spare time. 😉 Thanx very much again for your patient indulgence here Neil.

LEGEND, Dec 06, 2021

I think there might be a misunderstanding of my comments ... I was saying you have to go through the Modify/Interpret process if you were trying to use an HLG clip on a Rec.709 sequence ... and export as Rec.709.

 

Because in the previous practice, many HLG clips could be simply 'dropped' on a Rec.709 timeline, exported as Rec.709 after color work, and work fine. Currently, that seems not to work for many users.

 

Your process was staying in HLG, which as I noted, does work. If you export with an HLG preset, or after manually modding the export controls to get an HLG export via settings.

 

Does that sound correct?

 

Neil

Explorer, Dec 06, 2021 (Correct answer)

Well, no, it doesn't sound completely correct, but that's no wonder, given all the steps required and the two workflows I'm trying to test. It's pretty easy to misinterpret each other's descriptions of this stuff. 😉

 

My original post here was only about best practices for how to edit HLG content and preserve it on output, to give platforms that support HDR a chance to best display it.

 

With what I know now, I believe my only workflow error then was assuming I should use HLG as my working space. From my most recent results above, it makes sense now that the working space should be 709, since I don't have the hardware required to edit reliably in HLG. That of course necessarily means my HLG output from AME might be compromised, especially as my color edits diverge further from the original, but per my results when just trying to make HLG look reasonable in 709, they're surprisingly close to the original. So something fortuitous is happening in the HLG->709->HLG' transform, at least based on the tiny sampling of clips I've tried so far.

 

Along the way, I expanded the conversation to include best practice advice for the same source content (and possibly including SDR content too), targeting SDR output. And I now believe that the only change I have to make to achieve that is the target color space in AME. This is also what my testing results seem to prove, and seems to jibe with what you described as being problematic for many users.

 

I'm happy to believe maybe I'm getting lucky with just my iPhone 13 HDR footage, and my mileage may vary with other footage sources.

 

I think we're getting closer to converging on something, but any more comments? I very much appreciate them all!

 

 

LEGEND, Dec 06, 2021

This kind of in-between period ... perhaps ... in using Premiere and gear for HDR workflows is a bit puzzling. And frustrating. They do need to finish the rest of the controls we have to assume are coming, and make them somehow ... more obvious and accessible for us users.

 

Until then ... it's this in-between, not-quite-there time.

 

But yes, it's good to share workflows and attempts and such ...

 

Neil
