Copied from a post by Francis Crossman, the product manager for Premiere Pro. Great advice, including how to switch off HDR capture on your phone so you never have the issue until you are ready to embrace HDR.
Francis wrote:
"One thing that is tripping up so many people here is that the iPhone is shooting in HDR by default, and it has an HDR screen, so videos look phenomenal on it. High Dynamic Range video contains more light and color level than Standard Dynamic Range video (Rec709). Unless you have an HDR display on your computer (and have everything is set up properly), you will never see it the same way as on the phone. The vast majority of people have an SDR display. When you send the video to your computer, QuickTime player will do tonemapping while sending it to your SDR display so it looks decent. Premiere Pro does not have this capability yet.
Here's what's happening. PPro reads the metadata in the file, sees that it's HDR (HLG to be specific) and treats it that way. If you create a sequence from the file the sequence will be set up as HLG automatically. But your monitor is physically not capable of displaying the light levels in the file so that's why things look blown out. If you look at the scopes, you will see that nothing is actually lost. You could use Lumetri to grade the file down to SDR levels.
Here are a few options that I recommend. Choose the one that works for you:
Don't capture in HDR on your iPhone.
OR . . .
Override the colorspace of the files to Rec709
OR . . .
Actually work in HDR and create an HDR video
Hope this helps. HDR is legitimately confusing!"
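To make the "nothing is actually lost" point concrete, here's a toy sketch (my own numbers, not Apple's or Adobe's actual tonemapping math) of how out-of-range HDR levels get handled:

```python
# Hypothetical linear-light values: HLG footage can carry levels well above
# SDR's 1.0 "diffuse white".
hdr_values = [0.8, 1.0, 2.5, 6.0]

def simple_tonemap(x):
    """Reinhard-style rolloff: compresses any positive value into 0..1."""
    return x / (1.0 + x)

for v in hdr_values:
    clipped = min(v, 1.0)        # roughly what an SDR display does: blow out anything above 1.0
    mapped = simple_tonemap(v)   # what a tonemap or a Lumetri grade-down does instead
    print(f"{v:>4}: clipped to {clipped:.2f}, tonemapped to {mapped:.2f}")
```

The highlight detail is still in the file; it just sits above what an SDR monitor can show until a tonemap or grade compresses it back into range.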
Thanks for your comment. Whether or not the video is out of date, the solution worked for me, so I shared it. All the tech-talk in the world doesn't solve the problem. This work-around does.
@R Neil Haugen, I appreciate how active and generous you are in sharing your knowledge with us. Like jelliott2k, I find that the fix in the https://youtu.be/cIgnyx5KXQM?si=DfgGUhRZL5FfB9B9&t=65 video by Bryan Adam Castillo works for me, but you say it's outdated. Let me explain my situation and how I arrived at the solution.
I, too, shot video on an iPhone, not realizing it recorded in HDR. After importing to a Windows 11 PC, I initially just reduced the exposure in Premiere (now on 25.1.0 build 73) and thought it was fine. But then, after downloading some of the same footage from iCloud, the colors were much different. I discovered that a standard iCloud download converts footage to SDR, adjusts the colors to look good, and reduces the resolution to 720p. Unfortunately, there is no option to keep the original resolution and still have it converted to SDR.
That sent me down a rabbit hole of trying to get HDR footage to match the colors of SDR footage that iCloud produced, and then to threads like this one and the aforementioned video. After setting the color space to Rec 709, I was able to come very close to matching the iCloud colors (which I like) and then saved the settings as a LUT.
I then compared my colors to those of the LUT used in Castillo's video. Neither perfectly matched the colors iCloud produced. Mine has more contrast and more closely matches shadows and some colors, but overall his more closely matches the color grading from iCloud.
By the way, the steps in the video are slightly different in the latest versions of Premiere. After downloading Bryan Adam Castillo's LUT from the video, the steps are below (a rough sketch of what the input LUT step is doing appears after the list):
1. Import iPhone HDR footage into your project.
2. In the Project panel, highlight all the footage, right click and select Modify > Color…
3. Click the Input LUT dropdown and select Add LUT…
4. Navigate to the file HDR CONVERSION LUT.cube and double click it.
5. Click the Override Media Color Space radio button.
6. Click the dropdown to its right and select Rec 2020.
7. Click OK.
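For anyone curious, here's a rough sketch of what that input LUT step is doing under the hood. The nearest-neighbour lookup is a simplification on my part; Premiere and other apps interpolate between the grid points rather than snapping to the nearest one:

```python
# Rough sketch of applying a 3D .cube LUT to a single RGB pixel.

def load_cube(path):
    """Read LUT_3D_SIZE and the RGB table from a .cube file."""
    size, table = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("#", "TITLE", "DOMAIN")):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
                continue
            if line[0].isdigit() or line[0] in "+-.":
                table.append(tuple(float(v) for v in line.split()))
    return size, table

def apply_lut(rgb, size, table):
    """Map one normalized RGB triplet (0..1) through the LUT (nearest grid point)."""
    # Values outside 0..1 are clamped first -- the clipping behaviour
    # discussed further down the thread.
    idx = [round(min(max(c, 0.0), 1.0) * (size - 1)) for c in rgb]
    r, g, b = idx
    # In .cube data the red index varies fastest, then green, then blue.
    return table[r + size * g + size * size * b]

size, table = load_cube("HDR CONVERSION LUT.cube")  # the file from the video
print(apply_lut((0.5, 0.5, 0.5), size, table))
```

In other words, the LUT is just a fixed table of input-to-output colors that gets applied before the footage is interpreted in the color space you choose.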
OK, having said all that, I'm curious: what is outdated about that? Isn't it fine to use a LUT like his, since you have to do color grading after switching to Rec 709 or 2020 anyway?
The QT workaround did in a minute what I've been trying to deal with for an hour. Thank you.
What "fix" were you going for?
Trying to get iPhone HLG media to seem normal, or ... to get the same look outside of Premiere on a Mac Retina (that doesn't have the proper reference video mode) as inside of Premiere?
The QT LUT, by the way, is the worst option for dealing with the latter issue if you're on Pr 24.x. It's better to simply use the color management controls inside Premiere to adjust your Program monitor gamma to QT 1.96.
Then you'll see a very close image inside and outside Premiere on your Mac. Now, understand: as the root cause is a poor choice of video display gamma by Apple, this is only going to 'fix' things on your system and on other Macs without the reference modes.
However, on everything else ... from Macs running in Reference mode with the HDTV Rec.709 setting to most PCs and all broadcast setups ... you're probably gonna see that same exported file as pretty dark.
Because again, the issue is the display gamma on Macs without the reference mode option.
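A quick sketch of the gamma point in numbers (toy values only, not the actual processing in Premiere or QuickTime):

```python
# The same Rec.709 code value produces different brightness depending on the
# gamma the display uses to decode it. Footage that looks right under
# QuickTime's ~1.96 gamma reads darker on a 2.4 reference display.
code_value = 0.5                                  # a normalized mid-grey video level
for display_gamma in (1.96, 2.2, 2.4):
    luminance = code_value ** display_gamma       # relative light output, 0..1
    print(f"gamma {display_gamma}: relative luminance {luminance:.3f}")
```

The higher the decode gamma, the darker the same code values land, which is why an export that looks right under QT's ~1.96 can read dark on a 2.4 reference or broadcast setup.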
Sorry, my reply got inserted in the wrong place (it's before Scottodactyl's reply above). So just reply to this one with your answer.
LUTs are very simple charts; there's no heavy math involved in applying them. An RGB triplet of X values simply becomes an RGB triplet of Y values.
There are 16, 32, or 64 given points. In between, the app simply draws a straight line.
While they have been useful for many things, they cannot be made flexible at all. LUTs can ... and will! ... clip and/or crush data outside the specific ranges they were built around.
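A minimal sketch of that behavior, using made-up sample points for a single channel: fixed input/output pairs, straight-line interpolation between them, and hard clipping outside the range the LUT was built for.

```python
LUT_IN  = [0.00, 0.25, 0.50, 0.75, 1.00]   # hypothetical grid points
LUT_OUT = [0.00, 0.18, 0.42, 0.70, 1.00]

def apply_1d_lut(x):
    if x <= LUT_IN[0]:
        return LUT_OUT[0]                   # values below the range are crushed
    if x >= LUT_IN[-1]:
        return LUT_OUT[-1]                  # values above it are clipped
    for i in range(1, len(LUT_IN)):
        if x <= LUT_IN[i]:
            t = (x - LUT_IN[i - 1]) / (LUT_IN[i] - LUT_IN[i - 1])
            return LUT_OUT[i - 1] + t * (LUT_OUT[i] - LUT_OUT[i - 1])

print(apply_1d_lut(0.6))   # interpolated along the straight line between two points
print(apply_1d_lut(1.3))   # an out-of-range HDR value: clipped to 1.0
```

Anything outside the range the LUT was built for just slams into the end points.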
The color space transforms used by both Premiere and Resolve are complex algorithms that adapt to the media as produced in the field, without ever crushing, clipping, or over-saturating data.
Most colorists consider the algo-based transforms not only superior in results but far safer with less work.
So doing this with ultimate simplicity and accuracy is easy: auto-detect log, auto tonemapping, and the sequence set to Rec.709, for working in a typical moderately bright room using a viewing gamma of 2.2.
Done.
The algorithmic transforms are not intended to be a final look the way LUT transforms often are; they are a basic normalization from log to linear/display space. So you take them as the starting point and build your final correction just as you would with LUTs.
Thanks for your reply. I would love it if Apple provided their algorithm for converting HDR to SDR colors, but they haven't and probably won't. Short of that, it sounds like you're saying that after setting the colorspace to Rec.709 and so on, one must manually color grade each piece of footage since they're all different, because a LUT cannot be one-size-fits-all. That would explain why more than a few people in the comments on that video said it didn't work, or looked not-great to awful.
For me, I tried his LUT and mine on four different pieces of footage: one shot outdoors at night, one outside on a half-cloudy day, and two shot inside in two very different locations. They all looked pretty good and super close to what iCloud (presumably using an algo) produced. I didn't have to do any adjustments after that. No doubt I'll find some footage that isn't helped by the LUT, but for now it has served me well and gives me a good starting point, if not a final one.
Having said all that, if you have some automated way to get the same colors as a standard iCloud export of HDR footage that converts it to SDR colors, I'm all ears!
I've been told those algorithms are complex mathematical code based on the application's reading of the media and hardware. They aren't as "portable" as a LUT.
I've not used any transform LUT that also involves a "look" and that worked all that well as the "final" look of the image. In other words, one step to do the transform, the correction for evenness of appearance across the sequence, and the final Look.
There are three processes there:
1. The color space transform (normalization).
2. Correction for evenness of appearance across the sequence.
3. The final Look.
The algos in Premiere and Resolve do a pretty good job of 1. above, when compared to transform LUTs.
With Premiere Pro 2024, I've had success selecting an HDR video from my iPhone 13 Pro to import when I first create the project. This sets up the project and sequence to match my original iPhone video format.
Then when exporting, I choose the "HEVC - Match Source - HLG" preset. This seems to give me a very good-looking HDR video, just like my source, that looks great on my iPhone.
Thanks for the help. This is always a problem for me. I need to try this with mixed video sources like iPhone, GoPro, and Android devices to make sure it works, but if it comes out looking good this will be the fix I need.
The first clip dropped onto a blank timeline to create a sequence establishes the sequence parameters for frame size, frame rate, pixel dimensions, and color space.
Same with a selected clip in a bin: create a sequence from the selection.
Either way, you can go into the Lumetri panel's Settings tab and set the color space parameters *you* want.
It's up to the user to learn and utilize the options. I'm happy to help, as yes, it's complex, and each situation may have different needs.
It will get dramatically more complex in the new Pr series due out in October: working color spaces versus display color space versus output, with many options for all of them.