I'm specifically referring to frame blending in the Time Interpolation settings. That is, does Premiere calculate actual luminance based on the color space and then blend frames together, or does it just do a straight blend of the original encoded values?
In particular, I'm interested in how this works for HDR sequences and footage.
Hey morphinapg,
Sorry for the trouble. Could you share screenshots so I can understand the issue better? Alternatively, will adding the Luma Key and Alpha Glow (or Directional Blur) effect on a duplicated clip over the original help you get a similar effect? Please let me know.
Thanks,
Ishan
Frame blending in Premiere Pro when using Time Interpolation > Frame Blending appears to be influenced by the Sequence Color Settings found under Lumetri Color > Settings.
That said, @R Neil Haugen may have a better grasp of the intricacies of how HDR footage is blended with these settings.
I've tried to get a definitive answer to this exact question, but don't have one. So I'm very interested in getting one in this thread!
@Ishan Y ... I'll ping you as you're already in this thread. The question is one that needs a specific answer about processing order from the color team. Is frame blending performed early on, with original pixel data, or in the 'intermediate' transformed space if using a Wide Gamut working space, or in the final Sequence color space?
It could be any of those three options, and there are significant working differences between them in higher-end workflows, where mistakes and surprises are incredibly costly, in both money and careers.
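To make those three candidate orders concrete, here's a rough sketch in Python. None of this reflects confirmed Premiere Pro internals (that's exactly the open question); decode_source, to_working_space, to_sequence_space, and blend are all placeholder names:

```python
# Hypothetical sketch of the three possible processing orders described above.
# All transform functions are placeholders passed in by the caller.

def blend(a, b, t):
    """Straight per-channel mix of whatever values it's handed."""
    return a * (1.0 - t) + b * t

def option_a(f1, f2, t, decode_source, to_working_space, to_sequence_space):
    # (a) Blend early, on the original source-encoded pixel data.
    mixed = blend(f1, f2, t)
    return to_sequence_space(to_working_space(decode_source(mixed)))

def option_b(f1, f2, t, decode_source, to_working_space, to_sequence_space):
    # (b) Blend in the intermediate wide-gamut working space.
    w1 = to_working_space(decode_source(f1))
    w2 = to_working_space(decode_source(f2))
    return to_sequence_space(blend(w1, w2, t))

def option_c(f1, f2, t, decode_source, to_working_space, to_sequence_space):
    # (c) Blend in the final sequence color space.
    s1 = to_sequence_space(to_working_space(decode_source(f1)))
    s2 = to_sequence_space(to_working_space(decode_source(f2)))
    return blend(s1, s2, t)
```

Unless every transform in the chain is linear, these three orders produce different pixels from the same inputs, which is why the exact processing order matters.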
Also, even if the clip and output color spaces match, there's the question of whether blending is done on the raw pixel values or on values transformed into linear light intensities. For example, in HDR, if you have a 10,000-nit pixel and blend it halfway with a black pixel, the result should be 5,000 nits. But if you blend in PQ space, that pixel comes out at about 92 nits instead, which is obviously a pretty major difference. When I drop opacity to 50%, the result looks like this transformation is happening in PQ space rather than in linear nits, so my guess is that frame blending is probably doing the same thing, but I don't know for sure.
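For anyone who wants to verify that 92-nit figure, here's a quick Python sketch using the standard SMPTE ST 2084 (PQ) constants. The PQ math itself is from the spec; whether Premiere actually blends encoded code values this way is the unconfirmed part:

```python
# Compare a 50% blend of a 10,000-nit pixel with black, done two ways:
# in linear light vs. on PQ-encoded code values (SMPTE ST 2084).

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Linear luminance (0-10,000 nits) -> PQ code value (0-1)."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(e):
    """PQ code value (0-1) -> linear luminance in nits."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

bright, black = 10000.0, 0.0

# Blend in linear light: average the nit values directly.
linear_blend = 0.5 * bright + 0.5 * black

# Blend in PQ space: average the code values, then decode.
pq_blend = pq_decode(0.5 * pq_encode(bright) + 0.5 * pq_encode(black))

print(f"linear-light blend: {linear_blend:.0f} nits")  # 5000 nits
print(f"PQ-space blend:     {pq_blend:.1f} nits")      # ~92.2 nits
```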
I have a project where I'm blending many frames together from a very high frame rate source down to a lower frame rate output, essentially to create a virtual motion blur. If the intensities are not blended linearly, that motion blur won't look quite right, and I may need to come up with another solution. But frame blending in Premiere is very fast on the GPU, so I like it as an option.
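In case Premiere's blend does turn out to happen in PQ space, here's a sketch of doing that frame averaging manually in linear light with NumPy. Frame I/O is omitted, and `frames` is assumed to be a list of float arrays of PQ code values in [0, 1]:

```python
import numpy as np

# Synthesize motion blur by averaging N high-frame-rate frames in linear
# nits, then re-encoding to PQ (SMPTE ST 2084 constants below).

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_linear(e):
    p = np.power(e, 1 / m2)
    return 10000.0 * np.power(np.maximum(p - c1, 0.0) / (c2 - c3 * p), 1 / m1)

def linear_to_pq(nits):
    y = np.power(nits / 10000.0, m1)
    return np.power((c1 + c2 * y) / (1 + c3 * y), m2)

def motion_blur(frames):
    """Average PQ-encoded frames in linear nits; return a PQ-encoded frame."""
    linear_sum = sum(pq_to_linear(f) for f in frames)
    return linear_to_pq(linear_sum / len(frames))

# e.g. collapse 8 frames of 240 fps source into one 30 fps output frame:
# blurred = motion_blur(frames_240fps[0:8])
```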