I had some clips recorded at 120 FPS that I brought into Premiere. Usually I right-click them, choose Modify > Interpret Footage, and set the frame rate to 23.976. When I did this, the audio slowed down, but the video remained at 120 FPS (and played back in real time). The clip got longer on the timeline, but past the point where its last frame fell in real time, it was just a freeze frame until the end of the clip on the timeline.
SOLUTION: What I figured out was that I had made proxies of the clips at 120 FPS. For whatever reason (a bug), if you have proxies enabled and you modify a clip, the modification is not applied to the proxy video (it is applied to the audio, which is even more confusing). If you turn proxies off, you will see the clip is in fact modified and plays in slow motion (at 23.976 FPS).
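For anyone sanity-checking whether the reinterpretation actually took, the expected stretch is simple arithmetic: interpreting 120 FPS media at 23.976 makes the clip 120 / 23.976 ≈ 5× longer on the timeline. A minimal sketch of that math in plain Python (no Premiere API involved; the rates are the ones from this thread):

```python
from fractions import Fraction

# NTSC-family rates are fractional: "23.976" is really 24000/1001
source_fps = Fraction(120)           # as-shot frame rate
conform_fps = Fraction(24000, 1001)  # "23.976" conform rate

# Interpreting the footage stretches the clip by source/conform
stretch = source_fps / conform_fps
print(float(stretch))  # ≈ 5.005, so a 10 s clip becomes ~50 s on the timeline
```

If the clip on the timeline didn't grow by roughly that factor (or the video kept playing in real time, as above), the proxy rather than the reinterpreted original is what's being played.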
Just wanted to drop this here as it confounded me for most of the day;)
Premiere's proxy routine is a bit odd in places, such as the limitations it has on time-interpreted media like yours. Technically, it's not a bug; it's the way it was designed. I've had several long discussions with engineers at NAB over this. They've got internal reasons of course ... but as a user, those don't ... "fly" too well.
It doesn't do what we expect or want it to do for time-shifted media.
There have been several workarounds "published" over time. Like this ...
But it would be nice if they'd just get the proxies to play matching how the original clip is set.
Ah ... I see what you are doing there ... thank you ... I mean, obviously from a user perspective it seems like an easy fix ("just code the program to also modify the proxies when you modify the source"), but I know it's never that simple. It's amazing we've come this far, considering I was wrestling with "dropped frames" ingesting via FireWire into Pinnacle Systems editing software 20 years ago.
Hi RAD Films,
Sorry. Do not use the Interpret Footage function for speed effects; instead, speed adjust the clips, and you can create proxies as usual. It sounds like you are struggling with a different problem with your proxies. I don't know of any way to use the camera-generated proxies in a Premiere Pro workflow. I hope the community might offer some advice for you.
thanks for posting this. Hopefully it'll help someone with the same problem
This was the first issue I encountered when PP first launched proxies, and within 5 minutes I lost the tiny bit of respect that I had for Interpret Footage. It is a useful tool and it's better to have it there in case you need it (i.e., you made a mistake with your camera settings and are looking for a quick fix), but my go-to approach is to simply modify Speed/Duration on the timeline. This gives me more flexibility and much more control over things (somehow my head got used to calculating the percentage I need to slow down, rather than to speed up).
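That percentage is just the timeline rate over the source rate. A minimal sketch in plain Python (the function name and rates are illustrative, not from any Premiere API), for anyone who'd rather not redo the division in their head:

```python
def conform_speed_percent(source_fps: float, timeline_fps: float) -> float:
    """Speed/Duration percentage that plays every source frame exactly once
    on a timeline of the given rate (frame-for-frame slow motion)."""
    return timeline_fps / source_fps * 100

print(conform_speed_percent(50, 25))         # 50.0  (50p on a 25p timeline)
print(conform_speed_percent(59.94, 23.976))  # ~40.0
print(conform_speed_percent(120, 23.976))    # ~19.98
```

Typing that result into the Speed/Duration dialog gives the same frame-for-frame conform you'd otherwise get from Interpret Footage, but without the proxy mismatch described above.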
Yea, you would think that by now we'd be able to link with options for how to handle audio mismatches. There are a lot of requests in for that ... so I hope it's taken care of sometime realistically soon.
Thanks for the workaround, R Neil. I don't get why it doesn't just work, i.e., encode the proxies at the original frame rate and interpret them at the same rate as the master clips. Seems like a bug to me. It would be great if Adobe just overhauled the proxy system, especially with 10-bit H.265 4K coming along, and gave users a bit more freedom with audio etc. I have to use different templates for 8-channel audio from a camera like the FX9 and stereo audio from an A73.
This is a big pain in the bum, especially when it comes to 4-channel audio out of my three Canon XF605s. For one, you can't create a proxy with 4-channel audio in a decent format that runs under Windows (MP4), and second, my cameras generate proxies while recording. That saves me time in post, so rendering new proxies is redundant footage. It doesn't make sense for me to render new ones.
There should be a better resolution to this problem in the future.
Here's a workaround. First, as Kevin suggested, do not interpret the clips to adjust speed; use the Speed/Duration dialog instead. And here's the trick: rather than "attaching" the camera-generated proxies, unlink the full-quality camera originals and relink to the camera-generated proxies in Premiere. When you lock picture, unlink the proxies and relink to the full-quality camera originals.

I spent several days tearing my hair out trying to generate proxies from a Sony camera (maybe the FS7 - can't remember, but I can dig down to figure it out if you want) that had 8 tracks of audio generated in camera. Of the 8 channels of audio, I only needed the first 2 (actually there was no audio in tracks 3-8). Although I could generate proxies within Premiere, they wouldn't stay attached to the camera original clips. This solution worked just fine.

But I'd do a test to make sure this workaround will function correctly with your camera-generated proxies. If you need help figuring out how to test this workflow, post back, and if I'm not being clear enough, just ask me to describe the workflow in more detail.
Thanks for the workaround. It is clear and easy to follow.
I have been attaching the camera-generated multichannel proxies and using speed manipulation on the timeline. The real problem occurs when assembling the piece's B-roll and calling the shots, as you don't have that slow-motion preview in the Source monitor; it is hard to judge where the ins and outs should be. After upgrading the PC it is more bearable to scrub without the proxies and still use them for multicam parts, but as soon as I add the input LUTs, things start to crumble. Premiere is crashing more often too.
There is room for improvement there. At the end of the day, the goal is to create stable, easy-to-use software that takes your focus from dealing with problems to creativity.
First, as Kevin suggested, do not interpret the clips to adjust speed but use the adjust speed dialog.
Is that the same thing though? When using adjusted speed, Premiere will use a number of methods to try to adjust the speed of the clip, from frame sampling to optical flow. That is not the same as interpreting a clip shot at 50 fps as 25p in a 25p timeline. And to make matters worse, adjusting the speed leads to problems with hardware-rendering clips on the M1 Macs.
Here is a video you may find interesting:
Kevin, thanks for pointing me to that video - it's super helpful. I still have reservations because it doesn't answer my question - it's not frame for frame. 59.94 @ 40% is not exactly 24 fps; it's 23.976. But for argument's sake let's say Adobe compensates for that by using a 23.976 timebase for your sequence and rounding down to whole frames (I don't know if they do or not). I would argue the opposite of what the video proposes. Interpret Footage is actually a much better tool (if it worked) than working off some random percentage speed, because it gives a guaranteed result. If you interpret your footage at 24 fps, it plays at 24 fps. Basically it allows you to work in whole frames without Adobe trying to invent in-between frames.
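One hedged side note on the arithmetic: the NTSC-family rates are fractional (59.94 is really 60000/1001 and 23.976 is 24000/1001), and with exact fractions a 40% speed on 59.94 material lands precisely on the 23.976 timebase with no remainder. That doesn't settle how Premiere actually samples frames internally, but the rates themselves divide cleanly. A quick check in plain Python:

```python
from fractions import Fraction

fps_5994 = Fraction(60000, 1001)   # "59.94"
fps_23976 = Fraction(24000, 1001)  # "23.976"

# 40% of 60000/1001 is exactly 24000/1001 - no fractional leftover
assert fps_5994 * Fraction(40, 100) == fps_23976
print(fps_5994 * Fraction(40, 100))  # 24000/1001
```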
Plus Adobe themselves do this automatically for metadata-flagged slo-mo like the A7S III, RED, etc. So if Adobe uses Interpret Footage, shouldn't it actually work properly? For footage like that, we now have to undo the interpretation and then redo it as a speed change. And we probably have to do that before creating proxies.
And none of this explains why ordinary 25 fps footage, when slowed down on the timeline, gives glitchy, jumpy results when rendered via hardware on the M1 processor. I've already lodged a bug report on this, and of course the bug is still present 6 months later.
I just realised the other big gotcha with this method: you cannot use Warp Stabilizer on clips with the speed effect on them. Which was exactly the problem I ran into the other day. So now it's a Catch-22.
You can nest the clip, apply the speed change inside the nest, and then apply the Warp Stabilizer effect to the nest in the sequence, but the results may not be great given the complexity of what you're asking for. I've done it and sometimes gotten decent results, but it's kind of impossible to predict how it will look.
But it isn't complex at all. All we are asking the software to do is play back, say, a 50 fps clip at 25 fps and then apply the Warp Stabilizer effect. It works perfectly well if you use Interpret Footage, but that breaks other things. Nesting doesn't seem to work - I've tried it and it does strange things, at least on the M1 chip. I'll get things like a white bar in the middle of the first frame of the nest.
I come back to my original point - interpret footage should work for all workflows and Adobe just doesn't seem to care if there is a so called "workaround".
"I come back to my original point - interpret footage should work for all workflows and Adobe just doesn't seem to care if there is a so called "workaround"."
Easy enough as another user to see your viewpoint. But I can also understand where ... and why ... the engineers feel as they do. And it doesn't have anything whatever to do with not caring. As, having spent hours with quite a few of the engineers, they actually care quite a bit about the program. Their opinions can be quite different than ours, but it isn't from a lack of caring.
And I'm sure the engineers would take exception to your calling this a "workaround" when as far as they can see, things are functioning exactly as expected and designed, when used for what they were designed for.
We users often do something with a tool that it was never designed to do, but does in many cases cool stuff for us. Engineers look on that as ... odd.
And I think this an issue where as far as the engineers are concerned, they built a process for "X" and a process for "Y". Which they see as very different processes from an engineering viewpoint. Both processes are working correctly (to the engineers) for what they are designed to do.
And if some user chooses to use X tool to do Y, and it works at times, great.
But that use is not what they designed it for. And since the "proper" designed process (as they see it) works as expected, they don't see a reason to "fix X" because it doesn't always do "Y" well. Because again, to them, that is something it was never designed nor intended to do.
The user is essentially saying "But when I do Y with X, it breaks at times!" to which the engineer simply says "Don't use X to do Y." And, as far as they're concerned ... end of story. Because as far as they can see, they already have a good, working process for Y.
From our user standpoint, there are things about the interpret footage that work better for the work you're doing than a time/duration change. And that's where the disconnect comes in.
The engineers don't see that usage as necessary, as they have the process they intended for us to use.
I'll be at NAB again next week. And this is one of the issues I'll be arguing with the engineers about ... again. We users do get some things changed over time. But yea, it can be frustrating when the user & engineer viewpoints are polar opposites.
We'll just have to agree to disagree on this one, Neil. Speed clip is not a substitute for Interpret Footage. Speed clip was originally designed to provide slow- and fast-motion effects before general-use high-frame-rate cameras were even a thing. That's why it uses strategies such as frame sampling, frame blending, and optical flow to "fake" slow motion. OTOH, Interpret Footage was always designed for this purpose: to interpret fields and adjust the actual frame rate of clips. To say users are using it to do things the software was never designed to do just isn't so. The software even does this automatically for clips with the frame rate flagged in the metadata, so it's definitely designed for that purpose. It's just one thing in a bunch of ignored errors and problems that is slowly pushing me away from Adobe.
Here are the problems I noted in my last job which held up rendering by 2 hours using RED Raw 6K on the M1 chip.
Speed clip causes glitches in rendering, resulting in repeated frames and out-of-order frames when hardware rendering (reported to Adobe 6 months ago using the so-called bug and feature report). Nothing done about it. This happens on just normal footage with faked slo-mo as well.
Software rendering RED RAW resulted in white frames (will report, but haven't yet), rendering the above workaround useless.
Rendering nested clips on timeline or sometimes just warp stabilized clips resulted in a white bar in the centre of frame on the first or last frame of a transition.
In the end, the only way to get this to work was to abandon the problematic Warp Stabilizer effect, and it resulted in a very unhappy producer (mainly because he had to wait around while I tried to fix the problems) and a very unhappy editor, because I don't like doing things half-baked.
I was having serious issues with a first-generation M1 MacBook Pro with speed changes. Try the beta of Premiere; it solved a number of issues I was having...
John, you and I don't disagree ... it's the engineers that disagree. I've stood there and discussed this specific situation, and a number of other issues, with their upper engineers that come to NAB and MAX. It gets frustrating at times.
They're all editors themselves by the way ... so they actually all use the app for personal work. That's something many people don't think about. But their engineering viewpoint is, (from our perspective) putting blinders on them at times.
It's things like ... the way tools are organized on the screen. They can get totally caught up in the way the tools are organized in the code, and then insist that be reflected in the UI ... when a different on-screen organization would be much more sensible for the user. I've had several discussions over a number of those types of issues.
Talk about agree to disagree ... they mentally can't budge because they KNOW how the code is written. To them, and whatever "this" subject covers ... what it does now MAKES TOTAL SENSE. What the users ask is ... not logical to them because it doesn't match the pattern of the code or something.
And of course, you and I couldn't care less how the code was written! And yea, that's ... frustrating.
I've been in the Adobe booth while others argue similar things with engineers. And in the BlackMagic booth while people argue things with their engineers. Arguments with the engineers seem to always run the same pattern. Which is why it's such a joy when you get one who actually listens, asks questions, and posits potential suggestions on how to accomplish what "we" want while meeting his/her needs.
Dacia Saenz is one that is amazing to talk with. Some of the others ... probably going to be frustrating. And yet, when you come to them with the way to replicate a problem they've been struggling with, they will be all over working on it.
But the other thing that gets frustrating for us are the decisions made typically above the team level as to where the budgeted time goes. So the supers have X hours for certain types of work, and have to prioritize which bugs & annoyances get work, which ... don't at this time.
Several of the long-standing ones are pain points for me, I'm very aware of that. And I'll be after those issues again. Every once in a while, one of those does get fixed.
As to the M1s, that's ... a strange situation. Some things for some users absolutely scream, and for another M1 user, won't work at all. Some issues there to get sorted for certain.
I appreciate all your hard work on the forum Neil. I wish I had more positive things to say about Adobe because I love After Effects. But Premiere is just hard work.
These apps are all simply tools. Fancy hammers.
They all fit & feel differently for every user. And you simply need to figure out which one gets your work done in a way that is either least painful or most enjoyable for you while working.
There's decisions made over the years in the Adobe apps that irritate me thoroughly if I stop to think about them. But for me, the pain of working in the other option ... Resolve ... is brought home to me every time I'm in that app. More pain there than "here".
And most of the time, I rather enjoy being able to just fall into the work without having to think about the UI.
But if this ain't working, well ... it ain't working.
But good friends LOVE LOVE LOVE it. More power to them, that's the fancy hammer they should use.