Hello, great and powerful Adobe User Community. I've been spending some time trying to understand the purpose of drop frame timecode. From what I'm reading, I understand that DF is for broadcast television because for some reason the video plays back slightly slower when broadcast. So far, I haven't found an explanation for why that happens, but what I came here to ask is: does this hold true for digital broadcast as well? Or is this a holdover from analog television? I produce for online platforms, so I'm really asking out of curiosity (don't like gaps in my knowledge). If anyone can explain this to me, I'd appreciate it. Thanks!
Drop-frame means that at set intervals, frame numbers are dropped from the timecode count.
This is to keep the overall program length right: without it, the program as broadcast could run a couple of seconds longer than its nominal timecode length, and broadcast content must be exactly X minutes/seconds long.
The whole business of calculating frame rates is amazingly complex, including the fact that the actual rate isn't a flat 29.97 but a ratio, something like 30000/1001, when you see the real math used. There are some good online sources for this topic.
And yea, I had to go through this a while back trying to prep a project for potential submission to our state public broadcasting system. Yowza ... it's ancient stuff, in many ways, doesn't really need to still be 'a thing', but, well ... "it's the way it has been so long it's not changing soon ... " was the most useful response I got.
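For what it's worth, that 30000/1001 ratio is also where the "couple of seconds" comes from; here's a quick back-of-the-envelope in Python (my own arithmetic, nothing official):

# One hour of non-drop 29.97 timecode vs. the wall clock.
labels_per_hour = 30 * 60 * 60                  # 108,000 frame numbers in an hour of timecode
real_seconds = labels_per_hour * 1001 / 30000   # how long those frames actually take at 30000/1001 fps
print(real_seconds - 3600)                      # ~3.6 seconds longer than a wall-clock hour

So a show that reads exactly one hour on non-drop timecode actually runs about 3.6 seconds long on air, which is the drift that drop-frame counting corrects for in the timecode display.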
I really appreciate the response. Still, why does video lose time when broadcast? I don't get it. And does this still hold true for digital broadcasts, or is this strictly a consideration for analog?
https://blog.frame.io/2017/07/17/timecode-and-frame-rates/
Remember that the NTSC frame rate is 29.97fps instead of 30fps, which means that .03 frames are unaccounted for every second. Since timecode can only count in whole frames, after an hour there should be 30fps x 60sec/min x 60min/hr = 108,000 frames. Because NTSC is 29.97fps, after an hour there will be 29.97fps x 60sec/min x 60min/hr = 107,892 frames.
So there’s a discrepancy of 108 frames in NTSC, which means that after one hour of real time, the timecode on your recording would be behind by 3.6 seconds (108 frames / 30fps = 3.6 sec). Put another way, when the timecode count reaches 01:00:00:00, the real elapsed time is actually 01:00:03:18.
Drop Frame Timecode works by dropping two frame numbers from each minute except every tenth minute. Your recording is unaffected because it drops frame numbers, not actual frames! Because it drops those numbers, at one hour in real time, your timecode will increase by exactly one hour.
It looks like this: the count jumps from, say, 00:00:59;29 straight to 00:01:00;02, skipping frame numbers ;00 and ;01 at the top of the minute.
Logically, you use Drop Frame (DF) timecode when you shoot material at 29.97fps or 59.94i (59.94 interlaced) because it’s meant for TV broadcast. The general confusion around all these identical-looking frame rates means that sometimes people still refer to this as 30fps or 60i even though that’s technically incorrect. If you look back at the various frame rates and the standards they apply to, the only one left that is 30fps is ATSC, which is compatible with 29.97fps / 59.94i. 30fps and 60i are uncommon but unfortunately, some recording devices do record in those formats, so it’s important to make sure that your frame rate is exactly what you think it is.
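If you want to sanity-check those numbers yourself, here's a quick Python sketch of my own (not from the blog; it assumes the exact 30000/1001 rate and the drop rule described above, and the function names are just mine):

# Rough sketch of NTSC drop-frame arithmetic (assumes 29.97 = 30000/1001 fps exactly).
from fractions import Fraction

NTSC_FPS = Fraction(30000, 1001)   # exact NTSC frame rate, ~29.97 fps
NOMINAL_FPS = 30                   # the rate the timecode labels count at

def ndf_lag_after(real_seconds):
    """How far non-drop timecode lags behind the wall clock after real_seconds."""
    frames_played = NTSC_FPS * real_seconds         # actual frames that went by
    timecode_seconds = frames_played / NOMINAL_FPS  # what NDF timecode reads
    return float(real_seconds - timecode_seconds)

def frames_to_df(frame_count):
    """Frame count at 29.97 fps -> drop-frame timecode HH:MM:SS;FF.
    Skips frame numbers 00 and 01 each minute, except minutes divisible by 10."""
    drop = 2
    per_min = NOMINAL_FPS * 60 - drop          # 1798 labels in a normal minute
    per_10min = per_min * 10 + drop            # 17982 labels in a ten-minute block
    tens, rest = divmod(frame_count, per_10min)
    skipped = drop * 9 * tens
    if rest >= drop:
        skipped += drop * ((rest - drop) // per_min)
    n = frame_count + skipped                  # add the skipped labels back in
    ff = n % NOMINAL_FPS
    ss = (n // NOMINAL_FPS) % 60
    mm = (n // (NOMINAL_FPS * 60)) % 60
    hh = n // (NOMINAL_FPS * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(ndf_lag_after(3600))    # ~3.6 seconds, the 108-frame discrepancy above
print(frames_to_df(107892))   # one real hour of frames -> 01:00:00;00

The first print comes out at about 3.6 seconds, matching the 108-frame discrepancy, and the 107,892 frames in one real hour come back as exactly 01:00:00;00 in drop-frame, which is the whole point.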
So, video is not actually slowed during broadcast (as it's been explained in some places); it's just that the actual frame rate for NTSC is 29.97fps. Seems like an oddball number, and I assume it must be dictated by physics/engineering rather than an arbitrary choice. Would love to know how we got there. Since no one is telling me otherwise, I'm going to assume that this standard applies to both analog and digital NTSC broadcasts.
Thanks, folks. I guess I'll quit beating this horse for now. Lol. Thanks for the info. Greatly appreciated!
It's easy to find full explanations; the short version is that the original color broadcast signal had specific frequency needs, and the 29.97 rate fell out of those.
And when you see the full metadata of a file in MediaInfo, it often shows the exact frame rate as a ratio, like that 30000/1001 figure I mentioned earlier.
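The short version of those frequency needs, for anyone curious (standard NTSC color history, easy to verify elsewhere): when color was added, the line rate was nudged slightly so the new color subcarrier wouldn't beat against the 4.5 MHz sound carrier, and 29.97 falls straight out of that nudge. Roughly, in Python:

# Where 29.97 comes from: the NTSC color line-rate compromise.
sound_offset_hz = 4_500_000            # sound carrier sits 4.5 MHz above the picture carrier
line_rate_hz = sound_offset_hz / 286   # line rate nudged to an exact submultiple of 4.5 MHz
frame_rate = line_rate_hz / 525        # 525 lines per frame
print(line_rate_hz)                    # ~15,734.27 Hz (black-and-white NTSC used 15,750 Hz)
print(frame_rate)                      # ~29.97003 fps, i.e. 30 * 1000/1001

So the odd rate is just 30 multiplied by 1000/1001, which is why MediaInfo reports it as 30000/1001.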
And of course, a change from 23.976 to 29.97 may easily require a cadence change, necessitating a 3:2 pull-down. Or the reverse. Check out video cadence changes and 3:2 pull-down.
After Effects has a good pull-down control.
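In case the cadence part is abstract: 3:2 pull-down spreads every 4 film frames across 10 video fields, i.e. 5 interlaced frames. Here's a toy illustration of the idea in Python, my own simplification (real pull-down also has to worry about field order and flags):

# Toy 2:3/3:2 pull-down: 4 film frames (23.976p) -> 10 fields = 5 interlaced frames (59.94i / 29.97).
def pulldown(film_frames):
    fields = []
    pattern = [2, 3, 2, 3]                     # fields emitted per film frame, repeating
    for frame, count in zip(film_frames, pattern * (len(film_frames) // 4)):
        fields.extend([frame] * count)
    # pair up fields into interlaced video frames
    return [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]

print(pulldown(["A", "B", "C", "D"]))          # ['AA', 'BB', 'BC', 'CD', 'DD']

The mixed 'BC' and 'CD' frames are what pull-down removal has to detect and undo to get back to the original progressive frames.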
Thanks, Neil. I really appreciate you taking the time to help me figure this out. For anyone else stumbling onto this thread, I did a search for "3:2 pull down" (had no idea) and found my way to this very excellent video that goes into detail about how we got to a 24fps film standard, and from there to our current broadcast frame rates (The History of Frame Rate for Film). It's a lot of math and loosely explained physics, but I feel way more knowledgeable about this topic now. Thanks, everybody!
There are a number of things required by the professionally held standards that made sense at the time they were created, but make no sense now, and are still "a thing" simply because it would take a lot of planning and joint action by a lot of people to change them.
Like interlaced media, another example "we" haven't touched in this thread.
The history is fascinating, sort of. And also kind of a "what the ...?" thing. lol