My laptop was upper-middle-end maybe 4 years ago. Premiere Pro says there is some compatibility issue with the video card. Still, the program runs.
I've got seven podcast clips, 50 minutes long, that I am multicamming. I can do it, but it is mega laggy. The 1/8 and 1/16 options for the preview resolution are not available, but I have turned off the master preview cam option (so it only shows the seven multicam angles to select from).
Is there anything I can do to make it less laggy? I can deal with bad quality video in the preview, it's just that it hangs a lot.
Might want to consider using proxies.
Or convert the footage to an edit-friendly codec.
So I'm using Adobe Media Encoder to transcode the files to DNxHD. The files are already 4-5 GB each; I'm expecting to fill a terabyte with these new files.
I don't get why video is so resource-heavy. The first handheld digital video recorder came out in '93. Was that alien technology? 30 years later I can't run two instances of OBS without my 16 gigs of RAM shutting everything down, and I can't multicam 7 videos. Not sure why I need a $10K rig just to post-produce a home-recorded podcast video, in 2000 freaking 22.
As digital recording technology and computers have evolved, so have video quality and the demands of the compression techniques used to encode it.
There is far more data being recorded in video today, even with OBS, than on a 1993 digital video recorder. To accommodate that data, more resource-intensive codecs are used. To add fuel to the fire, these newer codecs like H.264/H.265 are designed for delivery, to minimize the space the file needs, which requires even more robust encode/decode processing.
So even though computers have advanced, so have the demands of the technology. Plus you say you're trying to decode 7 video streams at once? That will be a factor as well, and it's why leveraging proxies and codecs optimized for editing (like ProRes or DNx) is crucial. It's important even when working with higher-end machines.
To answer your other question, you should be able to import the files into Premiere Pro, but if you are reading 7 streams at once of high bandwidth video, you may run into problems if you're using a slow HDD. Not only do you need to account for decoding, but also the bandwidth your drive can support in reading multiple video files at once.
Do you anticipate any issues importing 100 GB+ edit-friendly video files into Premiere Pro?
So where I'm at right now is transcoding the files to DNxHR HQX 1080p 29.97.
For the next one I'll see if I can get OBS to output to DNx; otherwise I'll have to remux the MKVs to MP4, then transcode to DNx for 3 of the files. For the facecams, since I can't run two instances of OBS without everything shutting down, we're using our phones. And one MOV from an iPad for the room shot. Maybe 500 GB of DNx files altogether, and one night of doing all the transcoding.
Is there an easier way to make the multicam less choppy?
You can use FFmpeg to record to ProRes or DNx straight out of OBS, but since those are high-bitrate, high-quality video codecs, recording them will be a lot more demanding on your system. It's up to you if you want to try to accommodate that, but I would personally just record into a decent-quality H264 and plan to use proxies for your editing process. I would also recommend not remuxing with OBS, as the process of going from MKV to MP4 with their remuxer will introduce variable framerate, which is like poison to editing software and is going to cause a lot more headaches for you. If you record directly into the TS container rather than MKV, you'll get all the same pros as the MKV container without having to remux. You can import TS files directly into Premiere.
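For reference, a remux can also be done with FFmpeg directly, which just copies the streams into the new container without re-encoding anything. A sketch with placeholder filenames; I'm generating a synthetic MKV first so the commands are self-contained. One caveat: a stream copy preserves whatever timestamps the source has, so if the MKV is variable framerate, the resulting MP4 will be too.

```shell
# Synthetic H.264 MKV as a stand-in for an OBS recording
ffmpeg -y -v error -f lavfi -i testsrc2=duration=2:size=640x360:rate=30 \
  -c:v libx264 -pix_fmt yuv420p recording.mkv

# Remux (rewrap) MKV -> MP4: "-c copy" copies the streams untouched,
# so this takes seconds and never touches the actual video data
ffmpeg -y -v error -i recording.mkv -c copy recording.mp4
```

Same video codec in, same video codec out; only the "box" changes.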
On the note of Variable Framerate, keep in mind that while phones can look great, they record very problematic video when it comes to working with it in post-production. Phones do record into variable framerate, which means you're going to have issues with audio syncing (which you absolutely do not want if you are trying to sync up multiple streams over a long period), not to mention other performance problems and errors you get from VFR. Particularly with 4K, a phone will usually be recording in HEVC, which is also a very unoptimized editing codec. Also be sure to turn off HDR settings on the phones, as that will be another layer you'll need to address in post-production. Often, to get phone media into good, stable shape, you need to transcode it to correct the variable framerate at minimum, if not also get it into a better editing codec and change the color space at the same time.
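If you do end up with VFR phone footage, one common way to conform it to a constant framerate and an editing codec in a single pass is FFmpeg. This is a sketch, not gospel: the filenames are placeholders, 29.97 fps is an assumed target, and I'm generating a synthetic stand-in clip so the commands run on their own.

```shell
# Synthetic stand-in for a phone clip (real phone footage would
# typically be VFR HEVC; the conform step below works the same way)
ffmpeg -y -v error -f lavfi -i testsrc2=duration=2:size=1280x720:rate=30 \
  -f lavfi -i sine=frequency=440:duration=2 \
  -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest phone_clip.mov

# Conform to constant 29.97 fps (the fps filter re-times frames to a
# fixed grid) and transcode to DNxHR LB, an intraframe editing codec;
# audio goes to uncompressed PCM so nothing drifts in the editor
ffmpeg -y -v error -i phone_clip.mov \
  -vf fps=30000/1001 \
  -c:v dnxhd -profile:v dnxhr_lb -pix_fmt yuv422p \
  -c:a pcm_s16le \
  phone_clip_cfr.mov
```

The color-space/HDR correction mentioned above would be additional filter work on top of this; this only addresses the framerate and codec.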
In general the answer to your question has already been stated by others: proxies.
Playing 7 streams of video simultaneously is a lot of work for a computer, so you want that video to be super easy to work with. Also note that the playback resolution pulldown menu is going to have a minimal effect on the performance here because your computer still has to do the hard part - decoding the video - before it can play it back at a lower resolution.
Edit: I also wanted to address a question you posted earlier that I didn't see answered. Importing 100GB+ videos to Premiere is not an issue. File size in general is not really an indication as to whether it's going to be easy or difficult for Premiere to play, and in fact if you are going to try to make an assumption based on file size it would tend to be the opposite of what you think. Small file size = efficiently compressed and requiring more computing power to decompress (play). Whereas a larger file size would indicate an intraframe video codec (every frame exists individually), which is a lot easier for the computer to "understand."
Much appreciated! I will look into proxies tonight. We have 8 webcams sitting around, but our computers can't handle multiple instances of OBS, so we had to resort to our phones for the face vids. For next time I'll check out the HDR setting and turn it off, but for now I'm transcoding the whole lot to DNx.
I'll also look up the TS container that I should use instead of MKV; I don't remember seeing that setting. Are TS files edit-friendly like DNx? ... I mean, will they allow for smoother multicam? Or does multicam generally only work well (with seven 50-minute files) with DNx?
TS is just a container, like MKV, MP4, AVI, FLV, etc. The video codec inside of the container is H264 (so no, it isn't edit-friendly like DNX, but you can still make a proxy of it into something edit-friendly.)
When you Remux (also called Rewrap) you are just taking the video and audio codecs from inside of the container and putting them into another container. If it helps you can think of it like taking things out of one box and putting them into another box.
The container doesn't have a whole lot to do with the performance/optimization of the codec inside, except that you can maybe infer what the video codec might be inside the container (often when people say "it's an mp4," they are essentially saying, "it's h264.") But take the .MOV container as an example. Inside an MOV container you could have the HEVC/H265 video codec, which is terrible for video editing, or you could have ProRes, which is great for video editing. So the container or file extension does not paint the full picture and it isn't the same as the codec.
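If you ever want to see what's actually inside a container, ffprobe (which ships with FFmpeg) will tell you. Placeholder filename here, and I'm generating a sample file first so the command has something to inspect; the point is that the .mov extension alone tells you nothing about the codec inside.

```shell
# Make a sample MOV that contains H.264 (it could just as easily
# contain ProRes or HEVC -- the extension wouldn't change)
ffmpeg -y -v error -f lavfi -i testsrc2=duration=1:size=640x360:rate=30 \
  -c:v libx264 -pix_fmt yuv420p clip.mov

# Ask ffprobe what the video codec actually is
ffprobe -v error -select_streams v:0 \
  -show_entries stream=codec_name -of default=noprint_wrappers=1:nokey=1 clip.mov
# prints: h264
```

Worth running on phone footage before you plan a workflow around it.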
Thanks! Well, Premiere Pro is starting to not look like the right program for me. I've got a laptop, maybe 4 years old, paid $2400 maybe. Only 8 gigs of RAM. Still, I should be able to put together a lowest-fi 7-video podcast without my whole computer shutting down. Probably not going to blow 10k on a new rig just to be able to edit a podcast vid in Premiere Pro. Digging out Kdenlive next.
I transcoded the files to DNx, which definitely made the multicam run smoother for a bit, but then the whole thing repeatedly maxed out the hard drive, sometimes the CPU, and the video for the most part ran out of sync with the audio. The MP4 version of the multicam was at least workable. Also, I tried the syncing option, and it couldn't do it. After I sat there for a long time while it was seemingly working, I cancelled, and then it gave me an error saying it couldn't find the sound files. I tried repeatedly, even after manually syncing up all the files, and it failed. Manual syncing isn't accurate; waveforms aren't detailed like in a DAW, and you can't zoom in enough or place the waveform accurately enough for the difference not to be noticeable. No way are people manually syncing up videos using the audio.
I'll hit the MP4 version again, see if those work, then quit.
I really think that you're underestimating and/or just underappreciating the complexity of what you're trying to do. There's nothing simple about a 7-stream multicam edit. I know you're just trying to be hyperbolic about the computer situation but the reality is that it doesn't matter what computer you're using, or even what editing software you're using, for that matter. Playing 7 streams of video at the same time is a huge demand on system resources.
DNxHD/HR is a high quality, high bitrate codec. If you are trying to play 7 streams of that simultaneously you're going to max out the I/O speed on your drive unless you have it all on a high speed drive or RAID array. That's just the way it is. That won't change no matter what editing software you use.
H264 is a delivery codec. It's meant to be the final stage of the video creation process, played forward at a standard speed in a buffered video player or streamed. H264/5 is a Long-GOP codec, which means that it's composed of large groups of pictures, and only some of the frames (called I-frames) are actual images. The rest of it is pixel and color-tracking data that's temporally compressed over the large group, which means that when you put your playhead in Premiere (or any editing software) on a frame of H264, you're not just looking at one frame; your computer actually has to decode up to dozens of frames of non-image data just to "rebuild" what that single frame looks like. It's a complex blueprint to piece an image together for every frame you look at. Nothing simple about that.
Now do that 7 times every single frame.
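You can actually see that group-of-pictures structure with ffprobe: in an H264 file, only the occasional frame is an I-frame (a real image) and the rest are P/B frames that describe changes relative to their neighbors. A self-contained demo using a synthetic clip:

```shell
# Make a short H.264 clip
ffmpeg -y -v error -f lavfi -i testsrc2=duration=2:size=640x360:rate=30 \
  -c:v libx264 -pix_fmt yuv420p gop_demo.mp4

# List the frame types: an I-frame starts each GOP, and most of the rest
# are P/B frames that are not complete images on their own
ffprobe -v error -select_streams v:0 \
  -show_entries frame=pict_type -of csv=p=0 gop_demo.mp4
```

An intraframe codec like DNx or ProRes is all I-frames, which is exactly why it scrubs so much more easily.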
H264/5 isn't something Adobe invented. It's a licensable video codec just like all other video codecs (some are open source). So it's not like you just go over to another editing software and all of a sudden H264 ceases to be H264. Video codecs are what they are regardless of your NLE. (What can be different is how the NLE works with your hardware. FCPX is highly tuned for Macs, for example, and Resolve is newer and can take advantage of your hardware in different, more efficient ways, although it does demand more hardware.)
All of this is to say that you can moan about Premiere all you want, but it mostly just looks like you don't have your head wrapped around what the media is, what video codecs are, bitrate, etc., and again these are things that apply to video editing no matter which editing software you use.
Would a $10k computer let you edit a 7-stream MC in Premiere? Probably not. Because again it's not just about the hardware. 7 streams DNX or ProRes is still going to be too high bitrate for a single HDD, and 7 streams of H264 and H265 are still going to bring a modern computer to its knees trying to decode those video codecs. (With the right workflow, though, you can edit some pretty high end content on a pretty low end computer.)
So again - what is the solve? Proxies. You have your original H264/5 media that doesn't play well. You create low-bitrate, low-resolution proxies in an optimized video editing codec that does play well. You edit with those proxies and at the end you export using the original source media. This is a workflow that happens all the time. Almost all high-end projects are going to be doing this. Every TV show you watch, every movie, even most of the commercials: it's all going to use some form of online/offline or proxy workflow. I work with proxies for 99% of everything I ever do, even when my source media is all optimized to begin with, because it's efficient.
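For what it's worth, if you ever want to generate proxies outside of Premiere's built-in proxy workflow, it's one FFmpeg command per clip. A sketch with placeholder filenames, using DNxHR LB (the low-bandwidth flavor meant for exactly this) at roughly quarter resolution; I'm generating a synthetic source clip so the commands stand alone:

```shell
# Stand-in for a 1080p source clip
ffmpeg -y -v error -f lavfi -i testsrc2=duration=2:size=1920x1080:rate=30 \
  -c:v libx264 -pix_fmt yuv420p source.mp4

# Build a small, easy-to-decode proxy: scale to 540p (width computed
# automatically and kept even via -2) and encode to intraframe DNxHR LB
ffmpeg -y -v error -i source.mp4 \
  -vf scale=-2:540 \
  -c:v dnxhd -profile:v dnxhr_lb -pix_fmt yuv422p \
  source_proxy.mov
```

Premiere can attach externally made proxies to the originals, though in my experience it's pickiest about the proxy matching the source's duration and audio channel layout.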
Edit: I also want to add once again that anything you have that's variable framerate is going to add a whole other layer of issues on top of the video codec. If you don't process that media properly you can't get constant framerate media that maintains audio sync. They don't tell you that in the phone commercials.
Also, regarding syncing audio: sometimes it's manual. It depends on the project. A well-run production will have the timecode jammed on all the recording devices, which means syncing them up is just a couple of clicks and no processing. There are audio synchronization tools that can help sync up audio by waveforms, but that can take a long time and doesn't always work - especially if you have a bunch of different tracks, different sample rates, etc. At the very least you would want a clap board or even just a clap at the start of a take so you have a visual reference to sync up all the waveforms.
Philip, thank you, I appreciate the insight. My goal was to quickly hack together some lo-fi vids, so I asked how to do it, and the response was the multicam, which unexpectedly shut down my computer.
I can appreciate the complexity of the art, and acknowledge my complete lack of understanding. I am a musician. It would be handy if the program recognized that I am an idiot attempting to multicam 7 DNx files and that my hard drive is literally melting, and then suggested proxies, or even forced them, and also recognized and suggested the proper way to process variable framerate media to maintain audio sync, etc. I'm sure the typical Premiere Pro user is aware of these things.
So on my agenda for next time are proxies, and variable framerate media and the correct way to process it. Also, what kind of vid is in the container box coming out of the phones. Also, a big cowbell hit before starting.
Yeah video editing is a nice balance between creative and technical. And this stuff is the technical part.
Phones are usually going to record H264 or H265. If it's H264 it's usually going to be in an MP4 container, and if it's H265 it'll usually be in a MOV container. Usually if it's a 4k recording it'll be H265 in an MOV.