Hello,
Whenever I try to use Premiere Pro, my CPU spikes to near 100% usage. The GPU doesn't always spike, but it often does. The attached example is with one video clip playing and one effect applied. The video was shot on a GoPro Hero 6 at 4K, 60 fps. My computer specs are also attached. Any help, advice, or suggestions would be awesome.
Also, I have video rendering/playback set to Mercury Playback Engine GPU Acceleration (CUDA).
Adding the screenshots here to make it easier
Your computer is in good shape, but be aware that the type of media you work with has a massive impact on playback performance. I didn't see an attached video, but GoPro footage is going to be in a compressed, interframe codec like H.264/H.265, which is not good for editing. Add that it's 4K and high frame rate, and that further increases the compute power required to decode. Add an effect or color grading and the load grows again (depending on the effect, the impact can be huge). Do you get a red render bar with the effect added? That's basically telling you that you can't expect real-time playback without rendering first.
Solution: transcode to an intermediate codec (read: good for editing) before working with the footage, or make proxies in a low-res/low-bitrate intermediate codec and just toggle the proxies on. For the most part I work with proxies (rather than transcoding beforehand), and I tend to use the ProRes Proxy preset. In my opinion you don't need to make your own custom ingest/encoding presets unless you need or want more control (like adding a watermark or timecode overlay) or you're working with an unusual aspect ratio -- and even then, ProRes is usually pretty adaptable. (I mention this because if you look up a tutorial on making proxies, depending on what you find you might think you have to create a custom preset. You don't, and that route is probably unnecessarily complicated if you're new to proxies.)
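For reference, the same kind of proxy can also be made outside Premiere with ffmpeg. This is just a sketch -- the filename is hypothetical, I'm assuming ffmpeg is installed, and Premiere's built-in ProRes Proxy ingest preset does the same job without any of this:

```shell
# Hypothetical example: turn one GoPro clip into a 720p ProRes Proxy.
# prores_ks profile 0 = Proxy; audio is kept as uncompressed PCM.
SRC="GOPR0001.MP4"            # hypothetical source clip
OUT="${SRC%.*}_proxy.mov"     # derive the proxy name from the source
echo "$OUT"                   # prints: GOPR0001_proxy.mov

# The actual transcode (commented out so the sketch runs without media):
# ffmpeg -i "$SRC" -c:v prores_ks -profile:v 0 \
#        -vf "scale=-2:720" -c:a pcm_s16le "$OUT"
```

The `scale=-2:720` filter drops the proxy to 720p while keeping the aspect ratio, which is where most of the file-size and decode savings come from.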
Now do you lose quality or anything in the final product this way? I'm still very new to video editing. Basically we want our music videos to look near-professional quality and are just starting to get into learning the best filming and editing techniques. Thanks a ton for your help.
near-professional quality
That starts with the camera and the one behind the camera.
There is a saying: garbage in, garbage out. You can never make it better than the source.
Point taken, haha. I know what camera I'm looking at getting next, but for now all I've got are the GoPros. I wanted to use them to learn the ins and outs of Premiere Pro and video editing in general before shelling out a couple grand, but then I ran into these rendering issues.
You do not lose any quality, no. These are pretty standard workflows in a professional post-production setting; this is also referred to as an online/offline workflow. For more context you can read a bit here: https://en.wikipedia.org/wiki/Offline_editing or do your own searching. Don't let it intimidate you, though; the concept isn't complicated, and on a small-scale production it's super simple, especially these days. (There's literally a button in Premiere now to toggle proxies on and off, and exports always use the original footage, so you don't even need to worry about accidentally forgetting to toggle them off.)
In essence: you film awesome footage on an awesome camera, the media is too much for even a powerful computer to handle, so you create proxies and do your "offline" edit with them. The proxies are lightweight versions of the original footage that let you blast through the editing process without clogging your computer with all the extra data in your original media. Because the proxies have the same frame rate and timecode, at the end of the process (edit lock, when you no longer care as much about real-time playback and timeline speed) you swap the originals back in. That's when you do color grading, VFX, the audio mix, etc.
These days, this same kind of workflow is beneficial for even very new people shooting on low-end camera gear, because you're usually working with unoptimized codecs. So whether your footage is really awesome and your computer can't play it, or really crappy and your computer can't play it, the solutions are generally the same. There's sort of this narrow window of production where the cameras are shooting directly into an optimized intraframe codec that would be drag-and-drop editable. This isn't to say you have to transcode or make proxies all the time. People are editing all day every day without them. But you need to know the workflows exist and that you can employ them. Proxies are nice because you can create them at any point during the process if you start getting bogged down.
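If you ever want to proxy a whole card's worth of footage in one go (again, outside Premiere), a simple loop works. A hedged sketch with hypothetical folder names, assuming ffmpeg is installed:

```shell
# Hypothetical batch version: proxy every .MP4 in footage/ into proxies/.
mkdir -p proxies
for SRC in footage/*.MP4; do
  [ -e "$SRC" ] || continue                        # skip if the glob matched nothing
  OUT="proxies/$(basename "${SRC%.*}")_proxy.mov"  # e.g. proxies/GOPR0001_proxy.mov
  echo "$SRC -> $OUT"
  # ffmpeg -i "$SRC" -c:v prores_ks -profile:v 0 \
  #        -vf "scale=-2:720" -c:a pcm_s16le "$OUT"
done
```

Premiere can attach these afterward via right-click > Proxy > Attach Proxies, though letting Premiere's own ingest settings generate them is usually the simpler route.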
That's a lot of general context, but hopefully I've answered your question: quality should not be impacted. If you transcode before working, you're transcoding to a codec/format that retains the original quality; if you make proxies, you're vastly reducing the quality, but only temporarily. Note that in both scenarios you're duplicating your original media into a less compressed codec. For full-quality transcodes that means much bigger files; proxies, despite using an intermediate codec, stay small because of the reduced resolution and bitrate. That's another pro of proxies, in my opinion: when the project is done, you can just delete them.