I am editing various clips for a music video. Some of the clips were shot on a green screen; others were done in a studio or on location. Before I synchronize the music (audio) with the video, should I key out the green screen clips first?
[Moderator note: Edited title typo for clarity.]
You can remove the green screen at any time, but I would probably do it at the start so that you can see the new background behind the footage and judge how it composites and blends together.
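For anyone curious what a keyer is doing under the hood when it "removes" the green: conceptually it builds a mask from how green-dominant each pixel is, then composites the background through that mask. Here's a minimal NumPy sketch of that idea — the function name and threshold are my own illustration, not how Premiere's UltraKey is actually implemented:

```python
import numpy as np

def chroma_key(fg, bg, threshold=60):
    """Composite fg over bg wherever fg is 'green enough'.

    fg, bg: uint8 RGB arrays of the same shape.
    threshold: how far the green channel must exceed the other
    channels for a pixel to count as green screen (illustrative value).
    """
    fg16 = fg.astype(np.int16)  # avoid uint8 wraparound in the subtraction
    green_dominance = fg16[..., 1] - np.maximum(fg16[..., 0], fg16[..., 2])
    mask = green_dominance > threshold            # True = key this pixel out
    out = np.where(mask[..., None], bg, fg16)     # bg where green, fg elsewhere
    return out.astype(np.uint8)

# Tiny 1x2 "frame": one pure-green pixel, one red pixel.
fg = np.array([[[0, 255, 0], [200, 30, 30]]], dtype=np.uint8)
bg = np.array([[[10, 10, 10], [10, 10, 10]]], dtype=np.uint8)
result = chroma_key(fg, bg)
# green pixel becomes background; red pixel survives
```

Real keyers also handle soft edges, spill suppression, and so on, but this is the core decision the effect makes per pixel — and why it costs extra processing on every frame.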
Thanks for your collective feedback. I am working on it now and will update the thread on how things turn out. Each response has merit. I have a 2020 MacBook Pro, 1.4 GHz i5, 8 GB memory... it is definitely not high-octane, but it gets me by.
Enjoy editing! 😉
I'm going to take the opposite stance from Marek. At an early stage like that, when your focus is syncing, you don't want to be adding extra processing requirements to the video that are unrelated to the task you are trying to achieve. Depending on your hardware and the codec you're working with, having effects like UltraKey on your footage might cause your video playback to struggle and make it difficult to focus on the sync. This also depends on how much manual work you need to do. If it was recorded with timecode, or you're able to get an accurate sync from the waveforms, then you may not have much to do manually — but again, you might struggle to play it back when you try to check it (without rendering).
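As a side note on what waveform sync is doing conceptually: it slides the camera's scratch audio against the master recording and picks the offset where they line up best (cross-correlation). A rough NumPy sketch, where the names, sample rate, and the random-noise stand-in for audio are all illustrative assumptions:

```python
import numpy as np

def find_offset(master, scratch, sample_rate):
    """Estimate how many seconds `scratch` starts after `master`
    by finding the peak of their cross-correlation.
    Inputs are mono audio as 1-D float arrays."""
    corr = np.correlate(master, scratch, mode="full")
    # Convert the peak's index into a lag in samples, then seconds.
    lag = np.argmax(corr) - (len(scratch) - 1)
    return lag / sample_rate

# Toy demo: seeded noise stands in for audio; a low rate keeps it fast.
rate = 1000
rng = np.random.default_rng(0)
master = rng.standard_normal(2 * rate)   # 2 s of "studio" audio
scratch = master[rate // 2:]             # camera track missing the first 0.5 s
offset = find_offset(master, scratch, rate)
print(offset)  # 0.5
```

That's essentially what the auto-sync features do for you, which is why a clean waveform match can save a lot of manual nudging.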
In a lot of workflows you're going to be doing the visual effects at or near the end of the process — for similar reasons as above. Effects slow things down, and you don't want to be slowed down while trying to make creative decisions about your edit. When things are locked into place and you don't need to worry as much about real-time playback performance, that's when you can start finessing those things. That being said, at a certain point — maybe early on — you may want to see what's going on with the keys, and maybe you actually need the key to be done to even proceed creatively. So it's not always best to wait until the very end. To keep things running smoothly, the Global FX Mute can be helpful for toggling effects on and off, and again, working in optimized editing codecs or using proxies will help.
In this complex craft, with these incredibly complex apps, there are many, many ways to do any one final "thing". In this case, I'm probably more with Phillip than Marek. But it always depends ... how 'strong' your machine is in the ways of the AdobeCC, what sort of media, how many other effects you'll be adding ... whether it's a Blue Tuesday ...
It depends on how you approach your project. Of course you can remove it at any time, but it depends on how many takes you have, how many different angles, etc.
I would sync the audio first, decide which takes/parts will definitely not survive the final edit, and keep at it until I cannot decide without the background. This way I reduce the amount of keying I need to do.