Hi everyone,
Due to Covid we're in a tricky situation: we need to interview a bunch of scientists (approx. 15 mins in total) for a web series, but the speakers are scattered across the US and we can't physically be in their homes to film them. The idea is to use CH to turn them into cartoon versions of themselves. Sounds straightforward (or so I thought).
First of all, we won't be able to get each speaker to install (and learn) CH, so instead we're sending each of them a camera to record themselves with. That means we'll end up with footage we'll have to face-track in AE and then export that tracking data to CH. However, since these speakers aren't accustomed to speaking to camera, there are likely to be issues such as the face turning fully to the side or hands in front of the face, which I assume won't sit well with the AE tracker. Does anyone know of a workaround for this? Perhaps a virtual webcam into which I feed the recorded footage so CH sees it as a live camera (rough sketch below)? I've even considered playing the recorded footage back on a laptop and placing that in front of the webcam so CH can work from it directly.
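In case it helps anyone weighing in, here's a rough sketch of what I mean by the virtual webcam idea. It assumes the pyvirtualcam Python package plus a virtual camera driver (e.g. the OBS Virtual Camera) already installed, and a hypothetical clip name; I haven't tested whether CH will actually accept the virtual device as its webcam input.

```python
import cv2
import pyvirtualcam
from pyvirtualcam import PixelFormat

VIDEO_PATH = "scientist_interview.mp4"  # hypothetical recorded clip

capture = cv2.VideoCapture(VIDEO_PATH)
fps = capture.get(cv2.CAP_PROP_FPS) or 30
width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))

# Requires a virtual camera backend (e.g. OBS Virtual Camera) to be installed.
with pyvirtualcam.Camera(width=width, height=height, fps=fps, fmt=PixelFormat.BGR) as cam:
    print(f"Feeding {VIDEO_PATH} into virtual camera: {cam.device}")
    while True:
        ok, frame = capture.read()
        if not ok:
            # Loop the clip so the "webcam" never goes dark mid-session.
            capture.set(cv2.CAP_PROP_POS_FRAMES, 0)
            continue
        cam.send(frame)               # BGR frame straight from OpenCV
        cam.sleep_until_next_frame()  # keep real-time pacing for CH
```

If CH does list the virtual device as a camera, the recorded footage would effectively replace a live webcam feed, though it would still hit the same problems with turned heads and hands in front of the face.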
Another question: is the AE face tracker as accurate as the one built into CH? I ask because the CH tracker displays a lot more tracking points than I see with the AE tracker.
Has anyone here dealt with a project like this? If so, I’d love to hear/learn from you.
Many thanks.