I've noticed the new feature in Character Animator for computing head movement from speech. It is computing something, but the facial expressions are a little different from what I expected. The nose, mouth, and eyebrows are moving, but the head itself isn't. I have the puppet rigged the way you normally would; maybe there is a different rig setup needed for this to work correctly. Let me know your thoughts, I would greatly appreciate it.
I tried playing with this and got some weird results. I am not sure under what circumstances it would be useful. It would be nice to hear from the other puppeteers.
Be well, fellow puppeteers. - Platty
If you have recorded voiceover, you can drop in the mp3 or wav and auto-sync the recording with your puppet so it automatically speaks. I'm doing the majority of my work in Character Animator this way. They recently added a feature right under that option called Compute Head Movement from Speech. Ideally, it's supposed to take the inflection of your speech and automatically give your puppet certain head movements based on the recording. Like you, I am getting weird results.
Yeah, for the time being I will wait for this to be fine-tuned. I will stick with my camera for the head movements.
Be well, fellow puppeteers. - Platty
Hey, I'm new to Character Animator, and I use the lip sync and the auto head movement. I have a podcast, so I have two characters in my animation that I auto lip sync and then auto head move. The whole thing is over an hour long, and applying the auto head movement to each character takes nearly 2 hours (4 in total). But on each character, when I play it back, it only seems to work for about 11 minutes and then stops. Does this sound right, or am I missing something? If it takes that long just to apply it for 11 minutes of footage, it seems extreme. Any advice would be appreciated.