If you watch people talking, you'll notice they move their hands. Sometimes a little, sometimes a lot. People use their hands to accent, emphasize, or add nuance to their words. Also notice how each person uses their hands differently: some wave their left hand to the side when talking about an event in the past, while others wave their right hand forward for the same thing. You could save us animators a _lot_ of time and effort if you could add hand tracking.
Is it possible to add hand tracking? Will it be too complex for today's computers? If so, could you record a video and have Character Animator go through it much more slowly to catch all the movements for a smooth animation?
We'd love to eventually include this. Currently, most standard webcams are not reliable enough to deal with this sort of thing, unfortunately - we've done tests! As technology improves (for example, with depth-sensing cameras like the Microsoft Kinect), more complete motion tracking could become possible. I think this is a logical direction you can expect CH to go in the future.
Thanks for the answer. I was afraid you might be too focused on faces to consider hands, much like Adobe is too focused on web visuals to consider adding web sound (ever try to make a children's web site about orchestras without good sound tools?).
How about if people use two webcams? Put the extra camera away from the main one, get more of the person speaking in the frame, and have it focus on the hands rather than the face. The two cameras could even use each other for 3D tracking or more realistic movements.
Yep, we're trying a bunch of things in top secret labs. Stay tuned.
How about full-body tracking (mocap)? I don't expect you to be working on this now - faces are hard enough as it is - but in the future, do you think you could expand the process to cover cameras/videos of people acting out a scene, then apply all the key body movements to the animated character?
Yes, I think this is entirely possible one day. But we'll see how technology evolves and what users are asking for the most.
If you want overwhelming user demand for full-body tracking, just put up a video of someone dancing, insert beside it a video of a puppet trying to do the same dance moves as animated by hand (along with how long that took), then between the two add a third video where the app did all the tracking, and show how much time was saved. It would be like automatic rotoscoping. That would make it a "Must Have" request from animators!
It would be much quicker and more accurate to record a person dancing and have the app track it than to animate the same action by hand. And it would look a lot better. As a bonus, you'd have animators around the world performing their own "Ministry of Silly Walks" sketches, much to the delight of their co-workers 😉
How long do you think it would take to animate a puppet to do this? Even if the app could only handle the main body movements (hands, feet, legs, arms), it would be a big time saver and let you focus on the details, like how the skirt and hair move.
Agreed, that would be super cool. That does have a lot of custom artwork - the way the hands flip, the dimensionality as she sways, etc. - and it's a pretty big technical hurdle to make motion capture translate to something that polished in 2D. But we will see! I'm an optimist.
Most animators would be ecstatic just getting the major movements; that alone would cut the animating time in half or more. As for the hand flips and other small details, you could hold your hand up close to the camera so it fills the frame, tell Character Animator to Track Hand (or however you set it up), and act out the hand motions you want. Save that as a Right Hand action and apply it to the end of the right arm, then flip it over and apply it to the left arm. Since adding that much detail to a puppet will slow down even the fastest desktop computer, the animator could tell the app to animate just the arms and hands to make sure the timing of the hand gestures is correct.
Now if they could get VoCo to sing (VoCo + Music Notation = Singing?), then you'd have something much better than Vocaloid and MikuMikuDance, all in one Creative Cloud package.
Since most phones have AR capability today, wouldn't the easiest route be to make an app that lets you use a phone together with Character Animator? /Mattias