Live-perform character animation using your body. Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a web cam and applies it to your character in real time to create animation. For example, you can track your arms, torso, and legs automatically.
Get started now with an example puppet
Additional example puppets are available for you to try out.
Control body tracking
Puppets that respond to body tracking use the Body behavior. When a puppet track is selected in the Timeline panel, the Properties panel shows the Body behavior (among others) that is applied to the puppet. This behavior lets you customize how the Body Tracker works. For example:
When you want to capture your body tracking performance to later export to a movie file or use in an After Effects or Premiere Pro project, you’ll want to record the performance (Help documentation on recording in Character Animator).
Set up your own puppet for body tracking
The Body behavior lets you control your puppet’s arms, legs, and torso by moving your body in front of your web cam. When you use the Body behavior, be sure to also have the Face behavior applied to control the position of the character and to track your facial expressions. You can also use the Limb IK behavior to prevent limb stretching, inconsistent arm bending, and foot sliding.
Setup
After importing your artwork into Character Animator, do the following steps to rig the puppet so that the Body Tracker can control it:
Note: Body and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles, and turning off Walk when doing general body animation. The Body behavior should work fine in side and 3/4 views.
Controls
This behavior has the following parameters:
Troubleshooting
If you’re getting unexpected tracking results, try calibrating the Body Tracker, as follows:
The overall positioning of the character comes from face tracking. As such, if you notice issues such as floating feet, choppy movement, and/or not enough horizontal movement, consider checking the following:
As with face tracking, dark lighting conditions in the room can reduce tracking quality and introduce jittery movement. Try to have adequate lighting when tracking your body.
If the arms are not moving, make sure both of your shoulders are being tracked in the camera.
Known issues and limitations
Body Tracker is still in development.
In this first public Beta version (v4.3.0.38), please note the following:
What we want to know
We want to hear about your experience with Body Tracking:
Also, we’d love to see what you create with Body Tracker. Share your animations on social media with the #CharacterAnimator hashtag.
Thank you! We’re looking forward to your feedback.
(Use this Beta forum thread to discuss the Body Tracker and share your feedback with the Character Animator team and other Beta users. If you encounter a bug, let us know by posting a reply here or choosing “Report a bug” from the “Provide feedback” icon in the top-right corner of the app.)
Does the blue playhead move while it's supposed to be recording?
Is the timecode (in the lower-left corner of the Scene panel) changing during this time?
Can you post your Audio Hardware preferences pane in Character Animator? The playback/recording system in Character Animator can be influenced by what the Default Input and Default Output settings are in the Audio Hardware preferences pane. Do the current settings look correct for your system? Does changing them make recording work?
Oh my goodness, it certainly did. Thank you, that was the trick; I can record. Cool.
What are "tracked handles"?
I cannot find any documentation on this term...
Unless it's in the video above?
Select the puppet track in the timeline, then look in the Body behavior in the Properties panel. There's a Tracked Handles group that you can twirl open. It's mentioned in the first post in this forum thread, in the Controls section.
I used the body tracker for a small little animation yesterday (just torso + arms) - and had a couple of points:
- Being able to record takes separately for each tracking point (or an arbitrary collection of tracking points), so that I can do things like a take where I'm focusing just on the arms. Or at least being able to decide whether a take should override everything or just the tracked points within it.
- When trying to perform 1 or 2 frame takes for pose-to-pose animation, the take is immediate. That means I don't have time to move away from my keyboard & stand in the correct pose - having an easily accessible way to add a timer (ideally with an arbitrary amount of time) would be incredibly useful.
- For whatever reason, when I moved my body across the camera's field of view, the puppet remained centered. I don't know why that is, and I think that wasn't the case in the preview video, but that's the only real tracking issue I ran into.
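The record-delay idea above can at least be mocked up outside the app today. A minimal sketch of a countdown timer (plain Python; this is purely illustrative, since Character Animator has no scripting hook for triggering Record):

```python
import time

def countdown(seconds, tick=time.sleep, out=print):
    """Count down before a take so there's time to step away from
    the keyboard and strike a pose. `tick` and `out` are injectable
    for testing; by default this sleeps one second per step and prints."""
    for remaining in range(seconds, 0, -1):
        out(f"Recording in {remaining}...")
        tick(1)
    out("Go!")

countdown(3)
```

An in-app version would presumably let you set an arbitrary delay before the take actually starts capturing.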
Oh - I forgot one thing - there was a significant sync issue with the tracks I made using the body tracker when recording along to pre-recorded audio - it was a good 2-3 seconds behind. It might be worth seeing if there's a way to add delay compensation - even if it's a manual setting.
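For illustration, the delay compensation suggested here amounts to shifting every recorded keyframe's timestamp earlier by a fixed offset. A rough sketch (generic Python, not a Character Animator API; the 2.5-second delay and keyframe values are made up):

```python
def compensate_delay(keyframes, delay_seconds):
    """Shift (time, value) keyframes earlier by a fixed delay,
    dropping any that would land before the start of the scene."""
    return [(t - delay_seconds, v)
            for (t, v) in keyframes
            if t - delay_seconds >= 0]

# Hypothetical body-tracking keyframes that arrived ~2.5 s late:
track = [(2.5, "arms raised"), (4.0, "arms down"), (6.5, "wave")]
print(compensate_delay(track, 2.5))
# → [(0.0, 'arms raised'), (1.5, 'arms down'), (4.0, 'wave')]
```

Even a manual setting like this would let users line recorded body takes back up with pre-recorded audio.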
Hi,
Thank you for your valuable input! We will look into improving the recording workflow in a future release.
As for moving your body across the camera field of view, the overall positioning of the character comes from face tracking. Can you check if the face tracking is working fine and Face behavior's Head Position Strength parameter is set high enough?
That was great. I am planning to make a cartoon series already.
Special thanks @oksamurai and @Jeff Almasol
This body tracking looks so simple and amazing; well done, team! I have a question about exporting: is it possible to save out Lottie files from Character Animator using the Bodymovin plugin? I'm hoping to make animated stickers for Telegram.
Currently the data is not exported from CH in any way other than your character performance. We have heard a lot of requests for different export types since announcing this, and it's something we may explore in the future, but for now the tracking data is only used for the Body behavior inside CH.
I thought I wouldn't use this at first, but decided to give it a try. Seems like a great tool to add to everything else in the software, and sure beats manually dragging handles. Excited about its potential. If I can have an idea and whip out a 1 minute cartoon in a day, I will use this feature all the time.
So much depends on the puppet one is using, especially when it comes to making sure limbs don't flip. I tried this with a 3/4 character and it seems best to stand 3/4 to the camera so as not to have my arms break.
A frontal view character worked a bit better, though there was some jumpy hand behaviour when the hands were close to the chest.
I liked being able to turn off certain joints. Especially since there is foot sliding happening now, it was useful to turn off the legs and feet.
Maybe some kind of smoothing algorithm? I've only used a few motion capture programs (iClone's face tracking and iPiSoft), but the smoothing recently added helped both out so much. It takes away all that jitter. Sliders for how much smoothing on each joint would be amazing, so that you can exaggerate certain body parts. iClone does this and it really lets you tweak the performance. I turned my camera input down to 10 fps and that seems to have smoothed out a lot of the jitter I see in the 30 fps version, but I think a slider would be better so you don't miss small movements, just that you get the option to smooth them if you want to. 5fps was too little and missed a lot of movement.
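For reference, the per-joint smoothing slider described above usually boils down to something like an exponential moving average over the tracked samples. A rough sketch (illustrative Python, not how Character Animator works internally; the sample values are made up):

```python
def smooth(samples, alpha):
    """Exponentially smooth one joint's tracked positions.
    alpha in (0, 1]: 1.0 keeps the raw (jittery) tracking;
    smaller values trade jitter for lag, like the proposed slider."""
    smoothed = [samples[0]]
    for x in samples[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Jittery 30 fps samples of a shoulder's x position (made-up numbers):
raw = [10.0, 10.8, 9.9, 10.6, 10.1, 10.7]
print(smooth(raw, 0.3))
```

Unlike dropping the camera input to 10 fps, a filter like this still sees every frame, so small movements aren't missed outright; they're just damped according to the slider.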
Right now my character's shoulders seem to be slipping in the tracking, and that's affecting things very badly, with the shoulders broadening and narrowing. I wonder if that has to do with my clothing. A best practices guide for what kind of clothing to wear would be good. Would this actually work better with colourful tape put in spots on the clothing, like actual motion capture suits, or is the software tracking in a different way?
Not sure if it's related to being the beta or some new features, but I noticed that some characters I had that rely on Layer Picker no longer work properly.
Thanks for the feedback.
For the issue with Layer Picker, would you be willing to share a .puppet file so that we can investigate what might be going on? If so, you can send me a private message with a link to the .puppet file via Google Drive, Dropbox, etc.
I feel very compelled to second this motion. My motion tracking comes out jittery a lot and it would be just INCREDIBLE to have the ability to smooth the animations out from the mocap data!!! Please implement this!
I've just started to explore this, so I'll likely have more to add as I go, but right off the bat my first note is that having the webcam capture only a 4:3 feed feels much more restrictive now. Being able to capture in widescreen would be awesome, so I wouldn't have to stand 7 feet from my comp to get my full arm span.
What's your webcam's make and model?
The possibility to let puppets move is amazing! My main issue is still setting up the correct arm movement with the Limb IK behaviour. After hours of trial and error, I almost gave up. The bending, stretching, and flipping of the elbows and shoulders seems impossible to set up. Especially the fact that, to move the puppet's arms, you naturally make a torsion/twisting movement to make an inside or outside hinge movement with your elbow. Is there a way to let Character Animator detect that twist movement? Maybe with a dedicated coloured shirt or some kind of stickers on the arms? Without the right movement detection, I'm afraid it will always look a bit strange and unnatural. I hope I've only missed the right way to do it; can anybody help me out?
Thanks!
Gert
I'm guessing you probably saw this already, but I go through the basics of LimbIK controls here: https://youtu.be/e85HGSAS80g
Currently arms in CH are treated as a single bendable mesh, so we don't yet have the ability to make your forearm rotate independently of the bicep. This is something we've talked about wanting to support for a while. For now, each character and setup is slightly different, so tweaking Limb IK to work just right for your situation can sometimes be tedious. In the linked video I tweak the bend spot at around 8:35 - that's normally how I set mine up.
If you can post a screenshot or video of what looks unnatural to you, and then what you want it to ideally look like or do, that might help us in future planning. Thanks for the feedback!
Tracking with the body works very well. Kudos! No issues so far.
I would like to make a suggestion: Is it remotely possible to use audio as triggers? So that during a performance capture, you could say "switch" and the arm could switch to a different arm holding a prop. Basically, it would be great to trigger puppet objects with a voice command the way you would with a keyboard command. Possible? Someday?
I'm really enjoying this build.
Scott
First of all, this is an amazing addition -- it's been a ton of fun to mess with the past couple of days.
A question about body tracking the same puppet with multiple views. If I take a quarter-facing character and switch (using a swap set) from my (default) left-facing to right-facing, body tracking no longer tracks the tagged arms and legs in the right-facing puppet. I noticed that in your sample puppets for this beta you only rigged characters that typically have multiple views (like Ninja and Stardust) to use just their frontal view, so this could be something you're already aware of - but I wanted to check and see if I was just rigging this wrong. If this is a known limitation and anyone else has a similar issue, the best solution I came up with (to at least somewhat keep the smoothness of movement) was to record everything on one puppet, then copy-paste it into a copy of that puppet with things switched around manually as desired (e.g. I have a frontal puppet where the right arm is in front of the left arm and vice versa).
Also, to Elvis' point (above) around the smoothness of the motion capture -- I think it'd be great if there was something like the Face Pose-to-Pose Movement toggle (but with millisecond increments). At a minimum, it'd help lessen the jitteriness my characters have -- but it would also help those of us that are trying to have characters with more snappy movements. Right now my workaround there is to record things at 0.5 or 0.75 speed so that things look a bit snappier on playback.
I'd also repeat what Elvis said regarding the ability to capture just hand movements in one take and just leg movements in another.
Again, thank you for putting this together and opening this Beta up to the community. I couldn't have made dancing characters without it!
Thanks for the feedback! Multi-view characters are a good question - I haven't tried that yet. You may have to add the Body and Limb IK behaviors to each view you want to use - I think the version of Ninja in the shipping version of CH does this with Limb IK for each view, for example.
Following up here, adding Body + Face + Limb IK to different views totally works! I tested this with a puppet that (when dancing) needs to sometimes have the right foot cross in front and other times have the left foot cross in front.
This is amazing! Exactly what I was hoping for!
As a feature suggestion, the most useful thing for me would be having a "pose-to-pose" setting like there is for head/facial capture. Body Tracker produces some pretty floaty movement that doesn't suit some characters. Pose-to-pose works great on facial and head movement, and would make this much more usable for me.
Overall, the Body Tracking feature is a big leap forward for Ch. Anim., but so far the only puppet that moved flawlessly was your frog character, and even that had some issues with the feet remaining still unless moved by leg movements. I just noticed your suggestions for setting up a character for Body Tracking in this forum, so I'll review them to see if there's anything else I can pick up outside of your initial video that will better adapt my existing characters for Body Tracking. Thanks again for your efforts to make character animating accessible for art-challenged motion designers!
The tracking is very responsive and accurate, but sometimes appears jittery. Is there a way to smooth the tracking after Record, or to lower the sampling rate of the motion capture?
Scott