Jeff Almasol
Adobe Employee
May 17, 2021
Question

Feature Focus: Body Tracker


Live-perform character animation using your body. Powered by Adobe Sensei, Body Tracker automatically detects human body movement with a webcam and applies it to your character in real time to create animation, so you can drive your character's arms, torso, and legs just by moving your own.
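
Adobe hasn't published how the Sensei-powered tracker works internally, but the general technique of detecting body landmarks in webcam frames can be illustrated with the open-source MediaPipe library. A minimal conceptual sketch, not Character Animator's actual pipeline:

```python
# Conceptual illustration only: Character Animator uses Adobe Sensei, whose
# internals aren't public. This sketch shows the same general idea, detecting
# body landmarks in webcam frames, with the open-source MediaPipe library.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # default webcam

with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 normalized landmarks: wrists, elbows, shoulders, hips, knees...
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.LEFT_WRIST]
            print(f"left wrist: x={wrist.x:.2f} y={wrist.y:.2f}")

cap.release()
```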

 

 

Get started now with an example puppet

  1. Download the Hopscotch_Body Tracking puppet.
  2. Open Character Animator (Beta), choose File > Import, and then select the downloaded .puppet file.
  3. Click the Add to New Scene button at the bottom of the Project panel to create a scene for the puppet.
  4. In the Camera & Microphone panel, make sure the Camera Input and Body Tracker Input buttons are both turned on. If either is off (gray), click it to enable that input.
  5. Now try moving your body, waving your arms, and if you’re far enough from the camera to see your lower body, raising your feet. Body tracking is that simple!

Additional example puppets are available for you to try out.

 

Control body tracking

Puppets that respond to body tracking use the Body behavior. When a puppet track is selected in the Timeline panel, the Properties panel shows the Body behavior (among others) applied to the puppet. This behavior lets you customize how the Body Tracker works. For example:

  • Set When Tracking Stops to control how the tracked parts of your puppet’s body move when your body can’t be tracked, such as when your legs or arms move out of the camera’s view. Hold in Place keeps the tracked handles where they were when tracking was lost. Return to Rest and Snap to Rest instead move the handles back to the default state of the artwork: the former eases back at a rate you control with the Return Speed value, while the latter moves back immediately. (A sketch of these modes follows this list.)
  • Select the specific Tracked Handles on your puppet that you want to be controlled by your body movement. By default, just the wrists are tracked.
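
As a rough illustration of the three When Tracking Stops modes, here is some hypothetical per-handle logic (the names and math are illustrative, not Adobe's implementation):

```python
# Hypothetical sketch of the "When Tracking Stops" modes; positions are
# per-axis coordinates of a single tracked handle.
def fallback_position(mode, last_tracked, rest, current, return_speed, dt):
    if mode == "hold_in_place":
        return last_tracked                  # freeze where tracking was lost
    if mode == "snap_to_rest":
        return rest                          # jump straight back to the rest pose
    if mode == "return_to_rest":
        # Ease toward rest each frame; a higher Return Speed closes the gap faster.
        alpha = min(1.0, return_speed * dt)
        return current + alpha * (rest - current)
    raise ValueError(f"unknown mode: {mode}")

# e.g. at 30 fps, a wrist at x=110 easing back toward its rest position at x=80:
print(fallback_position("return_to_rest", last_tracked=120.0, rest=80.0,
                        current=110.0, return_speed=5.0, dt=1 / 30))  # -> 105.0
```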

To export your body tracking performance to a movie file or use it in an After Effects or Premiere Pro project, you’ll first want to record the performance (see the Help documentation on recording in Character Animator).

 

Set up your own puppet for body tracking

The Body behavior lets you control your puppet’s arms, legs, and torso by moving your body in front of your webcam. When you use the Body behavior, be sure to also have the Face behavior applied to control the position of the character and to track your facial expressions. You can also use the Limb IK behavior to prevent limb stretching, inconsistent arm bending, and foot sliding.

 

Setup

After importing your artwork into Character Animator, do the following steps to rig the puppet so that the Body Tracker can control it:

  1. Open the puppet in Rig mode.
  2. In the Properties panel’s Behaviors section, click “+”, and then add the Body behavior.
  3. Create tagged handles for the different parts of the body’s artwork so that the Body behavior can control their movements:
  • Arms: Move the origin handle to the shoulder area from which the arm should rotate, but do not tag it as “Shoulder” directly on the arm. You’ll want to tag your shoulders and hips one level higher in your puppet hierarchy – commonly the Body folder or the view folder (e.g., Frontal) – to keep your limbs attached during body tracking.
    Now, add handles at the locations of the elbow and wrist and apply the Left/Right Elbow and Left/Right Wrist tags to the handles.
  • Legs: As with the arms, move the origin handle of each leg to the hip area, but do not tag it as “Hip” directly on the leg. Add handles tagged Left Knee, Left Heel, and Left Toe to the left leg and foot, and Right Knee, Right Heel, and Right Toe to the right leg and foot.
  • Shoulders & Hips: The Shoulder and Hip handles should be on the body, not on the limbs. Select the parent group containing your limbs and add handles in the same shoulder and hip positions as the arm and leg origin handles. You’ll see green dots from the limbs where you can place them, or you can copy/paste the handles directly from the limbs to position them exactly. It should work fine either way. Tag these handles as Right Shoulder, Left Shoulder, Right Hip, and Left Hip.

 

Note: Body and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles, and turning off Walk when doing general body animation. The Body behavior should work fine in side and 3/4 views.

 

Controls

This behavior has the following parameters:

  • When Tracking Stops controls how the puppet moves when the camera cannot track your body, such as when you move out of view. Hold in place keeps the untracked body parts where they were last tracked. Return to rest moves the untracked body parts gradually back to their original locations; use it with the Return Speed parameter. Snap to rest moves them back immediately.
  • Return Speed controls how quickly untracked body parts return to their resting locations. This setting is available when the When Tracking Stops parameter is set to Return to rest.
  • Tracked Handles controls which of the tagged handles should be controlled by the Body Tracker. There are separate controls for the neck, the joints on the left and right arms, and the joints on the left and right legs.

 

Troubleshooting

If you’re getting unexpected tracking results, try calibrating the Body Tracker, as follows:

  • Click the Set Rest Pose button in the Camera & Microphone panel. When Body Tracker Input is on, Set Rest Pose shows a five-second countdown so that you can step back from the camera and get your entire body into view.
  • With Body Tracker Input on, click the Refresh Scene button in the Scene panel.
  • With Body Tracker Input on, turn on Camera Input.
  • Turn on Body Tracker Input.

 

The overall positioning of the character comes from face tracking, so if you notice issues such as floating feet, choppy movement, or not enough horizontal movement, check the following:

  • the lighting on your face
  • your starting head position when you click Set Rest Pose
  • the Face behavior’s Head Position Strength setting

 

As with face tracking, dark lighting conditions in the room can reduce the quality of the tracking and introduce jittery movement. Try to have adequate lighting when tracking your body.

 

If the arms are not moving, make sure both of your shoulders are being tracked in the camera.

 

Known issues and limitations

Body Tracker is still in development.

 

In this first public Beta version (v4.3.0.38), please note the following:

  • Foot position might slide or jump around unexpectedly.
  • The Walk and Body behaviors do not cooperate well since they are both trying to operate on the same handles simultaneously.
  • When your body moves outside of the camera’s view, the bone structure overlay stays visible in the Camera & Microphone panel.

 

What we want to know

We want to hear about your experience with Body Tracking:

  • What are your overall impressions?
  • Are you able to control your puppet’s movement the way you want?
  • Have you found interesting ways to use the Body Tracker?
  • Which Body behavior parameters and handles did you find useful or not useful?
  • How can we improve the Body Tracker?

 

Also, we’d love to see what you create with Body Tracker. Share your animations on social media with the #CharacterAnimator hashtag.

 

Thank you! We’re looking forward to your feedback.

 

 

(Use this Beta forum thread to discuss the Body Tracker and share your feedback with the Character Animator team and other Beta users. If you encounter a bug, let us know by posting a reply here or choosing “Report a bug” from the “Provide feedback” icon in the top-right corner of the app.)

This topic has been closed for replies.

42 replies

daniel paul 3999
New Participant
June 11, 2021

Hi,

I am hoping you will have Body Tracker see hand and finger shapes as well.

I need to make dozens of musical hand signs at different arm heights for nursery rhymes,

such as: OK, thumbs up, together in prayer, spider crawling, rain falling, etc.

Is there a way to do that now, w/out manually rigging?

 

 

oksamurai
Community Manager
June 11, 2021

Thanks for the feedback! This will not be for v1, but it's something we're thinking more about for the future. For now, you will still need to rig the different hand positions and trigger them. Or play around with third-party stuff - someone on our team actually used Nintendo Switch controllers + JoyToKey on PC to make remote trigger buttons!
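
For the curious, the JoyToKey trick above can be sketched in a few lines of Python using the pygame and pynput libraries. The button numbers and keys below are hypothetical, and the keystrokes go to whatever window has focus, so Character Animator must be frontmost:

```python
# Rough DIY equivalent of JoyToKey: map controller buttons to keystrokes
# that Character Animator's keyboard triggers can pick up.
import pygame
from pynput.keyboard import Controller

BUTTON_TO_KEY = {0: "1", 1: "2"}        # controller button -> trigger key (hypothetical)

pygame.init()
pygame.joystick.init()
joystick = pygame.joystick.Joystick(0)  # first connected controller
keyboard = Controller()
was_down = {b: False for b in BUTTON_TO_KEY}

while True:
    pygame.event.pump()                  # refresh joystick state
    for button, key in BUTTON_TO_KEY.items():
        down = bool(joystick.get_button(button))
        if down and not was_down[button]:
            keyboard.tap(key)            # press + release the trigger key
        was_down[button] = down
    pygame.time.wait(10)                 # poll at roughly 100 Hz
```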

Participating Frequently
June 10, 2021

Hi, I'm enjoying the new beta. One issue I am having is that when I move my character's arms (using my arms on the body tracking), the hands take a little while to orient themselves to the same angle as the lower arm.

 

Is there a setting somewhere that I can adjust this to speed it up a bit?

At the moment I bring the arm up to do a wave and that matches the speed of my arm on the webcam but then the hand is bent at 90 degrees and sloooowly rotates to be the correct way up to wave.

 

Probably something obvious, but it seems to be smoothing that movement and I can't find out where.

 

Thanks

oksamurai
Community Manager
June 10, 2021

Latency is a known issue in the current build. We are working on ways to fix this. For example, a potential "smoothness" control could let you choose how responsive the camera is, with 0% being low latency but potentially more jittery, and 100% being smoother but with more latency.

 

If it's possible to send a video of what's happening, even if it's just a cameraphone shot of your screen, that would help us a lot. You can post here or DM it to me.

Participating Frequently
June 10, 2021

Thanks for the reply! 🙂

 

I've uploaded a terrible clip on my phone to youtube:

 

https://www.youtube.com/watch?v=4oKvgOHU0ck

 

Ignore the character's right hand behind his back, I had to hold the phone at the same time!

 

The issue I'm referring to is when I raise my hand, the camera tracks the character's left arm pretty well (and even better when I'm not trying to film at the same time!) but the hand (on its own layer) seems very slow to catch up. 

I understand that hands/fingers aren't being captured yet, so I assume the hand layer is set to reorient itself to the same angle as the lower arm, but it seems very slow to get there! It might just be a slider or setting that I've done wrong?

 

Hopefully the video shows what I mean 

So for something like a wave, the hand ends up out of sync with the arm.

 

Thanks 🙂

Participating Frequently
June 7, 2021

Still only really successful using Body Tracker with the frog example puppet. After putting some of your above suggestions to use, I was able to get "Maddy" almost there, and "Dr. Applesmith" too, all but some funky leg gyrations and slippage now and then, so we're getting closer to some level of competence.

Known Participant
June 4, 2021

Just for fun, I took an excerpt of your anim and ran it through Warp Stabilizer in AE. Mixed results: too much stabilization on the head and not enough on the arms. Attached is the side by side. FYI.

Scott

KenFabritius
New Participant
June 4, 2021

This is awesome!

 

Had a little trouble getting the body tracking set up. Then I watched your video. It's a behavior! OF COURSE! I added the behavior and it is working now!

 

I really like it!

 

Where/how do we deliver feedback?!

 

🙂

oksamurai
Community Manager
June 7, 2021

Thanks! You can share your feedback here!

Known Participant
June 4, 2021

The tracking is very responsive and accurate, but sometimes appears jittery. Is there a way to smooth the tracking after Record, or to lower the sampling rate of the motion capture...?

 

Scott

oksamurai
Community Manager
June 4, 2021

Thanks for the feedback! This is something we are actively looking into - we have an internal parameter that can control latency vs. smoothness, but it's still a WIP. We're hopeful we can give users more control over this in the coming months, so please stay tuned!
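
For illustration only (this is not Adobe's actual filter), a one-pole exponential smoother makes the latency-vs-smoothness tradeoff concrete:

```python
# smoothness=0.0 passes the raw (jittery but immediate) positions straight
# through; values near 1.0 are much smoother but visibly lag the performer.
def smooth_stream(positions, smoothness=0.5):
    filtered = None
    for p in positions:
        filtered = p if filtered is None else smoothness * filtered + (1 - smoothness) * p
        yield filtered

raw = [100, 130, 95, 140, 105]                   # a noisy wrist x-coordinate
print(list(smooth_stream(raw, smoothness=0.8)))  # smoother, but trails the raw motion
```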

Participating Frequently
June 1, 2021

Overall, the Body Tracking feature is a big leap forward for Ch. Anim., but so far the only puppet that moves flawlessly is your frog character, and even that had some issues with the feet remaining still unless moved by leg movements. I just noticed your suggestions for setting up a character for Body Tracking in this forum, so I'll review them to see if there's anything else I can pick up beyond your initial video that will better adapt my existing characters for Body Tracking. Thanks again for your efforts to make character animating accessible for art-challenged motion designers!

New Participant
June 1, 2021

This is amazing! Exactly what I was hoping for!
As a feature suggestion, the most useful thing for me would be having a "pose-to-pose" setting like there is for head/facial capture. Body Tracker produces some pretty floaty movement that doesn't suit some characters. Pose-to-pose works great on facial and head movement, and would make this much more usable for me.

 

Participating Frequently
June 1, 2021

First of all, this is an amazing addition -- it's been a ton of fun to mess with the past couple of days.

A question about body tracking the same puppet with multiple views. If I take a quarter-facing character and switch (using a swap set) from my (default) left-facing view to the right-facing view, body tracking no longer tracks the tagged arms and legs in the right-facing puppet. I noticed that in your sample puppets for this beta you only rigged characters that typically have multiple views (like Ninja and Stardust) to use just their frontal view, so this could be something you're already aware of -- but I wanted to check and see if I was just rigging this wrong. If this is a known limitation, and anyone else has a similar issue, the best solution I came up with (to at least somewhat keep the smoothness of movement) was to record everything on one puppet, then copy-paste it into a copy of that puppet with things switched around manually as desired (e.g. I have a frontal puppet where the right arm is in front of the left arm and vice versa).

 

Also, to Elvis' point (above) around the smoothness of the motion capture -- I think it'd be great if there was something like the Face Pose-to-Pose Movement toggle (but with millisecond increments). At a minimum, it'd help lessen the jitteriness my characters have -- but it would also help those of us that are trying to have characters with more snappy movements. Right now my workaround there is to record things at 0.5 or 0.75 speed so that things look a bit snappier on playback.


I'd also repeat what Elvis said regarding the ability to capture just hand movements in one take and just leg movements in another.

Again, thank you for putting this together and opening this Beta up to the community. I couldn't have made dancing characters without it!

oksamurai
Community Manager
June 7, 2021

Thanks for the feedback! Multi-view characters are a good question - I haven't tried that yet. You may have to add the Body and Limb IK behaviors to each view you want to use - I think the version of Ninja in the shipping version of CH does this with Limb IK for each view, for example.

Participating Frequently
July 13, 2021

Following up here, adding Body + Face + Limb IK to different views totally works! I tested this with a puppet that (when dancing) needs to sometimes have the right foot cross in front and other times have the left foot cross in front.

Known Participant
May 31, 2021

Tracking with the body works very well.  Kudos!  No issues so far.

I would like to make a suggestion: Is it remotely possible to use audio as triggers? So that during a performance capture, you could say "switch" and the arm could switch to a different arm holding a prop. Basically, it would be great to trigger puppet objects with a voice command the way you would with a keyboard command. Possible? Someday?

I'm really enjoying this build.

Scott
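
For the curious, the voice-trigger idea can be approximated outside Character Animator today: recognize a spoken word and synthesize the keystroke that a keyboard trigger is bound to. A rough sketch using the open-source SpeechRecognition and pynput libraries (the word and key below are hypothetical, and recognition latency makes this approximate at best):

```python
import speech_recognition as sr
from pynput.keyboard import Controller

keyboard = Controller()
recognizer = sr.Recognizer()

with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic)
    while True:
        audio = recognizer.listen(mic, phrase_time_limit=2)
        try:
            heard = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue                  # nothing intelligible; keep listening
        if "switch" in heard:
            keyboard.tap("3")         # hypothetical key bound to the prop-arm trigger
```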