Feature Focus: Body Tracker

Adobe Employee, May 17, 2021

Live-perform character animation using your body. Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a webcam and applies it to your character in real time to create animation. For example, your arms, torso, and legs can all be tracked automatically.

 

 

Get started now with an example puppet

  1. Download the Hopscotch_Body Tracking puppet.
  2. Open Character Animator (Beta), choose File > Import, and then select the downloaded .puppet file.
  3. Click the Add to New Scene button at the bottom of the Project panel to create a scene for the puppet.
  4. In the Camera & Microphone panel, make sure the Camera Input button (camera_input.png) and the Body Tracker Input button (body_tracker_input.png) are both turned on. If either is off (gray), click it to enable that input.
  5. Now try moving your body, waving your arms, and, if you’re far enough from the camera to see your lower body, raising your feet. Body tracking is that simple!

Additional example puppets are available for you to try out.

 

Control body tracking

Puppets that respond to body tracking use the Body behavior. When a puppet track is selected in the Timeline panel, the Properties panel shows the Body behavior (among others) applied to the puppet. This behavior is what allows you to customize how the Body Tracker works. For example:

  • Set When Tracking Stops to control how the tracked parts of your puppet’s body move when your body can’t be tracked, such as when your legs or arms move out of the camera’s view. Hold in Place keeps the tracked handles where they last were before tracking was lost. Return to Rest and Snap to Rest instead move the handles back to the artwork’s default state: the former lets you control how fast they move back with the Return Speed value, whereas the latter moves them back immediately.
  • Select the specific Tracked Handles on your puppet that you want to be controlled by your body movement. By default, just the wrists are tracked.

When you want to capture your body tracking performance to later export to a movie file or use in an After Effects or Premiere Pro project, record the performance (see the Help documentation on recording in Character Animator).

 

Set up your own puppet for body tracking

The Body behavior lets you control your puppet’s arms, legs, and torso by moving your body in front of your webcam. When you use the Body behavior, also apply the Face behavior to control the character’s position and to track your facial expressions. You can also use the Limb IK behavior to prevent limb stretching, inconsistent arm bending, and foot sliding.
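In case you’re curious what “preventing limb stretching” means mechanically: the standard technique that Limb IK-style behaviors are built on is two-bone inverse kinematics, which, given fixed shoulder and wrist points, solves for an elbow position that keeps the upper arm and forearm at their original lengths. The sketch below is a generic illustration of that idea, not Character Animator’s actual implementation; the function name and parameters are made up for illustration.

```python
import math

def solve_elbow(shoulder, wrist, upper_len, fore_len, bend_dir=1.0):
    """Generic two-bone IK sketch (not Character Animator's implementation).

    Returns an elbow position such that shoulder->elbow has length upper_len
    and elbow->wrist has length fore_len. bend_dir (+1 or -1) chooses which
    side of the shoulder->wrist line the elbow bends toward.
    """
    sx, sy = shoulder
    wx, wy = wrist
    dx, dy = wx - sx, wy - sy
    raw = math.hypot(dx, dy)
    # Clamp the reach so the segments neither overextend nor fold past their lengths.
    dist = min(max(raw, abs(upper_len - fore_len)), upper_len + fore_len)
    ux, uy = (dx / raw, dy / raw) if raw > 1e-6 else (1.0, 0.0)
    # Law of cosines: distance along shoulder->wrist to the point "under" the elbow.
    a = (upper_len ** 2 - fore_len ** 2 + dist ** 2) / (2 * dist)
    h = math.sqrt(max(0.0, upper_len ** 2 - a ** 2))
    # Step along the shoulder->wrist line by a, then offset perpendicular to it by h.
    return (sx + a * ux - bend_dir * h * uy,
            sy + a * uy + bend_dir * h * ux)

# Example: a 100 px upper arm and 100 px forearm reaching toward a wrist 150 px away.
print(solve_elbow(shoulder=(0, 0), wrist=(150, 0), upper_len=100, fore_len=100))
```

Because the segment lengths are fixed, the arm can bend but never stretch, which is exactly the artifact Limb IK is there to avoid.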

 

Setup

After importing your artwork into Character Animator, follow these steps to rig the puppet so that the Body Tracker can control it:

  1. Open the puppet in Rig mode.
  2. In the Properties panel’s Behaviors section, click “+”, and then add the Body behavior.
  3. Create tagged handles for the different parts of the body’s artwork so that the Body behavior can control their movements:
  • Arms: Move the origin handle to the shoulder area from where the arm should rotate, but do not tag it as “Shoulder” directly on the arm. Tag your shoulders and hips one level higher in your puppet hierarchy (commonly the Body folder or the view folder, e.g., Frontal) to keep your limbs attached during body tracking.
    Now, add handles at the locations of the elbow and wrist and apply the Left/Right Elbow and Left/Right Wrist tags to the handles.
  • Legs: Like the arms, move the origin handle of the leg to the hip area, but do not tag it as “Hip” directly on the leg. Add tagged handles for Left Knee, Left Heel, and Left Toe on the left leg and foot, and apply the Right Knee, Right Heel, and Right Toe tags to handles on the right leg and foot.
  • Shoulders & Hips: The Shoulder and Hip handles should be on the body, not on the limbs. Select the parent group containing your limbs and add handles at the same shoulder and hip positions as the arm and leg origin handles. You’ll see green dots from the limbs where you can place them, or you can copy/paste the handles directly from the limbs to position them exactly; either way works. Tag these handles as Right Shoulder, Left Shoulder, Right Hip, and Left Hip. (A sketch of the resulting handle-tag layout follows this list.)
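To summarize the rigging rules above, here is a rough sketch of where each tagged handle ends up in a frontal-view puppet hierarchy. This is purely illustrative Python; the group names are examples, not a Character Animator file format.

```python
# Illustrative only: which tag goes on which group, per the rigging steps above.
FRONTAL_VIEW_TAGGED_HANDLES = {
    # Parent group that contains the limbs: Shoulder and Hip tags live here,
    # at the same positions as the (untagged) arm and leg origin handles.
    "Body": ["Left Shoulder", "Right Shoulder", "Left Hip", "Right Hip"],

    # Limb groups: the origin handle sits at the shoulder/hip but stays untagged.
    "Left Arm":  ["Left Elbow", "Left Wrist"],
    "Right Arm": ["Right Elbow", "Right Wrist"],
    "Left Leg":  ["Left Knee", "Left Heel", "Left Toe"],
    "Right Leg": ["Right Knee", "Right Heel", "Right Toe"],
}
```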

 

Note: Body and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles, and turning off Walk when doing general body animation. The Body behavior should work fine in side and 3/4 views.

 

Controls

This behavior has the following parameters:

  • When Tracking Stops controls how the puppet moves when the camera cannot track your body, such as if you moved out of view. Hold in Place keeps the untracked body parts at the place where they were last tracked. Return to Rest moves the untracked body parts gradually back to their original locations; use it together with the Return Speed parameter. Snap to Rest moves them back immediately. (See the sketch after this list.)
  • Return Speed controls how quickly untracked body parts return to their resting locations. This setting is available when When Tracking Stops is set to Return to Rest.
  • Tracked Handles controls which of the tagged handles are controlled by the Body Tracker. There are separate controls for the neck, the joints on the left and right arms, and the joints on the left and right legs.
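As a rough mental model of how the three When Tracking Stops options differ, here is a conceptual sketch. Character Animator does not expose this logic as code; the function below, including its name and parameters, is hypothetical.

```python
def update_untracked_handle(current, rest, mode, return_speed, dt):
    """Hypothetical per-frame update for a handle that has lost tracking.

    current      -- (x, y) position of the handle this frame
    rest         -- (x, y) position of the handle in the original artwork
    mode         -- "Hold in Place", "Return to Rest", or "Snap to Rest"
    return_speed -- how fast "Return to Rest" moves back (higher = faster)
    dt           -- seconds since the previous frame
    """
    if mode == "Hold in Place":   # stay wherever tracking was lost
        return current
    if mode == "Snap to Rest":    # jump straight back to the artwork's default state
        return rest
    # "Return to Rest": ease back toward rest, closing a fraction of the gap per frame
    t = min(1.0, return_speed * dt)
    return (current[0] + (rest[0] - current[0]) * t,
            current[1] + (rest[1] - current[1]) * t)
```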

 

Troubleshooting

If you’re getting unexpected tracking results, try calibrating the Body Tracker in any of the following ways:

  • Click the Set Rest Pose button in the Camera & Microphone panel. When Body Tracker Input is on, Set Rest Pose shows a five-second countdown so that you can move back from the camera and get your entire body into view.
  • Click the Refresh Scene button (refresh_scene.png) in the Scene panel while Body Tracker Input is on.
  • Turn on Camera Input while Body Tracker Input is also on.
  • Turn on Body Tracker Input.

 

The overall positioning of the character comes from face tracking. As such, if you notice issues such as floating feet, choppy movement, and/or not enough horizontal movement, consider checking the following:

  • lighting on your face
  • starting head position when you click Set Rest Pose 
  • Face behavior’s Head Position Strength setting

 

As with face tracking, dark lighting conditions in the room can reduce the quality of the tracking and introduce jittery movement. Try to have adequate lighting when tracking your body.
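(This Beta does not offer a smoothing control for body tracking, but if you are wondering what smoothing out jitter would mean in practice, a simple exponential filter over the tracked positions is one common approach. The sketch below is a generic illustration on hypothetical data, not a Character Animator feature.)

```python
def smooth(positions, alpha=0.3):
    """Exponentially smooth a list of (x, y) tracked positions.

    alpha is between 0 and 1: lower values remove more jitter but make the
    motion lag behind fast movements.
    """
    smoothed, prev = [], None
    for x, y in positions:
        prev = (x, y) if prev is None else (
            prev[0] + alpha * (x - prev[0]),
            prev[1] + alpha * (y - prev[1]),
        )
        smoothed.append(prev)
    return smoothed
```

The trade-off is latency versus stability, which is why tools that do offer smoothing usually expose the amount as a per-joint slider.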

 

If the arms are not moving, make sure both of your shoulders are being tracked in the camera.

 

Known issues and limitations

Body Tracker is still in development.

 

In this first public Beta version (v4.3.0.38), please note the following:

  • Foot position might slide or jump around unexpectedly.
  • The Walk and Body behaviors do not cooperate well since they are both trying to operate on the same handles simultaneously.
  • When your body moves outside of the camera’s view, the bone structure overlay stays visible in the Camera & Microphone panel.

 

What we want to know

We want to hear about your experience with Body Tracking:

  • What are your overall impressions?
  • Are you able to control your puppet’s movement the way you want?
  • Have you found interesting ways to use the Body Tracker?
  • Which Body behavior parameters and handles did you find useful or not useful?
  • How can we improve the Body Tracker?

 

Also, we’d love to see what you create with Body Tracker. Share your animations on social media with the #CharacterAnimator hashtag.

 

Thank you! We’re looking forward to your feedback.

 

 

(Use this Beta forum thread to discuss the Body Tracker and share your feedback with the Character Animator team and other Beta users. If you encounter a bug, let us know by posting a reply here or choosing “Report a bug” from the “Provide feedback” icon in the top-right corner of the app.)

TOPICS: Feedback
Views: 19.0K
131 Replies

Adobe Employee, May 18, 2021

Does the blue playhead move while it's supposed to be recording?

Is the timecode (in the lower-left corner of the Scene panel) changing during this time?

 

Can you post your Audio Hardware preferences pane in Character Animator? The playback/recording system in Character Animator can be influenced by what the Default Input and Default Output settings are in the Audio Hardware preferences pane. Do the current settings look correct for your system? Does changing them make recording work?

Community Beginner, May 18, 2021

Oh my goodness, it certainly did. Thank you, that was the trick - I can record. Cool!

Community Beginner, Aug 03, 2021

What are "tracked handles"?

 

I cannot find any documentation on this term...

 

Unless it's in the video above?

Adobe Employee, Aug 03, 2021

Select the puppet track in the timeline, then look in the Body behavior in the Properties panel. There's a Tracked Handles group that you can twirl open. There is mention of it in the first post in this forum thread, in the Controls section.

New Here, May 19, 2021

I used the body tracker for a small animation yesterday (just torso + arms) - and had a couple of points:

- Being able to record takes separately for each tracking point (or an arbitrary collection of tracking points) would help, so that I can do things like a take where I'm just focusing on the arms. Or at least being able to decide whether a take should override everything or just the tracked points within it.

- When trying to perform 1 or 2 frame takes for pose-to-pose animation, the take is immediate. That means I don't have time to move away from my keyboard and stand in the correct pose - having an easily accessible way to add a timer (ideally with an arbitrary amount of time) would be incredibly useful.

- For whatever reason, when I moved my body across the camera's field of view, the puppet remained centered. I don't know why that is - and I think that wasn't the case in the preview video - but that's the only real tracking issue I ran into.

New Here, May 19, 2021

Oh - I forgot one thing - there was a significant sync issue with the tracks I made using the body tracker when recording along to pre-recorded audio - it was a good 2-3 seconds behind. It might be worth seeing if there's a way to add delay compensation - even if it's a manual setting.

Adobe Employee, May 19, 2021

Hi,

 

Thank you for your valuable input! We will look into improving the recording workflow in a future release.

 

As for moving your body across the camera's field of view: the overall positioning of the character comes from face tracking. Can you check whether face tracking is working correctly and the Face behavior's Head Position Strength parameter is set high enough?

Community Beginner, May 19, 2021

That was great. I am planning to make a cartoon series already. 

Special thanks to @oksamurai and @Jeff Almasol.

New Here, May 21, 2021

This body tracking looks so simple and amazing - well done, team! I have a question about exporting: is it possible to save out Lottie files from Character Animator using the bodymovin plugin? I'm hoping to make animated stickers for Telegram.

Adobe Employee, May 21, 2021

Currently the data is not exported from CH in any way other than your character performance. We have heard a lot of requests for different export types since announcing this, and it's something we may explore in the future, but for now the tracking data is only used for the Body behavior inside CH.

Explorer, May 24, 2021

  • What are your overall impressions?

I thought I wouldn't use this at first, but decided to give it a try.  Seems like a great tool to add to everything else in the software, and sure beats manually dragging handles. Excited about its potential.  If I can have an idea and whip out a 1 minute cartoon in a day, I will use this feature all the time.

 

  • Are you able to control your puppet’s movement the way you want?

So much depends on the puppet one is using, especially when it comes to making sure limbs don't flip. I tried this with a 3/4 character and it seems best to stand 3/4 to the camera so as not to have my arms break.  

 

A frontal view character worked a bit better, though there was some jumpy hand behaviour when hands were close to the chest.

 

  • Which Body behavior parameters and handles did you find useful or not useful?

I liked being able to turn off certain joints. Especially since there is foot sliding happening now, it was useful to turn off the legs and feet.  

 

  • How can we improve the Body Tracker?

Maybe some kind of smoothing algorithm? I've only used a few motion capture programs (iClone's face tracking and iPiSoft), but the smoothing recently added helped both out so much. It takes away all that jitter.  Sliders for how much smoothing on each joint would be amazing, so that you can exaggerate certain body parts.  iClone does this and it really lets you tweak the performance.  I turned my camera input down to 10 fps and that seems to have smoothed out a lot of the jitter I see in the 30 fps version, but I think a slider would be better so you don't miss small movements, just that you get the option to smooth them if you want to.  5fps was too little and missed a lot of movement.

 

Right now my character's shoulders seem to be slipping in the tracking and that's affecting things very badly, with the shoulders broadening and narrowing. I wonder if that has to do with my clothing. A best-practices guide on what kind of clothing to wear would be good. Would this actually work better with colourful tape placed in spots on the clothing, like actual motion capture suits, or is the software tracking in a different way?

 

Not sure if it's related to this being the beta or to some new features, but I noticed that some characters I had that rely on Layer Picker no longer work properly.

Adobe Employee, May 25, 2021

Thanks for the feedback.

 

For the issue with Layer Picker, would you be willing to share a .puppet file so that we can investigate what might be going on? If so, you can send me a private message with a link to the .puppet file via Google Drive, Dropbox, etc.

Community Beginner, Mar 04, 2022

I feel very compelled to second this motion. My motion tracking comes out jittery a lot and it would be just INCREDIBLE to have the ability to smooth the animations out from the mocap data!!! Please implement this! 

New Here, May 24, 2021

I've just started to explore this, so I'll likely have more to add as I go, but right off the bat my first note is that having the webcam capture only a 4:3 feed feels much more restrictive now. Being able to capture in widescreen would be awesome so I wouldn't have to stand 7 feet from my computer to get my full arm span.

Adobe Employee, May 25, 2021

What's your webcam's make and model?

New Here, May 25, 2021

Logitech 910

Community Beginner, May 28, 2021

The possibility to let puppets move is amazing! My main issue is still setting up the correct arm movement with the Limb IK behaviour. After hours of trial and error I've almost given up. The bending, stretching and flipping of the elbows and shoulders seems impossible to set up. In particular, to move the puppet's arms you naturally make a torsion/twisting movement to hinge your elbow inward or outward. Is there a way to let Character Animator detect that twist movement? Maybe with a dedicated coloured shirt or some kind of stickers on the arms? Without the right movement detection I'm afraid it will always look a bit strange and unnatural. I hope I've just missed the right way to do it; can anybody help me out?

Thanks!

 

Gert

Adobe Employee, May 28, 2021

I'm guessing you probably saw this already, but I go through the basics of LimbIK controls here: https://youtu.be/e85HGSAS80g

 

Currently arms in CH are treated as a single bendable mesh, so we don't yet have the ability to make your forearm rotate independently of the bicep. This is something we've talked about wanting to support for a while. For now, each character and setup is slightly different, so tweaking Limb IK to work just right for your situation can sometimes be tedious. In the linked video I tweak the bend spot at around 8:35 - that's normally how I set mine up.

 

If you can post a screenshot or video of what looks unnatural to you, and then what you want it to ideally look like or do, that might help us in future planning. Thanks for the feedback!

Explorer, May 31, 2021

Tracking with the body works very well.  Kudos!  No issues so far.

I would like to make a suggestion: Is it remotely possible to use audio as triggers? So that during a performance capture, you could say "switch" and the arm could switch to a different arm holding a prop. Basically, it would be great to trigger puppet objects with a voice command the way you would with a keyboard command. Possible? Someday?

I'm really enjoying this build.

Scott

Community Beginner, Jun 01, 2021

First of all, this is an amazing addition -- it's been a ton of fun to mess with the past couple of days.

A question about body tracking the same puppet with multiple views. If I take a quarter-facing character and switch (using a swap set) from my (default) left-facing to right-facing view, body tracking no longer tracks the tagged arms and legs in the right-facing puppet. I noticed that in your sample puppets for this beta, characters that typically have multiple views (like Ninja and Stardust) are rigged to use just their frontal view, so this could be something you're already aware of - but I wanted to check and see if I was just rigging this wrong. If this is a known limitation, and anyone else has a similar issue, the best solution I came up with (to at least somewhat keep the smoothness of movement) was to record everything on one puppet, then copy-paste it into a copy of that puppet with things switched around manually as desired (e.g., I have a frontal puppet where the right arm is in front of the left arm, and vice versa).

 

Also, to Elvis' point (above) around the smoothness of the motion capture -- I think it'd be great if there was something like the Face Pose-to-Pose Movement toggle (but with millisecond increments). At a minimum, it'd help lessen the jitteriness my characters have -- but it would also help those of us that are trying to have characters with more snappy movements. Right now my workaround there is to record things at 0.5 or 0.75 speed so that things look a bit snappier on playback.


I'd also repeat what Elvis said regarding the ability to capture just hand movements in one take and just leg movements in another.

Again, thank you for putting this together and opening this Beta up to the community. I couldn't have made dancing characters without it!

Adobe Employee, Jun 07, 2021

Thanks for the feedback! Multi-view characters are a good question - I haven't tried that yet. You may have to add the Body and Limb IK behaviors to each view you want to use - I think the version of Ninja in the shipping version of CH does this with Limb IK for each view, for example.

Community Beginner, Jul 13, 2021

Following up here, adding Body + Face + Limb IK to different views totally works! I tested this with a puppet that (when dancing) needs to sometimes have the right foot cross in front and other times have the left foot cross in front.

New Here, Jun 01, 2021

This is amazing! Exactly what I was hoping for!
As a feature suggestion, the most useful thing for me would be having a "pose-to-pose" setting like there is for head/facial capture. Body Tracker produces some pretty floaty movement that doesn't suit some characters. Pose-to-pose works great on facial and head movement, and would make this much more usable for me.

 

Community Beginner, Jun 01, 2021

Overall, the Body Tracking feature is a big leap forward for Ch. Anim., but so far the only puppet that moved flawlessly was your frog character, and even that had some issues with the feet remaining still unless moved by leg movements. I just noticed your suggestions for setting up a character for Body Tracking in this forum, so I'll review them to see if there's anything else I can pick up outside of your initial video that will better adapt my existing characters for Body Tracking. Thanks again for your efforts to make character animating accessible for art-challenged motion designers!

Explorer, Jun 04, 2021

The tracking is very responsive and accurate, but sometimes appears jittery. Is there a way to smooth the tracking after Record, or to lower the sampling rate of the motion capture?

 

Scott
