Live-perform character animation using your body. Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a webcam and applies it to your character in real time to create animation. For example, it can automatically track your arms, torso, and legs.
Get started now with an example puppet
Additional example puppets are available for you to try out.
Control body tracking
Puppets that respond to body tracking use the Body behavior. When a puppet track is selected in the Timeline panel, the Properties panel shows the Body behavior (among others) applied to the puppet. This behavior is what lets you customize how the Body Tracker works. For example:
To capture your body tracking performance so you can later export it to a movie file or use it in an After Effects or Premiere Pro project, record the performance (see the Help documentation on recording in Character Animator).
Set up your own puppet for body tracking
The Body behavior lets you control your puppet’s arms, legs, and torso by moving your body in front of your webcam. When you use the Body behavior, be sure to also have the Face behavior applied, to control the position of the character and to track your facial expressions. You can also use the Limb IK behavior to prevent limb stretching, inconsistent arm bending, and foot sliding.
Setup
After importing your artwork into Character Animator, follow these steps to rig the puppet so that the Body Tracker can control it:
Note: Body and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles, and turning off Walk when doing general body animation. The Body behavior should work fine in side and 3/4 views.
Controls
This behavior has the following parameters:
Troubleshooting
If you’re getting unexpected tracking results, try calibrating the Body Tracker, as follows:
The overall positioning of the character comes from face tracking. As such, if you notice issues such as floating feet, choppy movement, and/or not enough horizontal movement, consider checking the following:
As with face tracking, dark lighting conditions in the room can reduce the quality of the tracking and introduce jittery movement. Try to have adequate lighting when tracking your body.
If the arms are not moving, make sure both of your shoulders are being tracked in the camera.
Known issues and limitations
Body Tracker is still in development.
In this first public Beta version (v4.3.0.38), please note the following:
What we want to know
We want to hear about your experience with Body Tracking:
Also, we’d love to see what you create with Body Tracker. Share your animations on social media with the #CharacterAnimator hashtag.
Thank you! We’re looking forward to your feedback.
(Use this Beta forum thread to discuss the Body Tracker and share your feedback with the Character Animator team and other Beta users. If you encounter a bug, let us know by posting a reply here or choosing “Report a bug” from the “Provide feedback” icon in the top-right corner of the app.)
1. Puppets exported by the Beta have an error when imported to the regular app/program.
2. I'm a new user and unfortunately I jumped right into the Beta and created a bunch of triggers. How do I migrate them from the Beta to the regular app? The triggers were not created using camera control/body tracking.
3. I found that the body tracking, while useful, was very jittery. Even at high smoothing levels, movements are still a bit jerky. Any tips to reduce this?
Overall, I'm having a great time exploring the features of Character Animator. You guys are doing a great job.
1+2. Beta puppets are made using a new codebase, so they will not work in lower-numbered versions like the shipping release. I tend to think of them as two separate apps.
3. If possible, can you post a video showing what type of jitter you're seeing? Is it a delay/latency, or shaking artwork, or something else? This is something we are actively trying to fix and seeing your issues would help us a lot.
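A side note on the jitter question: tracking smoothing is commonly an exponential moving average over the detected keypoints, which is why higher smoothing settings tend to trade shaking for lag rather than eliminating both. Here is a minimal Python sketch of that general technique, purely for illustration; it is not Character Animator's actual implementation:

```python
# Generic keypoint-smoothing sketch (illustration only, not Character
# Animator's implementation). An exponential moving average damps jitter,
# but lower alpha values also add visible lag (the delay/latency tradeoff).

def smooth_keypoints(frames, alpha=0.3):
    """frames: list of (x, y) keypoint positions, one per video frame.
    alpha in (0, 1]: lower = smoother but laggier, higher = more responsive."""
    smoothed, prev = [], None
    for x, y in frames:
        if prev is None:
            prev = (x, y)
        else:
            prev = (alpha * x + (1 - alpha) * prev[0],
                    alpha * y + (1 - alpha) * prev[1])
        smoothed.append(prev)
    return smoothed

# Noisy wrist positions become a smoother path, but the smoothed point
# trails the raw input -- which can read as delay rather than shake.
raw = [(0, 0), (10, 2), (8, 1), (12, 3), (9, 2)]
print(smooth_keypoints(raw))
```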
This is perfect - thank you so much! Awesome character as well!
Thank you Dave!
One more thing the team just asked - is it possible to send us a screen recording of CH showing the webcam + character so we can see how the two are interacting? You can DM me the video if you don't want to be seen on camera publicly. If it's too much trouble no worries - it's just that this is an awesome example with a high quality puppet!
Here you go Dave. Happy to help.
One more thing: I have been having some issues with 'Compute Lip Sync with Scene Audio'. The app misses a lot of "S", "M", and "F" sounds and assigns incorrect mouth shapes, so I have to go in and correct the errors. Is there a way to reduce the error rate, maybe by making the audio louder?
Or could you add a text recognition function to accompany .wav file lip sync to increase accuracy? For instance, you'd give the app a .wav file plus a text transcript of what was said in it.
If you need anything else, let me know...
Thanks! I noticed the webcam area seems pretty dark, so I wonder if more lighting might improve the results. I'm actually surprised how well it's working given the darkness and the general lack of contrast between you and the background. If you're able to try adding more lighting, please let us know whether that improves things. I don't think we can reach the same smoothness as Dragger yet, but I think that will at least get you a little closer.
Louder, clearer audio tends to get the best results. You can also fool around with the lip sync prefs in the Preferences menu to see more or fewer visemes at a time. The transcript idea is one we've wanted to implement for a long time, and while I can't announce anything right now, stay tuned... 🙂
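To make the transcript idea concrete for other readers: with a transcript, lip sync can use forced alignment (matching the known words to the audio) rather than guessing phonemes from sound alone, and then map the aligned phonemes to mouth shapes. A hypothetical Python sketch of that final mapping step, assuming phoneme timings already exist; this is not the app's actual pipeline:

```python
# Hypothetical sketch of transcript-assisted lip sync (illustration only).
# Assumes a forced aligner has already produced (phoneme, start, end)
# timings from the .wav plus its transcript; we only map phonemes to
# visemes (mouth shapes) here. All names/values are made up.

PHONEME_TO_VISEME = {
    "S": "S", "Z": "S",            # the sibilants often misassigned
    "M": "M", "B": "M", "P": "M",  # closed-lip sounds
    "F": "F", "V": "F",            # lip-to-teeth sounds
    "AA": "Aa", "EH": "Ee", "EY": "Ee", "UW": "Oh",
}

def visemes_from_alignment(aligned_phonemes):
    """aligned_phonemes: list of (phoneme, start_sec, end_sec) tuples."""
    return [(PHONEME_TO_VISEME.get(ph, "Neutral"), start, end)
            for ph, start, end in aligned_phonemes]

# Example: the word "face" -> F, EY, S with known timings.
print(visemes_from_alignment([("F", 0.00, 0.08),
                              ("EY", 0.08, 0.22),
                              ("S", 0.22, 0.35)]))
```

Because the phoneme sequence comes from the transcript instead of acoustic guessing, an "S" can't be mistaken for an "F" in the first place; only the timing can be off.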
I've used the webcam in my room with all the lights on, and it seems to lose tracking; my room lights are a bit bright. The camera loses sight of my arms because the contrast is poor with all the lights on. Thanks Dave!
Body Tracker is not working for me. It sees my limbs but will not respond to my movements. I've been trying for hours with many different puppets, including my own custom build.
Yes, I am standing far away, and it sees my arms and hands, because it's putting sticks on them.
I'm on an Apple Silicon chip.
Deeply frustrating. Lipsync working perfectly but body tracker not working. Please help.
Also the PINS don't seem to pin my custom build to the ground. I put about 7 in there.
Can you please download and import the Keiko character from adobe.com/go/ch_bodytracker and see if she works? Note that for now only those 6 puppets are properly rigged for body tracking; none of the home screen puppets will work by default. You would need to at the very least add the Body behavior to them.
Hello. Sorry to post again.
Some of us are not being automatically granted access to beta builds on the Adobe website. Other people in this thread have said they've had this problem. (I'm having this problem, too.)
This is a tech support / website permissions issue. It may be that people who started with a single-app membership are not being granted permission to access beta builds, even if they now have an All Apps membership. This should be quick and easy to fix.
I'll contact tech support, but just mentioning it here in case you can fix it.
Thanks for letting us know; I was unaware of this issue. According to https://helpx.adobe.com/x-productkb/global/creative-cloud-public-beta.html, beta rollouts may still be happening? If you get a good answer from tech support, please let us know!
Ok, I contacted tech support and we got it figured out.
You were basically correct in your response to the first poster in this thread.
In order to have access to beta builds, you must first download the Creative Cloud app (Account > All Apps > Available in your plan), which I hadn't done. Then, generally, it seems better to access your Adobe apps through the Creative Cloud desktop app than through your web browser / the Adobe website.
Sorry, I'm not a total idiot; just not a tech person per se. 🙂
I do have to say, I've contacted Adobe tech support twice now for two different apps, and it's excellent, by far the best I've ever seen.
Hi, can you tell me how to edit a puppet for the body tracker? I can't use the regular puppets in the Beta. Help please!
See the Beta video overview at https://youtu.be/pCKMpTE4Cu8. To enable a regular puppet for body tracking, you need to add Limb IK with the proper tags, plus the Body behavior.
First of all, I took a few puppets I already had, and the body tracking JUST WORKED. Adobe just instantly became my favorite company.
Feature request-- multiple-actor tracking. Almost all my scenes have two characters, and if I could do them with my wife, I'd save SO much time.
Great job so far; I can't really suggest any improvements that are needed.
First of all, I'm absolutely LOVING this new feature. Instead of making weekly videos for my educational website with pretty clunky animations (because I'm not a talented animator), I think with practice I'll be able to do about one each day, and with much more natural body language. NICE!
That being said, I have a problem with rigging.
I have some models from a partner company that have multiple views and sets. Specifically, several elements on each model are marked with "Left Wrist" and "Right Wrist" tags.
I discovered through trial and error that Body Tracking only works on one pair of tags; in my case, it only works when the character is turned to the left 3/4 view.
Is this a known limitation right now? I can create multiple models, one for each view angle, but obviously it would be nice if I didn't have to do that.
The Body behavior matches its handles “one per view.” So if you have multiple pairs of Wrist tags, make sure they are contained in separate views. You can tell which behaviors do this for their handle matches by looking at the tooltip in the rig properties panel:
You identify each view by selecting a layer group in Rig mode and applying one of the view layer tags, such as Frontal, Left Quarter, or Right Profile. You're allowed to have multiple of each, so you could have three Frontal views if you want; they all act as groupers and will show up in the Views list above.
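To illustrate the "one per view" rule another way: conceptually, each tag is resolved within each view group independently, so two Left Wrist tags inside the same view are ambiguous, while one per view is not. A simplified Python sketch of that matching idea, using a made-up data layout rather than the actual rigging engine:

```python
# Simplified sketch of "one handle per view" tag matching (hypothetical
# layout for illustration -- not Character Animator's rigging engine).
# Each view group contributes at most one handle per tag.

puppet_views = {
    "Frontal":      {"Left Wrist": "arm_front_L", "Right Wrist": "arm_front_R"},
    "Left Quarter": {"Left Wrist": "arm_3q_L",    "Right Wrist": "arm_3q_R"},
}

def match_handle(active_view, tag):
    """Look up the tag only inside the active view; duplicate Wrist tags
    must therefore live in separate views, not stacked in one view."""
    return puppet_views.get(active_view, {}).get(tag)

print(match_handle("Frontal", "Left Wrist"))        # arm_front_L
print(match_handle("Left Quarter", "Right Wrist"))  # arm_3q_R
```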
Aha!
The AI model I was using had both front and back arm positions in one view (my request to the designer), but the active set actually wasn't listed in the Model hierarchy in Ch. (I'd call this a bug.)
I deleted the extra set in Ai, the active set then appeared in the Model in Ch, and I was able to correctly rig the arms and get the Body Tracking functioning properly.
Thanks so much for the help. I'm supposed to be programming my company's website today, but I think "This site needs MOAR ANIMATIONS" is going to be my mantra this week! 😄
I have tried the body tracker and am very excited. However, if I use it with my laptop, I seem to have to move way, way back in order for my whole body to appear in the frame. Is this because of the laptop and its camera? Is there a recommended type of camera? The body tracking works really well for the part of my body visible in the camera, but when I have to move this far back, it's not working well even though I can see the tracking lines for my arms, legs, and head. Would a wide-angle camera with a larger monitor work better? What do you recommend?
I have used it with two webcams: the built-in one on my MacBook Pro and a Logitech Brio. I have found the Brio a little easier to work with given its wide-angle options, which give me a little more room to move in. All the tutorial video stuff I've shown has been with the Brio. But honestly, most of the time I just use the built-in MacBook camera and it seems to work well. Yes, you do have to back up a little more, but the tracking should still work as long as your body takes up most of the vertical height of the frame. If it isn't working, try adjusting the lighting or making sure there is enough contrast between you and your background. If the problem persists, send us a video of what the webcam + CH scene looks like and we can take a closer look.
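For a rough sense of the distances involved: how far back you have to stand depends mostly on the camera's vertical field of view. A quick back-of-the-envelope calculation in Python, using standard lens geometry and assumed (not official) FOV figures:

```python
# Rough estimate of how far back you must stand for your full body to fit
# in frame: distance = (height / 2) / tan(vertical_FOV / 2).
# The FOV values below are assumptions for illustration, not official specs.

import math

def distance_to_fit(body_height_m, vertical_fov_deg):
    return (body_height_m / 2) / math.tan(math.radians(vertical_fov_deg) / 2)

# A typical laptop webcam has a fairly narrow vertical FOV (~40 degrees);
# a wide-angle camera can reach roughly 55 degrees or more.
for fov_deg in (40, 55):
    d = distance_to_fit(1.75, fov_deg)  # a 1.75 m tall person
    print(f"~{fov_deg} deg vertical FOV: stand about {d:.1f} m back")
```

With the numbers above, a wide-angle camera lets you stand roughly 0.7 m closer for the same framing, which matches the Brio experience described here.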
Seems to work OK for me.
I think I need to experiment more on getting a good walk-in and sit pose.
I also need to experiment with the idea of walking on the spot while the background moves to simulate walking.
Some voice control may also help? That way we can just ask it to record or stop recording.
Good lighting seemed to have solved a few minor movement issues for me.
Hi, I have been attempting to build a full-body-tracking puppet with multiple views for the body. Tagging each body view with head view tags (such as Frontal) means the body view gets switched alongside head turns.
It's not natural, though, if the character's head and body always turn at the same time, so I decided to use a swap set to switch between body views instead.
However, the body tracker stops tracking the body when I swap to a view that is not my default view.
CoSA_DaveS's reply to jisooks58063913 is giving me the sense that this is a technical limitation.
If there is a solution, please let me know. I personally think the body needs its own tags for different views.
I believe you would have to add the Body behavior to each view to get it to work, instead of just one Body behavior at the root. We are also working on some improvements to this that I think you will like — please stay tuned in the coming months.