
Feature Focus: Speech-Aware Animation

Adobe Employee, Aug 14, 2020


Using the power of Adobe Sensei, you can create automatic animation from speech. Character Animator will generate head and eyebrow movements corresponding to a recorded voice.

 

Note: On macOS, version 10.15 or later is required.

 

[Animated GIF: CH SpeechAware1.gif]

 

Try out Speech-Aware Animation by importing an audio file and adding it to a scene with your puppet, then choosing Timeline > Compute Head Movement from Speech. A Camera Input take containing the automatically generated head movement will be created for the puppet's Face and/or Head Turner behavior (whichever ones are currently used on the puppet).
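If you're curious what driving head movement from speech looks like in principle, here is a toy Python sketch that turns nothing but the loudness of a WAV file into per-frame head keyframes. It is purely illustrative: the frame rate, the motion caps, and the amplitude heuristic are all made-up assumptions, and this is not how Adobe Sensei generates its movement. In Character Animator the menu command does the analysis for you and drops the result into a Camera Input take.

import math
import wave
from array import array

SCENE_FPS = 24        # hypothetical scene frame rate
MAX_BOB_PX = 12.0     # hypothetical cap on the vertical head offset (pixels)
MAX_TILT_DEG = 4.0    # hypothetical cap on the head tilt (degrees)

def head_keyframes_from_wav(path):
    """Return one (y_offset, tilt) keyframe per scene frame, driven by loudness.

    Assumes a 16-bit mono WAV file, purely for illustration.
    """
    keyframes = []
    with wave.open(path, "rb") as wav:
        samples_per_scene_frame = wav.getframerate() // SCENE_FPS
        while True:
            chunk = wav.readframes(samples_per_scene_frame)
            if not chunk:
                break
            samples = array("h", chunk)              # 16-bit signed samples
            rms = math.sqrt(sum(s * s for s in samples) / len(samples))
            level = min(rms / 32768.0, 1.0)          # normalize loudness to 0..1
            keyframes.append((level * MAX_BOB_PX, level * MAX_TILT_DEG))
    return keyframes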

 

Exaggerate or minimize the computed head movement by adjusting the following parameters in the Properties panel (an illustrative sketch follows these lists):

 

Face behavior:

  • Head Position Strength
  • Head Scale Strength
  • Head Tilt Strength
  • Eyebrow Strength
  • Parallax Strength

 

Head Turner behavior:

  • Sensitivity
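One way to picture these Strength parameters (purely an illustration; the exact math is Character Animator's own) is as per-channel multipliers on the computed movement, where values above 1.0 exaggerate and values below 1.0 minimize:

# Hypothetical channel names and strength values, standing in for the
# Face behavior's Strength parameters; the real math is Character Animator's.
STRENGTHS = {
    "head_position": 1.5,   # > 1.0 exaggerates, < 1.0 minimizes, 0 removes
    "head_tilt": 0.5,
    "eyebrows": 1.0,
}

def apply_strengths(channels, strengths=STRENGTHS):
    """Scale each computed motion channel by its strength multiplier."""
    return {
        name: [strengths.get(name, 1.0) * value for value in values]
        for name, values in channels.items()
    }

# Example: apply_strengths({"head_tilt": [0.0, 2.0, 4.0]})
# returns {"head_tilt": [0.0, 1.0, 2.0]}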

 

The computed head movement can still be blended with manual Face and Head Turner recordings for more refinement.

 

Note: The Compute Head Movement from Speech command does not change mouth shapes, so you'll also want to use the Timeline > Compute Lip Sync Take from Scene Audio menu command to generate lip sync for your puppet.
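For a sense of what lip sync generation produces, the sketch below maps a timed list of speech sounds to mouth shapes (visemes). The phoneme names, viseme names, and timings here are hypothetical and only illustrate the mapping idea; the menu command handles the actual audio analysis and Character Animator's own mouth set.

# Hypothetical phoneme-to-viseme table; not Character Animator's actual
# mouth set or analysis, only an illustration of the mapping idea.
PHONEME_TO_VISEME = {
    "AA": "Aa", "AE": "Aa",
    "IY": "Ee", "IH": "Ee",
    "OW": "Oh", "UW": "Oh",
    "M": "M", "B": "M", "P": "M",
    "F": "F", "V": "F",
}

def viseme_track(timed_phonemes, default="Neutral"):
    """Map [(start_seconds, phoneme), ...] to [(start_seconds, viseme), ...]."""
    return [(t, PHONEME_TO_VISEME.get(p, default)) for t, p in timed_phonemes]

# Example: viseme_track([(0.00, "HH"), (0.05, "AA"), (0.20, "M")])
# returns [(0.0, "Neutral"), (0.05, "Aa"), (0.2, "M")]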

 

[Animated GIF: CH SpeechAware2.gif]

 

 

(Use this Beta forum thread to discuss this feature and share your feedback with the Character Animator team and other Beta users. If you encounter a bug while using this feature, let us know by posting a reply here or choosing "Report a bug" from the "Provide feedback" icon in the top-right corner of the app.)

Engaged, Aug 16, 2020


Oh my!!! I really need thisss!!!!! 😄 😄 😄

Engaged, Aug 16, 2020


Is it possible to have the auto head movement from this new feature move like South Park-style head movement? I really need it :(

So far, I am doing it manually with keyframes. It's a lot of work for a single animator like me :(

Explorer, Aug 19, 2020


I'm not sure which elements of the South Park style you're going for, but here are a few thoughts:

 

  • Lower the framerate of the scene. That might help get that staccato feel. The only tricky part is balancing that with the lip sync: too low a frame rate and you'll lose a lot of visemes.
  • Perhaps try putting a Nutcracker Jaw behavior on the whole head (you'd have to tag the head as "jaw"). A very high audio sensitivity and a very low max position might do the trick (in other words, it doesn't take much volume to get some movement, but it'll never move very much). It'll probably work better with vertical position movement than rotation. See the sketch after this list.
  • Mix and match. If you turn the position parameter down to 0 on the Face behavior, for example, the auto head movement will only rotate and scale the head, and you can do the up-and-down with the Nutcracker Jaw.
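Here's the sketch mentioned above: a toy Python version of the "high audio sensitivity, low max position" idea plus the low-framerate snap. All of the numbers are made up, and none of this is Character Animator's internal behavior math; it just shows why the combination gives a small, punchy pop.

# Made-up numbers, purely to illustrate the "high sensitivity, low max
# position" suggestion above; this is not Character Animator's behavior math.
SENSITIVITY = 8.0       # quiet audio still pushes toward the full effect
MAX_POSITION_PX = 6.0   # hard cap so the head never moves very far
SCENE_FPS = 12          # low frame rate for the staccato feel

def vertical_offset(audio_level):
    """Map a 0..1 audio level to a small vertical head offset in pixels."""
    return min(audio_level * SENSITIVITY, 1.0) * MAX_POSITION_PX

def snap_to_frame(t_seconds, fps=SCENE_FPS):
    """Quantize a keyframe time to the scene's low frame rate."""
    return round(t_seconds * fps) / fps

# Even a quiet level of 0.2 already hits the 6 px cap: vertical_offset(0.2) == 6.0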

Engaged, Aug 20, 2020


Hey there! Thank you so much for replying.

 

I haven't tried exactly what you suggested, but I have tried something similar before. I played with the Face behavior and used the webcam to achieve that South Park-style head movement, but it wasn't good.

I uploaded a short clip of my animation to YouTube so you can see exactly what I mean.

 

https://youtu.be/VpjvdHUAhHs

[YouTube video description: a manually keyframed movement at 12 fps, animated entirely in Adobe Character Animator; it took a very long time to animate just a few seconds.]

Community Beginner, Aug 22, 2020


First day checking out the Beta and so far it looks GREAT! I look forward to helping in the development. I am a dinosaur in computers (wrote my first program in 1978), I have both Windows and Mac platforms, and I have a little SQA background. Do you anticipate adding .puppet files to Adobe Cloud Libraries as assets in the future? I apologize for not RTFM about the features yet. Congrats on the amazing product!

Adobe Employee, Aug 25, 2020


Not sure about any current plans. I know when you upload a .puppet to Google Drive, it usually sees it as a .zip file and will show its contents (chproj file and additional folders). I think we all hope that .puppet files will be made more standard in the future, and would love to see easier ways for people to browse and access puppets.

Explorer, Aug 28, 2020


That's awesome! I just tried it and the Pose-to-Pose doesn't seem to work. Maybe I'm doing something wrong.

Adobe Employee, Aug 28, 2020


Pose to pose is unfortunately not currently in this version. Is that a dealbreaker for you? We'd love to hear your thoughts on why or why not.

Explorer, Apr 16, 2021


Yes, for me it is a dealbreaker, since I usually do my animations pose to pose. (I feel like sometimes I am not notified when someone replies to a thread I am in 🤔)

Community Beginner, Sep 11, 2020


Can this be implemented to stream from audio and not the camera?

Adobe Employee, Sep 11, 2020


Currently this is a post-processing feature, so it only works with an existing audio file in your timeline, not live. But if you have a use case where you would want it live without the camera, we'd love to hear about it.

Community Beginner, Sep 11, 2020


My thinking is that it'd be less for the host machine to do live. I'm assuming that animating from live audio would use less CPU than streaming while also having to access the webcam and derive the motion from tracking.

I use this for live webinars, and Character Animator has a big footprint, using 40-50%+ CPU on my 24-core machine, and that's at SD frame size and 15 fps while it has to run alongside OBS, a virtual mixer, VST inserts, Zoom, PowerPoint, and the rest.

I've got a 2020 NVIDIA card; Character Animator seems to want half of it and at least half of the CPU.

A short example: for the live stuff he's on all the time and talks to the participants and host.

https://youtu.be/yG5U6NlUVQE?t=54

Also, live with the webcam for head tracking, the puppet seems really confused and can jump wildly because of my glasses and the frequent times it loses the face. I could just give him head drift, but it's way more personal with the tracked motion.

So yes, if it can be done: lighten the load by using audio for head movement live. No camera or visual tracking means less CPU, which allows for higher frame rate live output without impacting the other apps running.

I found this program and put it directly to use on my web stuff (with one of your free puppets). It's a really simple and easy way to elevate a production. I understand it's way more intricate than how I'm using it, but it's quick and simple to turn a voiceover into a messenger.

Regards,

dave2.0

Adobe Employee, Sep 11, 2020


That's great info and a solid use case, thank you so much!

Adobe Employee, Mar 16, 2021


The Ch team would like to get some feedback on the Compute Head Movement from Speech feature. Have you used it? If not, why not? What do you think of it? Please take the survey; you will not be asked for any personal data. The survey takes approximately 2 minutes, and at the end there is a chance to tell the Ch team anything you want. Thanks! https://www.surveymonkey.com/r/ChComputeHead

New Here, Apr 15, 2021


Hey, I'm new to Character Animator, and I use the lip sync and the auto head movement. I have a podcast, so I have two characters in my animation that I auto lip sync and then auto head move. The whole thing is over an hour long, and when I apply the auto head movement to each character it takes nearly 2 hours to apply (4 in total), but when I play each character back it only seems to work for about 11 minutes and then stops. Does this sound right, or am I missing something? If it takes that long just to apply it for 11 minutes of footage, it seems extreme. Any advice would be appreciated.

Thanks, Carl

Adobe Employee, Apr 15, 2021


Hmm, is the facial/puppet movement stopping, or just the audio, or both?

New Here, Apr 16, 2021


Hey, it's just the facial and head movements that stop working. The auto head movement does its job for 11 minutes and then stops, and it took nearly two hours just to apply this to one puppet. Maybe no one else is doing it on a scene this long, but as I said, it's for a podcast show which is roughly just over an hour long.

Also, a side note, if you or anyone can confirm: is the total duration of a scene limited to 60 minutes?

Adobe Employee, Apr 19, 2021


I tried a ~20-minute clip and it was fine for me. Yes, it currently does take a while to process; we are investigating ways to improve the performance.
I'm trying a sequence of clips to get it over an hour to see if I can repro. My tests were on Windows 10. What OS version are you on?

 

Also, just curious... if you use the Compute Lip Sync Take from Scene Audio command, is it also failing at around 11 minutes?

 

If you split the puppet track into separate sections (e.g., 15-minute sections) and process each section separately, does it go beyond 11 minutes?

 

Yes, our current maximum scene duration is 60 minutes.

New Here, Jul 01, 2021


I have the same issue with clips over 10 minutes. Head and facial movements stop around 8 minutes in, and processing takes longer than it did a month or so ago. I love the feature very much and I'd really like to see this fixed. Thanks, y'all!

Adobe Employee, Jul 01, 2021


Hmm, just curious if you're getting the same issue with the latest Beta build of Character Animator, which you can access from the "Beta apps" section of the Creative Cloud desktop app.

If you are still getting it with the latest Beta build, can you provide more information about your computer setup, such as OS version number and RAM? Also, if it's an audio file that you are willing to share with us to diagnose, feel free to send a direct message with a shared link via Google Drive, Dropbox, etc. Thanks.

Adobe Employee, Jun 09, 2021


Hi all... If you've previously tried Speech-Aware Animation (Compute Head Movement from Speech), we've made some notable speed improvements in the latest Character Animator (Beta) builds for Version 4.4 that we'd like you to try. Let us know your thoughts on the speed improvements.

 

Note: On some systems, the progress dialog box might stay at or near 100% complete and seemingly not close for a while. We're aware of this issue and will address it in a future Version 4.4 Beta build.

 

Thanks.

Community Beginner, Jul 10, 2021


This is happening right now: whenever I use "Compute Head Movement from Speech" for my puppets, it ALWAYS sets their eyebrows to the "angry" position. Why?

Adobe Employee, Jul 12, 2021


Are the eyebrows independent layers (i.e., do they have a crown icon on them so that they can move freely)? I'm wondering if they're being restricted from movement, causing them to warp. I wouldn't think it'd trigger artwork swaps (e.g., an angry artwork layer).

Community Beginner, Jul 12, 2021


Hi Jeff - yes, crowns are on the "eye" group and its layers too. The "angry" eyebrows appear only when the puppet is not talking. When there is speech, the effect seems normal, then it goes back to angry eyebrows when he's done talking. Does that help to see what's going on? Screen video attached... Thanks much - Bob
