
Feature Focus: Speech-Aware Animation

Adobe Employee, Aug 14, 2020

Using the power of Adobe Sensei, you can create automatic animation from speech. Character Animator will generate head and eyebrow movements corresponding to a recorded voice.

 

Note: On macOS, 10.15 or later is required.

 

[GIF: CH SpeechAware1.gif]

 

Try out Speech-Aware Animation by importing an audio file and adding it to a scene with your puppet, then choosing Timeline > Compute Head Movement from Speech. A Camera Input take for the puppet's Face and/or Head Turner behavior (whichever ones are currently being used on the puppet) will be created with the automatically generated head movement.

 

Exaggerate or minimize the computed head movement by adjusting the following parameters in the Properties panel:

 

Face behavior:

  • Head Position Strength
  • Head Scale Strength
  • Head Tilt Strength
  • Eyebrow Strength
  • Parallax Strength

 

Head Turner behavior:

  • Sensitivity

 

The computed head movement can still be blended with manual Face and Head Turner recordings for more refinement.
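
If it helps to picture what those strength values and the blending do, here is a small, purely conceptual Python sketch. It is not Character Animator's implementation and every name in it is hypothetical; it simply shows each strength multiplier scaling its own channel of the computed movement, with the result then mixed with a manual recording (shown here as a simple linear blend for illustration).

# Conceptual sketch only -- hypothetical names, not Character Animator code.
from dataclasses import dataclass

@dataclass
class HeadFrame:
    position: float  # head offset (simplified to one axis, in pixels)
    scale: float     # scale change relative to the rest pose
    tilt: float      # tilt angle in degrees
    eyebrow: float   # eyebrow raise amount

def apply_strengths(frame: HeadFrame,
                    position_strength: float = 1.0,
                    scale_strength: float = 1.0,
                    tilt_strength: float = 1.0,
                    eyebrow_strength: float = 1.0) -> HeadFrame:
    """Exaggerate (>1.0) or minimize (<1.0) each channel of the computed movement."""
    return HeadFrame(
        position=frame.position * position_strength,
        scale=frame.scale * scale_strength,
        tilt=frame.tilt * tilt_strength,
        eyebrow=frame.eyebrow * eyebrow_strength,
    )

def blend(computed: HeadFrame, manual: HeadFrame, manual_weight: float) -> HeadFrame:
    """Mix the speech-computed frame with a manually recorded one (linear blend)."""
    w = max(0.0, min(1.0, manual_weight))
    return HeadFrame(
        position=(1 - w) * computed.position + w * manual.position,
        scale=(1 - w) * computed.scale + w * manual.scale,
        tilt=(1 - w) * computed.tilt + w * manual.tilt,
        eyebrow=(1 - w) * computed.eyebrow + w * manual.eyebrow,
    )

# Example: tone the tilt down, exaggerate the eyebrows, then mix in 30% of a manual take.
computed = HeadFrame(position=4.0, scale=0.02, tilt=6.0, eyebrow=0.5)
manual = HeadFrame(position=0.0, scale=0.0, tilt=-2.0, eyebrow=0.8)
adjusted = apply_strengths(computed, tilt_strength=0.5, eyebrow_strength=1.5)
print(blend(adjusted, manual, manual_weight=0.3))

In the app itself you adjust these values with the Properties panel sliders listed above rather than in code; the sketch is only meant to show why a strength of 0 removes a channel entirely and a value above 1 exaggerates it.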

 

Note: The Compute Head Movement from Speech command does not change mouth shapes, so you'll also want to use the Timeline > Compute Lip Sync Take from Scene Audio menu command to generate lip sync for your puppet.

 

[GIF: CH SpeechAware2.gif]

 

 

(Use this Beta forum thread to discuss this feature and share your feedback with the Character Animator team and other Beta users. If you encounter a bug while using this feature, let us know by posting a reply here or choosing "Report a bug" from the "Provide feedback" icon in the top-right corner of the app.)

Contributor, Aug 16, 2020

Oh my!!! I really need thisss!!!!! 😄 😄 😄

Contributor, Aug 16, 2020

Is it possible to have the auto head movement from this new feature move like South Park head movement? I really need it :(

So far, I am doing it manually with keyframes. It's a lot of work for a single animator like me :(

Explorer, Aug 19, 2020

I'm not sure which elements of the South Park style you're going for, but here are a few thoughts:

 

  • Lower the framerate of the scene. That might help get that staccato feel. The only tricky part is balancing that with the lip sync. Too low a frame rate and you'll lose a lot of visemes.
  • Perhaps try putting a Nutcracker Jaw behavior on the whole head (you'd have to tag the head as "jaw"). A very high audio sensitivity and very low max position might do the trick (in other words, it doesn't take a lot of volume to get some movement, but it'll never move very much). It'll probably work better with vertical position movement than rotation. See the sketch after this list.
  • Mix and match. If you turn the position parameter down to 0 on the Face behavior, for example, the auto head movement will only rotate and scale the head, and you can do the up and down with the Nutcracker Jaw.
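
To make that Nutcracker Jaw idea concrete, here is a minimal Python sketch of the underlying logic. It is purely illustrative and not Character Animator code; the amplitude, sensitivity, and max-position numbers are hypothetical. The point is that a high sensitivity lets quiet audio move the head right away, while a low max position clamps how far it can ever travel.

# Hypothetical sketch of the Nutcracker Jaw suggestion above, not Character
# Animator's implementation: high sensitivity -> quiet audio already moves the
# head; low max position -> the head never travels very far.

def jaw_offset(amplitude: float,
               sensitivity: float = 20.0,   # very high: small volume produces movement
               max_position: float = 8.0) -> float:  # very low: cap on travel (pixels)
    """Map an audio amplitude in the 0..1 range to a vertical head offset."""
    return min(amplitude * sensitivity, max_position)

# A quiet syllable (0.1) already gets 2 px of bounce; anything from about
# 0.4 upward hits the 8 px ceiling, so medium and loud lines pop the same amount.
for amp in (0.05, 0.1, 0.4, 0.9):
    print(f"amplitude {amp:.2f} -> offset {jaw_offset(amp):.1f} px")

Pairing that with the Face behavior's position strength at 0 (the third bullet) would, in this simplified picture, leave rotation and scale to the speech-aware take and the vertical bounce to the jaw.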

Contributor, Aug 20, 2020

Hey there! Thank you so much for replying.

 

I haven't tried it exactly as you suggested, but I have tried something similar before. I played with the Face behavior and used the webcam to achieve that South Park-style head movement, but it wasn't good.

I uploaded a short clip of my animation to YouTube so you can see exactly what I mean.

 

https://youtu.be/VpjvdHUAhHs


Community Beginner, Aug 22, 2020

First day checking out the Beta and so far it looks GREAT! I look forward to helping in the development. I am a dinosaur in computers (wrote my 1st program in 1978). I have both Windows and Mac platforms and a little SQA background. Do you anticipate adding .puppet files to Adobe Cloud Libraries as ASSETS in the future? I apologize for not having RTFM'd about the features yet. Congrats on the amazing product!

Adobe Employee, Aug 25, 2020

Not sure about any current plans. I know when you upload a .puppet to Google Drive, it usually sees it as a .zip file and will show its contents (chproj file and additional folders). I think we all hope that .puppet files will be made more standard in the future, and would love to see easier ways for people to browse and access puppets.

Explorer, Aug 28, 2020

That's awesome! I just tried it and Pose-to-Pose doesn't seem to work. Maybe I'm doing something wrong.

Adobe Employee, Aug 28, 2020

Pose-to-pose is unfortunately not in this version currently. Is that a dealbreaker for you? We'd love to hear your thoughts on why or why not.

Community Beginner, Sep 11, 2020

Can this be implemented to stream from audio and not the camera?

Adobe Employee, Sep 11, 2020

Currently this works only as a post-processing feature, so it needs an existing audio file in your timeline; it doesn't run live. But if you have a use case where you would want it live and without the camera, we'd love to hear about it.

Community Beginner, Sep 11, 2020

My thinking is that it'd be less for the host machine to do live. I'm assuming that animating from the live audio would use less CPU than streaming while also accessing the webcam and deriving the motion from tracking.

 

I use this for live webinars, and Character Animator is a big footprint, utilizing 40-50+% CPU on my 24-core machine, and that's at SD frame size and 15 fps while it also has to run OBS, a virtual mixer, VST inserts, Zoom, PPT, and the rest.

I've got a 2020 Nvidia card; it seems Character Animator wants half of it and half (at least) of the CPU.

 

A short example: for the live stuff he's on all the time and talks to the participants and host.

https://youtu.be/yG5U6NlUVQE?t=54

 

Also, live with the webcam for head tracking, the puppet seems really confused and can jump wildly due to my glasses and the frequent times it loses my face. I could just give him head drift, but it's way more personal with the tracked motion.

 

Yes, if it can be done: lighten the load by using audio for head movement live. No camera or visual tracking = less CPU, which allows for higher-frame-rate live output without impacting the other apps running.

I found this program and put it directly to use on my web stuff (with one of your free puppets). It's a really simple and easy way to elevate a production. I understand it's way more intricate than how I'm using it, but it's quick and simple to turn a VO into a messenger.

 

regards

dave2.0


Adobe Employee, Sep 11, 2020

That's great info and a solid use case, thank you so much!
