ArinAnimates
Participant
October 27, 2017
Question

Computed lip sync inaccurate?

  • October 27, 2017
  • 1 reply
  • 332 views

I've started using Character Animator every few days now. When the lip sync works well, it really nails it, but there are times when it looks very different from what should be coming out of my character's mouth. Is there any way to help the program learn how you personally speak? Or are there updates coming where it will become adaptive and learn your voice?

I'm using CA for YouTube videos, and it would cut out a lot of time if I didn't have to edit the mouth quite so much.

This topic has been closed for replies.

1 reply

CoSA_DaveS
Adobe Employee
October 27, 2017

We are still working on improving the accuracy of lip sync. The biggest factor for accuracy is the quality of the audio signal — not too quiet, and not over-driven. You can adjust microphone input levels in your operating system settings. Ideally the green levels meter in Ch should show a lot of green, and never red.
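The "not too quiet, not over-driven" advice can be sanity-checked on any recording before feeding it to lip sync. Here is a minimal sketch (not part of Character Animator, and the thresholds are illustrative assumptions, not Adobe's) that classifies the peak level of 16-bit PCM samples:

```python
def classify_peak(samples, full_scale=32767):
    """Classify 16-bit PCM samples by peak amplitude.

    Returns 'clipping' when the signal hits (or nearly hits) full
    scale, 'too quiet' when the peak is far below it, and 'ok'
    otherwise. Thresholds are rough illustrative choices.
    """
    peak = max(abs(s) for s in samples)
    ratio = peak / full_scale
    if ratio >= 0.99:   # at or near full scale: the "red" zone
        return "clipping"
    if ratio < 0.1:     # well below full scale: likely too quiet
        return "too quiet"
    return "ok"

print(classify_peak([100, -200, 150]))        # very low peak
print(classify_peak([32767, -32768, 12000]))  # hits full scale
print(classify_peak([15000, -18000, 9000]))   # healthy range
```

If a recording comes back "clipping", lower the input gain in the OS settings and re-record; boosting a clipped file afterwards cannot recover the lost waveform.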

Reducing background noise by turning off fans / air-conditioning can also help.