
Computing lip sync inaccurate?

New Here, Oct 26, 2017

I've started using Character Animator almost every few days now. When the lip sync works well, it really nails it, but there are times when what appears really looks different from what should be coming out of my character's mouth. Is there any way to help the program learn how you personally speak? Or are there updates coming where it will become adaptive and learn your voice?

I'm using CA for YouTube videos, and it would cut out a lot of time if I didn't have to edit the mouth quite so much.

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Adobe Employee, Oct 26, 2017
We are still working on improving the accuracy of lip sync. The biggest factor for accuracy is the quality of the audio signal — not too quiet, and not over-driven. You can adjust microphone input levels in your operating system settings. Ideally the green levels meter in Ch should show a lot of green, and never red.

Reducing background noise by turning off fans / air-conditioning can also help.
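As an illustration of the advice above (this is a hypothetical helper, not an Adobe tool or part of the original reply), a recording can be checked for the two failure modes mentioned, too quiet and over-driven, by measuring its peak level with Python's standard library:

```python
import array
import wave

def check_levels(wav_file):
    """Classify a mono/stereo 16-bit PCM WAV file by its peak level.

    Returns "clipping" if the signal is near full scale (over-driven),
    "too quiet" if the peak is well below full scale, otherwise "OK".
    The 0.99 and 0.1 thresholds are illustrative, not Adobe's values.
    """
    with wave.open(wav_file, "rb") as wav:
        if wav.getsampwidth() != 2:
            raise ValueError("expected 16-bit PCM samples")
        frames = wav.readframes(wav.getnframes())
    samples = array.array("h", frames)  # signed 16-bit integers
    peak = max(abs(s) for s in samples) / 32768  # fraction of full scale
    if peak >= 0.99:
        return "clipping"
    if peak < 0.1:
        return "too quiet"
    return "OK"
```

A file that returns "clipping" corresponds to the red meter the reply warns about; "too quiet" means the input level should be raised in the operating system's microphone settings before recording.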
