
How to control a puppet using pre-recorded video?

Contributor, Sep 20, 2019

Is there really still no way to pre-record a video of someone talking and then feed that into Character Animator to do its thing, just as it would with a live camera feed? This seems an absurd omission, especially since I've seen people asking for this pretty much since Character Animator was in beta. Honestly, why would the application care whether the source video is live or a pre-recorded clip?

I need to animate a character based on someone I have limited access to. I need to just go to where they are, record a nice, well-lit, close-up video of them with a decent mic, and then get back to my office and let Character Animator work on it. It seems nuts that this workflow apparently hasn't been considered by the devs.

 

Title edited by: Mod

TOPICS
Feature requests

Views: 4.1K

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting. Learn more
Adobe Employee, Sep 20, 2019

Hi there,

 

I understand your point. You can drive the puppet using behaviors such as Lip Sync with a pre-recorded voice-over.

Check out this article: https://helpx.adobe.com/in/adobe-character-animator/using/behavior.html#Lip_Sync

I'm sure it will help.

 

Let us know if you have any additional questions.

 

Thanks,

Shivangi

 

Contributor, Sep 20, 2019

That just looks like a kludge for the audio. It's not even clear from that document whether CH actually attempts to analyse the spoken words or simply reacts to the audio amplitude. Also, it doesn't deal with video *at all*. How does this help? Again, why is this so difficult?

LEGEND, Sep 20, 2019

Unfortunately there is no support for pre-recorded video in CH. Yes, it would be really useful to be able to move the playhead and see the corresponding point in the video file; it would help sync the puppet's movements to the video (image and audio). There was a feature request for this that you can vote for, but the forums were reorganized recently and I'm not sure whether the voting system changed as well. As Shivangi said, you can import an *audio* file (just import it), but not video.

 

However, you can export a video from CH with transparency, then use Premiere Pro or After Effects to superimpose it over the original video. You can also use Dynamic Link to include the CH scene directly in Premiere Pro or After Effects, so updates in CH appear almost immediately in PP/AE. My little computer is a bit sluggish doing this, but it might be the closest to what you are after: you should be able to move the playhead in the video editing software and it will show that point of the CH scene superimposed over the video.
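If you'd rather script that compositing step than open Premiere Pro or After Effects, ffmpeg's overlay filter can do the same superimposition from the command line. Below is a minimal sketch that just builds the command; the filenames (footage.mp4, puppet.mov) and the assumption that the CH export carries an alpha channel (e.g. ProRes 4444) are mine, not something confirmed in this thread:

```python
# Sketch: build an ffmpeg command that composites a Character Animator
# export (with alpha channel) over the original footage.
# Filenames below are hypothetical placeholders.
import subprocess

def overlay_command(background, puppet, output):
    """Return the ffmpeg argv that overlays `puppet` on `background`."""
    return [
        "ffmpeg",
        "-i", background,   # original video of the speaker
        "-i", puppet,       # CH export with transparency
        "-filter_complex", "[0:v][1:v]overlay=0:0",  # alpha-aware overlay
        "-c:a", "copy",     # keep the original audio untouched
        output,
    ]

cmd = overlay_command("footage.mp4", "puppet.mov", "composited.mp4")
print(" ".join(cmd))
# To actually run it (requires ffmpeg on your PATH):
# subprocess.run(cmd, check=True)
```

The `overlay` filter respects the alpha channel of the second input, so the puppet appears on top of the footage wherever it is opaque.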

LEGEND, Sep 20, 2019

Oh, there is one other option if you want to record live puppet movements against the video. CH has NDI streaming support, so you can export a video stream from CH and then import that live stream into something like OBS Studio. OBS Studio supports live streaming, but it can also record to a file on disk. So you play the video (one OBS input), the CH output is a second input, and OBS merges them and writes the result to a file. It's a bit more to set up, but it is another option if you want to do the puppet side of things live.

Contributor, Sep 20, 2019

Thanks, but simply having video in CH, or any of the export options you mention, doesn't help. I just want CH to take a video of someone talking, track their face, sync their voice, and animate the puppet character just as it would with a live video stream. I don't understand why this is apparently so difficult.

New Here, Jan 28, 2022

I completely agree.

Did this ever get resolved?

LEGEND, Sep 20, 2019

Oh, and re-reading this, I think I misunderstood what you are trying to do. Are you trying to use a pre-recorded video instead of a live web camera, so that face tracking and audio (lip sync) come from there? My other responses were off target then. Oops!

 

I do recall someone using some kind of virtual camera; ManyCam, I think it was called. I have not tried it myself, though. I'm just wondering if it can play a video and present it as a camera source for CH's camera input. The NDI suite of tools might have something as well. These are just suggestions, and yes, they would involve some messing around. I also notice there is "OBS Virtualcam", which seems to be a plugin for OBS that makes the OBS output available as a camera source that I assume CH can then watch.

 

Lip sync is driven by spectral analysis of the audio (not volume), except for "surprised" and "smile", which use the video stream. I don't know the algorithm, but there are too many visemes for it to be based on audio levels alone. There is also the nutcracker jaw, which uses video instead to work out how far down to move the jaw (but you don't get the same mouth shapes).
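To illustrate why spectral analysis matters here (this is a toy sketch, not Adobe's actual algorithm): two sounds can be equally loud yet have completely different frequency content, which is what a spectral detector can distinguish and a pure amplitude detector cannot. The 300 Hz and 2400 Hz "formants" below are made-up stand-ins for different vowel sounds:

```python
# Toy demo: amplitude (RMS) vs. spectral band energy for two equally
# loud tones. Frequencies are hypothetical stand-ins for vowel formants.
import math

SAMPLE_RATE = 8000  # Hz, arbitrary for the demo
N = 512             # samples per analysis window

def tone(freq):
    """N samples of a unit-amplitude sine wave at `freq` Hz."""
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(N)]

def rms(samples):
    """Root-mean-square level: what an amplitude-only detector sees."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def band_energy(samples, lo_hz, hi_hz):
    """Naive DFT energy between lo_hz and hi_hz: what a spectral detector sees."""
    total = 0.0
    for k in range(N // 2):
        f = k * SAMPLE_RATE / N
        if lo_hz <= f < hi_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / N) for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / N) for i, s in enumerate(samples))
            total += (re * re + im * im) / N
    return total

low = tone(300)    # stand-in for an open "ah"-like sound
high = tone(2400)  # stand-in for a closed "ee"-like sound

# Nearly identical loudness: amplitude alone can't tell them apart...
print(round(rms(low), 2), round(rms(high), 2))

# ...but the energy below 1 kHz differs by orders of magnitude.
print(band_energy(low, 0, 1000) > band_energy(high, 0, 1000))
```

A real viseme classifier would look at the pattern of energy across many bands over time, but the principle is the same: the spectrum carries the information that raw loudness throws away.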

Contributor, Sep 30, 2019

Sorry, I didn't see this response at the time. Unfortunately OBS Virtualcam is Windows-only, and nothing else I've been able to find works on Mojave, but I appreciate you taking the time to add your suggestions!

Community Expert, Nov 19, 2020

I added a suggestion in Adobe User Voice (Wish list) for future updates.

https://adobe-video.uservoice.com/

 

 

New Here, Mar 14, 2021

Please watch this video and jump to the 2:25 mark: https://www.youtube.com/watch?v=FgUToTwnfYM&t=229s

I don't know for sure whether this will work, but it should help.

New Here, Mar 14, 2021

Oh! I'm sorry, I didn't see it. This was already explained by alank99101739.
