
Mouth not moving when I move my mouth

New Here, Sep 04, 2020

I have edited a basic template character to look like a queen ant. Everything was working perfectly until the very last step. Now the mouth won't move when I talk or move my mouth on camera. I clicked on the triangle icon for rigging issues and it says "handle tagged as mouth not being used by any behaviors", and for the other layers of the mouth it says "layer tagged as Aa is not being used by any behaviors". Every other issue also says "handle tagged as mouth is not being used by any behaviors". PLEASE HELP!

TOPICS: Rigging
Views: 7.2K


1 Correct answer: LEGEND, Oct 15, 2020 (full reply below)
Community Expert, Sep 10, 2020

It sounds like your puppet doesn't have the Lip Sync behavior applied. At the very top of the Puppet panel in your Rigging workspace, click on the behavior icon next to the puppet's name (in the screenshot it's the brick with a gear icon and the number "9") and add Lip Sync. Once you do, the Rigging Issues panel should clear out and you should be able to make your puppet talk.

[Screenshot: davidarbor_0-1599741573244.png]

 

Participant, Oct 15, 2020

I have exactly the same issue on the latest version (3.3.1) when I use a face I made in Characterizer. The whole top of the head moves fine (eyebrows, forehead, eyes), but the mouth refuses to open. The red lip sync dot is enabled, and in the rig panel there are crowns on the name, character, and mouth boxes, as well as the eyes, left and right quarters, and the background. I tried davidarbor's solution in this thread, but all that did was add a Lip Sync 2 to the Properties panel (there was already a Lip Sync enabled there). Also, when I use any of the pre-made puppets that come with CA, they all work fine, lip sync and all. This sort of stuff wastes hours and really puts me off. Can anyone think what might be wrong? Thanks.

LEGEND, Oct 15, 2020

I have only used Characterizer once, but I am happy to help try and debug if useful. E.g. if you go into the Lip Sync behavior, it should show names like Mouth and then lots of viseme names like Ah, Uh, etc. Next to each of these should be a number; if it is zero, it means the behavior did not find the tagged layers. Behaviors look for tags on layers to control the puppet. A layer named "Ah" will get auto-tagged with Ah to save effort, but it's the tags that matter, not the layer name (you can manually add and remove tags on layers). I am wondering if Characterizer is getting it wrong. Could you show a screenshot of the Lip Sync behavior expanded?

LEGEND, Oct 15, 2020

Here is an example screenshot of what I am after. If you look at "Mouth" you can see (in pale gray) that it found the mouth layer in the artwork. It also found the visemes inside the mouth layer group. This is step 1: making sure the Lip Sync behavior found the layers.

[Screenshot: alank99101739_0-1602796728926.png]

Note: there is also the Face behavior. If you smile with the camera on, does the puppet smile? The "Smile" and "Surprised" visemes are controlled by the Face behavior based on the webcam; the rest are based on audio processing. So if the Lip Sync behavior has found the layers, the next thing is to make sure the audio is being processed. E.g. can you record an audio file, play it back, and hear your voice?

Then, if that works, can you use "Compute from Lipsync" to generate visemes in the Timeline panel for a scene? But we can walk through each step one by one.

Participant, Oct 15, 2020

Thanks, it looks like the Lip Sync did not find the layers, as all my values are zero in that panel under Lip Sync. What do I do to help it find them?

 

[Screenshot 2020-10-15 at 22.27.21.png]

Participant, Oct 15, 2020

Also, you are right: none of the smile etc. expressions work in the lower face.

 

Participant, Oct 15, 2020

The audio works fine, and so do all the preloaded characters, which all lip sync. It is only when I use Characterizer that lip sync, smile, etc. do not work, even though everything in the top half of the face does work. Strange.

Participant, Oct 15, 2020

I have partly worked it out. I used a single still picture from a file in Characterizer rather than my own images captured on my video camera, because I wanted to characterize a friend rather than myself. However, if you do that, it does not go through the whole set of samples it captures when you characterize yourself on your video camera, where a talking guide takes you through all the mouth shapes etc. that you need for lip sync. If you just characterize from a still image on file (an option in the drop-down menu), it cannot capture the mouth shapes and their sounds. Characterizer does work for lip sync when you use the camera and audio inputs from your own video feed.

 

So the question is: is there a way to lip sync when characterizing a still image of a friend, rather than yourself on the video?

LEGEND, Oct 15, 2020

Ummm (which is a hint I have never done this before lol!), if I was doing it, I would say "thank you Characterizer, I will take it from here" and take over adding the extra functionality to the puppet. Characterizer has all the stylization etc. If you only have a single image, I would ...

  • take the artwork (I assume it's Photoshop), select the region around the mouth, and copy it into a new layer.
  • I would then wipe out the mouth on the face (e.g. smudge the surrounding area, or fill it with skin color or something; it's not too important). This does not have to be very accurate, as you are going to put the mouth shapes for different sounds in front of it. The problem is that "Oh" with a narrow mouth is going to be smaller, so you don't want the old mouth poking out from behind.
  • I would then, inside the Head, create a Mouth group (is there one already?), put the copied mouth into that group, and call it Neutral.
  • I would then clone the Neutral layer, call it "Ah", and use Photoshop to morph it to look like the mouth open, using warps etc. (I am not very good with Photoshop, but there are various transforms / warps / distortions you can do; I search YouTube for tutorials on this).
  • Repeat for all the visemes, doing the best you can. It is fine to duplicate some of the shapes for multiple sounds, at least to get going. (A small scripting sketch for the repetitive duplicate-and-rename step follows this list.)
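
Not something from this thread, just an optional shortcut if hand-duplicating all those layers gets tedious: a rough Photoshop ExtendScript sketch that copies the Neutral layer once per viseme, so only the warping has to be done by hand. The group names ("Head", "Mouth", "Neutral") and the viseme list are assumptions; rename them to match your own PSD and the tags the Lip Sync behavior actually looks for.

```javascript
// Hypothetical Photoshop ExtendScript sketch (run via File > Scripts > Browse... in Photoshop).
// Assumes the PSD has a top-level "Head" group containing the "Mouth" group and the "Neutral"
// layer created in the steps above; adjust the path if your groups are nested differently.
var doc = app.activeDocument;
var head = doc.layerSets.getByName("Head");               // top-level Head group (assumption)
var mouthGroup = head.layerSets.getByName("Mouth");       // the Mouth group created above
var neutral = mouthGroup.artLayers.getByName("Neutral");  // the copied neutral mouth

// Typical Character Animator viseme names; check the Lip Sync behavior for the exact set you need.
var visemes = ["Aa", "D", "Ee", "F", "L", "M", "Oh", "R", "S", "Uh", "W-Oo", "Smile", "Surprised"];

for (var i = 0; i < visemes.length; i++) {
    // duplicate() with ElementPlacement.INSIDE puts each copy inside the Mouth group
    var copy = neutral.duplicate(mouthGroup, ElementPlacement.INSIDE);
    copy.name = visemes[i];
}
```

Each copy still needs the manual warp/distort pass described above; the script only removes the duplicate-and-rename busywork.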

 

If you want to just get going for quick fun (I would!!!), import one of the default CH puppets that uses a PSD file, copy the mouth out of it, and just use it. You will need to resize and reposition it. (Make sure you un-hide all the mouth child layers before moving and resizing, to save yourself a lot of pain.) You can then try to get it to work that way as a crude test, so you know Lip Sync etc. works, then follow the instructions above with more confidence that your artistic efforts will not be in vain.
