What is the required set-up to livestream with more than 1 character?

New Here, Apr 13, 2020

Hello - Is there a native way to livestream more than 1 character via CH or is the solution to use a dedicated PC for each character then produce them together via a mixer such as vMix?

I haven't tried it yet but I've seen it done live here: https://youtu.be/7li_rEQ4TL4

 

TOPICS: How to, Tips and tricks
New Here, Apr 13, 2020

I really need multiple live characters as well for streaming. The ability to take multiple audio/video inputs from something like Skype and apply them to puppets would be invaluable.

 

Short of that, I need a way to swap active puppets with the keyboard or a midi device. That way I could at least get lip sync working by quickly swapping between characters (assuming they are not talking over each other).

 

Unless I'm missing something, neither of these is possible right now without something like vMix, which is not an option for me.
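As a sketch of the MIDI half of that idea: a small helper could map pads on a MIDI controller to the keyboard hotkeys already bound to each puppet's "activate" trigger. The note numbers and hotkeys below are made-up assumptions for illustration, not anything built into CH; in a real setup you would read events with a MIDI library (e.g. mido) and send the resulting keystroke to the app with a tool such as pynput.

```python
# Hypothetical mapping: one pad per puppet. These note numbers and hotkeys
# are assumptions for illustration only.
PAD_TO_HOTKEY = {
    36: "1",  # pad 1 -> hotkey that activates puppet 1
    37: "2",  # pad 2 -> hotkey that activates puppet 2
    38: "3",  # pad 3 -> hotkey that activates puppet 3
}

class PuppetSwitcher:
    """Turn incoming MIDI notes into puppet-swap keystrokes, ignoring
    repeats so a held pad does not re-trigger the same swap."""

    def __init__(self):
        self.active = None  # hotkey of the currently active puppet

    def on_note(self, note):
        key = PAD_TO_HOTKEY.get(note)
        if key is not None and key != self.active:
            self.active = key
            return key  # caller would send this keystroke to the app
        return None     # unmapped pad, or that puppet is already active
```

Under this sketch, pressing pad 36 twice yields one swap, not two, which keeps nervous pad-tapping from spamming the app with keystrokes.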

LEGEND, Apr 13, 2020

Are two people going to be talking and controlling the two characters? Are you going to have two microphones? Do you need two mice to control the two puppets? Etc. Two cameras for head positions? Explaining a bit more would be helpful.

 

Normally you have two computers because you need all of those controls, depending on what you are trying to achieve.

 

You can switch between the puppets, and you can play pre-recorded "replays". You can also do clever tricks like putting two characters in the same "puppet" and using triggers to show one mouth or the other, so one or the other talks. But without knowing the exact effect you are after, it's a bit hard to make suggestions.

LEGEND, Apr 13, 2020

Just to be clear, the most natural way today is to have two computers running the software, each generating their own NDI stream. You then use vMix, OBS Studio, etc to mix the streams into a single stream to send to YouTube/Twitch etc.

 

There was a beta around somewhere a while back for software to use (but not create) a puppet. But I could not find it recently and did not try it myself. That would avoid the need to buy multiple copies of the software to run and would make it easier for each streamer to run that "puppet player" code locally.

New Here, Apr 13, 2020

Thanks for the reply. My main goal is to have all the puppets allow for live interaction via their transforms, triggers, and physics. I want to be able to move the puppets around the scene as if it was a puppet show stage, as well as allow for triggered camera panning and zooming. My understanding is that the NDI-vMix solution isn't suited for this; it seems like it works for more static content like talk shows.

 

My setup would consist of one computer, one local user, and up to three other actors via a video webchat like Skype.

 

The local user would talk and use headtracking via local microphone and webcam.

 

This local user would also manage any and all triggers for all of the other characters in addition to their own. This means the local user would be able to trigger things like emotes (consisting of triggered replays) for the other characters — this is a workaround because I don't think there's a way to take live trigger input from remote users in CH. If there is a solution for this I'd love to know.

 

The other Skype actors would feed live headtracking and lip sync data to CH via their live Skype audio/video.

 

Any ideas? I'm happy to tweak the setup if there's a good solution to achieve the level of interaction I want.

 

EDIT: Oh, and it seems like a key limitation with the approach I've detailed here is that only one puppet at a time would support live headtracking and lip sync (assuming I do not want all the puppets to have the same headtracking and lip sync simultaneously). It seems that the local user would have to swap the active puppet based on who is talking.
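One way to decide who the active puppet should be, sketched as a toy: compare the short-term loudness (RMS) of each actor's audio channel and switch to the loudest speaker, with a silence threshold and some hysteresis so brief noises don't cause flapping. This is purely illustrative (CH exposes no such API); in practice the result would just tell the local user which swap hotkey to press.

```python
import math

def rms(samples):
    """Root-mean-square level of one audio frame (floats in [-1, 1])."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def pick_active_speaker(levels, current, threshold=0.05, margin=1.5):
    """Given per-actor RMS levels, return the index of the actor whose
    puppet should be live. Sticks with the current speaker unless someone
    else is both above the silence threshold and clearly louder (by
    `margin`) -- a simple hysteresis to avoid rapid back-and-forth swaps."""
    loudest = max(range(len(levels)), key=lambda i: levels[i])
    if levels[loudest] < threshold:
        return current  # everyone is quiet: keep the current puppet
    if current is not None and levels[loudest] < levels[current] * margin:
        return current  # challenger not clearly louder: don't swap yet
    return loudest
```

The `threshold` and `margin` values are guesses; anyone trying this would tune them against real Skype audio levels.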

LEGEND, Apr 13, 2020

That sounds hard to do live. You would have to flip cameras and puppets in sync: switch the camera signal and the currently active puppet together, and switching cameras takes a second or two in the menus. Not impossible, but hard.

 

I also wonder if one PC could handle the load - it's pretty CPU-intensive. It might not keep up.

 

Can someone from Adobe comment on a puppet player? I thought I saw a beta around somewhere, but never tried it, and I cannot find it now. That would seem the best way to go if supported: each person generates a Skype video stream that is the character, and you can green-screen (or similar) the background when mixing, etc.

 

I think it depends on how keen you are. It certainly is not common. Not necessarily impossible, but not simple: e.g., you would need to disable the current puppet, swap the camera, reset the pose, and enable the new puppet each time a new character was meant to start acting.

 

Moving around the scene too is a bit tricky.

New Here, Apr 13, 2020

Makes sense. I would forgo the headtracking if I could just swap active lip sync among the puppets live. I could take any audio input coming to my system and just manually swap them with triggers, assuming only one person is talking at a time. It doesn't have to be perfect.

 

Is there a way to create a swap set to arm/disarm lip sync among multiple puppets live?

LEGEND, Apr 14, 2020

I have not tried it, but you can arm multiple puppets at once, so how about binding the same key in both puppets? Put the Mouth in a swap set with a closed mouth (a single layer). By default, puppet one shows the mouth with visemes and puppet two shows the closed mouth; when you press the trigger key, puppet one swaps to the closed mouth and puppet two swaps to the viseme mouth. Both puppets fire on the same keystroke - you just arm both puppets at once.

 

I think that should work...
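The state logic of that trick can be modeled as a toy simulation (not the real application - puppet names and layer names here are made up): both armed puppets receive the same trigger key, and because their swap sets start in opposite states, exactly one viseme mouth is visible at any time.

```python
class Puppet:
    """Toy model of one puppet whose Mouth sits in a two-member swap set:
    an audio-driven viseme mouth vs. a static closed mouth."""

    def __init__(self, name, visemes_visible):
        self.name = name
        self.visemes_visible = visemes_visible

    def on_trigger(self):
        # The shared key swaps this puppet's Mouth to the other member
        # of its swap set.
        self.visemes_visible = not self.visemes_visible

    def mouth_layer(self):
        return "visemes" if self.visemes_visible else "closed"

def press_shared_key(puppets):
    # Both puppets are armed, so one keystroke reaches all of them.
    for p in puppets:
        p.on_trigger()

# Opposite starting states guarantee exactly one viseme mouth is shown.
a = Puppet("Puppet 1", visemes_visible=True)
b = Puppet("Puppet 2", visemes_visible=False)
```

Each press of the shared key flips which puppet appears to lip-sync, even though both keep receiving the same audio.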

LEGEND, Apr 14, 2020
That is, both puppets get the keystrokes, audio, etc. It's just that, with the swap sets, only one puppet shows a working mouth at a time: the other puppet replaces the audio-driven viseme mouth with a static image. Because its viseme mouth is hidden, you won't see anything move, even though that puppet is still receiving the audio.
