alank99101739
Legend
September 25, 2018 · Answered

MIDI for controlling Face and draggers

1 reply · 5102 views

Has anyone experimented much with MIDI beyond an alternative for keyboard triggers? In particular, can it control draggers or the face angle? I came across Using Character Animator With Hot Hand Wireless MIDI Controller on YouTube, which was pretty nice. It seems to support X/Y/Z spatial positioning for, I think, $50? I would love one on each hand to drag the character's hands around.
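For anyone wanting to see what a controller like that actually sends: here is a minimal Python sketch (using the mido library) that prints incoming control-change messages as normalized axis values. The CC-to-axis mapping below is a guess - the Hot Hand's real CC assignments are configurable and documented in its manual.

```python
# Minimal sketch: listen for MIDI control-change (CC) messages and print
# them as normalized axis values. The CC numbers are hypothetical; check
# the Hot Hand's documentation for its real X/Y/Z assignments.
import mido

AXIS_CC = {20: 'x', 21: 'y', 22: 'z'}  # hypothetical CC-to-axis mapping

with mido.open_input() as port:  # opens the default MIDI input port
    for msg in port:
        if msg.type == 'control_change' and msg.control in AXIS_CC:
            value = msg.value / 127.0  # CC values are 0-127
            print(f"{AXIS_CC[msg.control]} = {value:.2f}")
```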

I was looking at the HTC Vive API (same sort of thing, but much more expensive and bulky) - it looks fairly easy to get position tracking of the two hand controllers, which could be a nice way to control puppet hands (via draggers). But it has the same problem: I don't see how to feed that into CH to control draggers.
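For reference, reading the two controllers' positions from the Vive really is only a few lines with the pyopenvr bindings - the hard part, as noted, is getting those numbers into CH. A sketch:

```python
# Sketch: print the x/y/z position of each tracked Vive controller using
# the pyopenvr bindings. Getting these values into Character Animator's
# draggers is the unsolved part.
import openvr

vr = openvr.init(openvr.VRApplication_Other)
try:
    # Allocate a pose array and fill it with the current device poses.
    poses = (openvr.TrackedDevicePose_t * openvr.k_unMaxTrackedDeviceCount)()
    vr.getDeviceToAbsoluteTrackingPose(
        openvr.TrackingUniverseStanding, 0, poses)
    for i, pose in enumerate(poses):
        if not pose.bPoseIsValid:
            continue
        if vr.getTrackedDeviceClass(i) != openvr.TrackedDeviceClass_Controller:
            continue
        m = pose.mDeviceToAbsoluteTracking    # 3x4 row-major pose matrix
        x, y, z = m[0][3], m[1][3], m[2][3]   # translation column = position
        print(f"controller {i}: x={x:.3f} y={y:.3f} z={z:.3f}")
finally:
    openvr.shutdown()
```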

    Correct answer: KJerryK

    Thanks Jerry. Interesting idea, but yes, I would really prefer draggers, especially with the v2.0 features around fixing double-jointed arms.

    Can you control draggers from inside AE, I wonder? It seems to be more extensible...


    I made another stab at using the x,y transform parameters, and here is what I got. So the Hot Hand could actually be worth a look, as using human arms might provide realistic physical limits to the motion. As for using MIDI to control animation in After Effects, a lot of work has been done on that, but I haven't tried it myself yet, just watched some tutorials.


    KJerryK
    Legend
    September 25, 2018

    Hi Alan,

    That might be interesting to look at. But just using a MIDI device -- hardware or virtual -- provides fader controls along with keystrokes. So any behavior parameter that can be changed uniformly can be assigned to a MIDI fader. This includes all the transform behaviors, most of the physics behaviors, the face behaviors, and eye gaze. Also, the MIDI fader moves can be recorded in a *.mid file and played back whenever needed. But, as you said with regard to the dragger, the question is how to use MIDI there. The best I've been able to come up with is to assign a transform behavior to, say, just the arm, rig it appropriately, and then "drag" the arm using the rotation transform driven by a MIDI fader. That works okay, but my brief attempt looks a lot stiffer than using the dragger.
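    To illustrate the record-and-replay part: a small Python sketch (mido again) that writes a fader sweep into a *.mid file. The CC number is an arbitrary example; any CC that a behavior parameter is mapped to would work the same way.

    ```python
    # Sketch: write a fader sweep into a .mid file so it can be replayed later.
    # CC 7 is used purely as an example fader number.
    import mido

    mid = mido.MidiFile()
    track = mido.MidiTrack()
    mid.tracks.append(track)

    # A slow sweep from 0 to 127, one step every 48 ticks.
    for value in range(0, 128, 4):
        track.append(mido.Message('control_change', control=7,
                                  value=value, time=48))

    mid.save('fader_sweep.mid')
    ```

    Replaying it into a MIDI port is then just `for msg in mido.MidiFile('fader_sweep.mid').play(): port.send(msg)`.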

    alank99101739
    Legend
    September 25, 2018

    From what I could see of the Face behavior, you can put sliders on lots of controls, but not the actual head rotation or forward/back - as far as I can see, that can only be fed in via webcam. (Am I missing something?)

    Also, I could not see any way to attach draggers to MIDI signals, so I don't see how draggers could be controlled by MIDI sliders. That is the one I think would be really cool - hook the X/Y of the Hot Hand wireless MIDI controller to a dragger, and I could control the puppet's hand positions just by moving my own hands. That sounds really useful for live streaming - using the mouse is a bit of a pain (and can only do one hand at a time).
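    One hacky bridge that might be worth trying (untested here, and not something CH supports directly): since draggers follow the mouse pointer, a small script could translate two MIDI CC axes into mouse movement. A sketch using mido and pynput, with made-up CC numbers and screen rectangle - though it can only drive one dragger at a time, since there is only one pointer:

    ```python
    # Hacky bridge sketch: map two MIDI CC axes onto the mouse position so a
    # dragger (which follows the mouse) can be steered from a MIDI controller.
    # The CC numbers and the target screen rectangle are assumptions.
    import mido
    from pynput.mouse import Controller

    CC_X, CC_Y = 20, 21                            # hypothetical CC numbers
    LEFT, TOP, WIDTH, HEIGHT = 400, 200, 800, 600  # target rectangle in pixels

    mouse = Controller()
    pos = {'x': 0.5, 'y': 0.5}  # normalized 0..1 position, start centered

    with mido.open_input() as port:
        for msg in port:
            if msg.type != 'control_change':
                continue
            if msg.control == CC_X:
                pos['x'] = msg.value / 127.0
            elif msg.control == CC_Y:
                pos['y'] = msg.value / 127.0
            else:
                continue
            mouse.position = (LEFT + pos['x'] * WIDTH,
                              TOP + pos['y'] * HEIGHT)
    ```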

    (For recordings, I still want my camera panning support, however. Then I could record a long scene and chop it up afterwards (zoom in/out with the camera, layer in some other scenes for variety, etc.). You can do it at present by adjusting the zoom of everything by hand, but it's a bit painful.)

    KJerryK
    Legend
    October 1, 2018

    Interesting. Thanks. I will give that a try as well.

    Actually, as I think more about it, maybe I don't mind getting hand rotation from the controllers - it gives even more expressivity with the twist of a wrist.

    I was almost wondering about using wrist rotation to also control which hand gesture to display, but I'm not sure about that yet. E.g. front of hand vs. back of hand might make sense because it is natural. (Raised back of hand = punching fist, raised front of hand = waving.) I think a trigger for how many fingers are open still makes sense. E.g. no trigger pull = open hand, 1/3 pull = a bit open (neutral), 2/3 pull = pointing one finger, full pull = closed hand (fist); then rotation = back of hand vs. palm of hand (possibly with side views too?).
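    To make that trigger mapping concrete, a tiny sketch that quantizes a normalized trigger pull into the four poses described above (the threshold values are just illustrative midpoints between the named pull positions):

    ```python
    # Quantize a normalized trigger pull (0.0-1.0) into the four hand poses
    # described above. Thresholds sit midway between the named pull positions.
    def hand_pose(trigger: float) -> str:
        if trigger < 1/6:
            return 'open hand'
        if trigger < 1/2:   # around the 1/3-pull mark
            return 'a bit open (neutral)'
        if trigger < 5/6:   # around the 2/3-pull mark
            return 'pointing one finger'
        return 'closed hand (fist)'

    print(hand_pose(0.7))  # -> 'pointing one finger'
    ```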

    I'd need to not go too crazy with the number of perspectives that would have to be individually drawn.

    Hmmm. Would you do the whole arm - so you can draw a foreshortened arm pointing forwards more naturally (instead of the elbow shooting out sideways)? E.g. use Z-distance (how close the controller is to the screen) to recognize a pointing gesture... You could have a field day here if you wanted. (But in that case, why not just use a real 3D model...)


    Using real 3D would in some cases be my option if/when it can be implemented, depending on the project. But not always. I have tried using Fuse models to simulate 3D, but it gets complicated in a 2D setting. Just as mocap techniques get more complicated in 2D when trying to make it look like 3D. It takes a lot of time, and there's only so much of that.