Has anyone experimented much with MIDI beyond using it as an alternative for keyboard triggers? In particular, can it control draggers or the face angle? I came across Using Character Animator With Hot Hand Wireless MIDI Controller - YouTube, which was pretty nice. It seems to support X/Y/Z spatial positioning for (I think) $50. I would love one on each hand to drag the character's hands around.
I was looking at the HTC Vive API (same sort of thing, but much more expensive and bulky) - it looks fairly easy to get position tracking of the two hand controllers, which could be a nice way to control puppet hands (via draggers). But same problem: I don't see how to feed that into CH to control draggers.
That might be interesting to look at. But just using a MIDI device -- hardware or virtual -- provides fader controls along with keystrokes. So any behavior that can be changed uniformly can be assigned a MIDI fader. This includes all the transform behaviors, most of the physics behaviors, face behaviors, and eye gaze. Also, the MIDI fader controls can be recorded in a *.mid file to be played back whenever needed. But, as you said with regard to the dragger, the question is how to use MIDI there. The best I've been able to come up with is to assign a transform behavior to, say, just the arm, rig it appropriately, and then "drag" the arm using the rotation transform triggered by a MIDI fader control. That works okay, but my brief attempt looks a lot stiffer than using the dragger.
From what I could see of the Face behavior, you can put sliders on lots of controls, but not the actual head rotation or forward/back - as far as I can see, that can only be fed in via webcam. (Am I missing something?)
Also, I could not see a way to attach draggers to MIDI signals, so I could not see how draggers could be controlled by MIDI sliders. That is the one I think would be really cool - hook the X/Y of the "Hot Hand" wireless MIDI controller to a dragger and I could control the puppet's hand positions just by moving my own hands. Sounds really useful for live streaming - using the mouse is a bit of a pain (and can only do one hand at a time).
(For recordings, I still want my camera panning support, however. Then I could record a long scene and chop it up afterwards (zoom in/out with the camera, layer in some other scenes for variety, etc.). You can do it at present by adjusting the zoom of everything by hand, but it's a bit painful.)
I don't see any way to use midi to control the dragger directly. Fiddling with the transform behavior, such as adding it to the hand instead of the arm, and then varying the x and y positions of the hand with midi to simulate dragging doesn't look too bad, but it lacks the flexibility of the dragger and doesn't seem worth the effort.
Thanks Jerry. Interesting idea, but yes, I would really prefer draggers, especially with the v2.0 features around fixing double-jointed arms.
Can you control draggers from inside AE I wonder... It seems to be more extensible...
I made another stab at using the x,y transform parameters, and here is what I got. So the Hot Hands could actually be worth a look, as using human arms might provide realistic physical limits to the motion. As for using midi to control animation in After Effects, there has been a lot of work done on that, but I haven't yet tried it myself, just watched some tutorials.
Wow! That is better than I thought it would be! That does look interesting! Thanks for giving it a go.
That is just so funky! I wish I had known about this one earlier. Interesting that transposing X/Y does not adjust rotation... E.g. the twisted-hand problem I was having - I could perhaps have fixed it by using "Rotation" to set the initial angle. And messing with Scale is just cool! I had always assumed that transposing would move the hand relative to the rest of the arm (causing it to be disconnected) - which apparently is not the case. Anyone want a throbbing thumb after hitting it with a hammer? Just animate the Transform on a thumb layer!
In the following, I put a Transform behavior on a hand and changed Scale. Note how it is still connected to the arm!
Did you come across anything describing which MIDI events CH uses? Notes for triggers seem pretty simple, but which MIDI events control sliders? There is a programmatic API on Windows to generate MIDI events, but I'm not sure which ones to generate for slider controls in CH (such as X, Y, and Rotation).
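For reference, for anyone generating events programmatically: in standard MIDI, sliders/faders are Control Change (CC) messages and triggers map naturally to Note On - I'm assuming CH listens for plain CC messages for its fader bindings. A little Python sketch of the raw byte layout (the repo is C#, but the bytes are the same; function names are just mine):

```python
def control_change(channel, controller, value):
    """Build the three raw bytes of a MIDI Control Change message.

    Faders/sliders are CC messages: status byte 0xB0 | channel
    (0-15), then a controller number and a value, each 0-127.
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out of range for a MIDI CC message")
    return bytes([0xB0 | channel, controller, value])


def note_on(channel, note, velocity=127):
    """Note On (keyboard trigger): status 0x90 | channel, note, velocity."""
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("out of range for a MIDI Note On message")
    return bytes([0x90 | channel, note, velocity])
```

So a fader on controller 1, channel 1, at mid-position would be the bytes `B0 01 40`.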
I suspect a bit of maths translation code may be needed between movement devices and CH to get rotation etc. correct.
Exploring MIDI more, I wrote up a blog here of initial findings. (Including it here in case anyone stumbles upon this thread in the future.) My next step is trying to get the HTC Vive (VR headset) positional controllers to generate MIDI events to feed into Character Animator. Needs a bit of code, but it seems doable... They have lots of buttons and things, which are nice for triggers, as well as allowing dragging both hands at the same time (and legs if you get more body trackers) for more natural body position movement data. Full MoCap control in CH - could be cool!
Oooh! I just got the trigger button on the HTC controller to trigger my Character Animator character! And I can fetch position data... But now I have to work out how to convert a vector of 12 numbers into X/Y coordinates for CH to use. Ugh. Maths!
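For anyone else hitting this: if the 12 numbers are OpenVR's row-major 3x4 pose matrix (HmdMatrix34_t) flattened out, the translation sits in the fourth column, i.e. indices 3, 7, and 11. A Python sketch of turning that into 0-127 MIDI values (the axis ranges are guesses for a seated play area, not anything official):

```python
def pose_to_xy(m, x_range=(-1.0, 1.0), y_range=(0.0, 2.0)):
    """Extract X/Y translation (metres) from a row-major 3x4 pose
    matrix flattened to 12 floats, and scale each axis into the
    0-127 MIDI value range.

    Translation is the fourth column: m[3] is x, m[7] is y
    (m[11] would be z, the distance toward/away from the sensors).
    """
    def to_midi(v, lo, hi):
        v = min(max(v, lo), hi)                 # clamp to the calibrated range
        return round((v - lo) / (hi - lo) * 127)

    x, y = m[3], m[7]
    return to_midi(x, *x_range), to_midi(y, *y_range)
```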
And I just got x and y worked out. But because it's a continuous stream of x/y events from two controllers, it's hard to bind the MIDI to the right controls. I think I need a calibration/setup mode for the program...
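One idea for the calibration/setup mode: during a setup pass, record the min/max each axis reaches while you wave the controller through its comfortable range, then map live readings against that recorded range. A hypothetical Python sketch (the class name and the centre default are mine):

```python
class AxisCalibrator:
    """Learn one axis's range during a setup pass, then map live
    readings into the 0-127 MIDI value range against that range."""

    def __init__(self):
        self.lo = None
        self.hi = None

    def observe(self, v):
        """Call during setup mode with each raw reading."""
        self.lo = v if self.lo is None else min(self.lo, v)
        self.hi = v if self.hi is None else max(self.hi, v)

    def to_midi(self, v):
        """Map a live reading into 0-127 using the learned range."""
        if self.lo is None or self.hi == self.lo:
            return 64                           # no usable range yet: centre value
        v = min(max(v, self.lo), self.hi)       # clamp to the learned range
        return round((v - self.lo) / (self.hi - self.lo) * 127)
```

You'd want one of these per axis per controller, which also sidesteps the binding problem a little: during setup only one axis is moving at a time, so it's obvious which CH control to wiggle.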
First cut at somewhat working code is now in GitHub - alankent/htc-vive-midi: Turns HTC Vive controller positional data into MIDI events, for int... MIT License. Needs more testing, but calling it a night tonight. Anyone who is a better C# programmer than I is welcome to help clean it up - particularly to put a GUI on the front instead of relying on command line arguments. Next step is an end-to-end demo...
I don't have everything calibrated etc, but this video was done in a single take using HTC Vive controllers to control both arms simultaneously, and buttons to cause various facial expressions. The biggest issue is the hands do not rotate - but I am not sure if anything is changing here in Character Animator v2.0, so I might pause for a bit until that is available before doing more on this project.
This. Is. Awesome.
Great exploring, Alan. Opens a whole new bunch of possibilities.
I've been looking at the hand distortion issue, and came up with a rigging that seems to work better, at least in this case. Here is the rigging and an example. The transform behavior for moving the left arm is on the pointer hand.
Interesting. Thanks. I will give that a try as well.
Actually, as I think about it more, maybe I don't mind getting hand rotation from the controllers - it gives even more expressivity with the twist of a wrist.
I was also wondering about using wrist rotation to control which hand gesture to display, but I'm not sure about that yet. E.g. front of hand vs. back of hand might make sense because it is natural. (Raised back of hand = punching fist, raised front of hand = waving.) I think using the trigger for how many fingers are open still makes sense. E.g. no trigger = open hand, 1/3 trigger pull = a bit open (neutral), 2/3 = pointing one finger, full trigger pull = closed hand (fist); then rotation = back of hand vs. palm of hand (possibly with side views too?).
Need to not go too crazy in the number of perspectives that have to be individually drawn.
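If it helps, the trigger-pull-to-gesture idea above could just be a small threshold table. A Python sketch - the thresholds and the trigger names are invented for illustration; they'd be whatever the puppet's swap-set triggers are actually called:

```python
def hand_gesture(trigger, back_of_hand):
    """Map trigger pull (0.0-1.0) plus wrist orientation to a
    hand-artwork trigger name. Thresholds are rough guesses."""
    if trigger < 0.25:
        pose = "open"          # no pull: open hand
    elif trigger < 0.55:
        pose = "neutral"       # ~1/3 pull: a bit open
    elif trigger < 0.85:
        pose = "point"         # ~2/3 pull: one finger pointing
    else:
        pose = "fist"          # full pull: closed hand
    side = "back" if back_of_hand else "palm"
    return f"{pose}_{side}"
```

A bit of hysteresis around each threshold would probably be needed in practice so the hand doesn't flicker between poses when the pull hovers near a boundary.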
Hmmm. Would you do the whole arm - so you can draw a foreshortened arm pointing forwards more naturally (instead of the elbow shooting out sideways)? E.g. use Z-distance (how close the controller is to the screen) to recognize a pointing gesture... You could have a field day here if you wanted. (But in that case, why not just use a real 3D model...)
Using real 3D would in some cases be my option if/when it can be implemented, depending on the project. But not always. I have tried using Fuse models to simulate 3D, but it gets complicated in a 2D setting. Just like using mocap techniques get more complicated in 2D when trying to make it look like 3D. Takes a lot of time, and there's only so much of that.
New, longer demo video. Working reasonably well now. I have to keep the update rate down, as CH can fail to consume all the MIDI events (e.g. at 24 controller updates per second it does not keep up, but around 8 seems to work; and if the webcam is on, I need to drop that to around 4 position updates per second - it could just be that my laptop is too slow).
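The rate limiting can be as simple as a per-controller throttle: drop any update that arrives sooner than 1/max_hz after the last one sent. A Python sketch of the idea (the C# version in the repo may differ; timestamps are passed in rather than read from a clock, to make it easy to test):

```python
class MidiThrottle:
    """Let at most `max_hz` position updates per second through,
    tracked separately per controller."""

    def __init__(self, max_hz):
        self.min_interval = 1.0 / max_hz
        self.last_sent = {}                      # controller id -> last send time

    def should_send(self, controller_id, now):
        """Return True if enough time has passed to send another update."""
        last = self.last_sent.get(controller_id)
        if last is None or now - last >= self.min_interval:
            self.last_sent[controller_id] = now
            return True
        return False
```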
Silly me - I just discovered I had sticks through the wrists in the puppet! Fixing them up, the puppet can now also wave (flexing at the wrists). E.g. see how the hands are staying straight up and down as the two hands pump up and down, flexing at the wrists.
You just move your hands normally, and it picks up whether to show a side view, rear view, or palm view of the hands, wrist angles, etc. The controller picks up all the rotational information automatically.
More tinkering. I attached the headset to control a "dragger" on the neck, plus rotation of the head. Getting the calibration of everything right is still a big problem - I need to do some more thinking there. Also, I want to try increasing the sample rate to see how well CH can keep up (I optimized the code to reduce the number of position events being sent).
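One easy way to reduce the number of position events: once a position is quantized to 0-127, consecutive frames often repeat the same value, so only forward a CC when the value actually changed. A Python sketch of that filter (I won't claim this is exactly what the repo does):

```python
class ChangeFilter:
    """Only forward a CC value when it differs from the last one
    sent for that controller/axis - duplicates waste CH's input
    budget without moving anything on screen."""

    def __init__(self):
        self.last = {}                           # key -> last value sent

    def changed(self, key, value):
        """Return True (and remember the value) if it changed."""
        if self.last.get(key) != value:
            self.last[key] = value
            return True
        return False
```

This composes naturally with a throttle: check the change filter first, and only consume throttle budget for values that would actually move something.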
But latest little test video showing wrist movements, head movements, and proving beyond any doubt that I have no idea how to dance.
That looks great. I don't see any distortion. What happens if you move your body to the side, does the puppet follow?
I was moving the body too. I put a transform on the neck so the top of the torso moves side to side, but if I amplified the body movements too much it was hard to keep the hands and body in sync. And moving the arms also moves the body regardless. So it's happening, but it's a bit subtle.
Oh, I had the feet pinned to stop the body rotating. I could transform the body instead. It's just a matter of deciding what works best.
I just read this whole thread. I'm very excited to see what's upcoming for Adobe CH. I'm a new member of the CH community. I've been watching numerous videos on CH from Dave (okaysamurai) and he is a genius. Thanks to the team at Adobe - this is what I've created.