Tag Archives: MoCap

handMade hand track nulls

Script in progress: a Leap Motion recorder for After Effects

I’ve been building a tool for recording 3D motion tracks, camera moves and hand gestures direct to the After Effects timeline with the Leap Motion.

There’s still a ways to go on development. I have to create the UI and tweak the hand puppet rig (and, um, make the hand tracker work with right hands…), but I think this is going to be a useful add-on both for literal hand-animation stuff like UI demos and pre-animating gestures for Character Animator puppets, and for more subtle things like organic camera shake.

The Leap is quite precise in its data — you can even tell different people’s hand tracks apart.
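As a rough sketch of what a recorder like this does under the hood (plain JavaScript, not actual plugin code — the sample format and the `toKeyframes` helper are my own invention for illustration): it samples the Leap's palm position over time, then turns each timestamped sample into a time/value keyframe pair snapped to the comp's frame rate.

```javascript
// Hypothetical sketch: convert timestamped Leap palm samples into
// AE-style [time, [x, y, z]] keyframe pairs. All names are illustrative.
function toKeyframes(samples, frameRate) {
  return samples.map(function (s) {
    // Snap each sample's timestamp (in seconds) to the nearest frame time.
    var t = Math.round(s.t * frameRate) / frameRate;
    return [t, [s.x, s.y, s.z]];
  });
}

var samples = [
  { t: 0.0,   x: 10, y: 20, z: 5 },
  { t: 0.034, x: 12, y: 21, z: 6 } // roughly one frame later at 30 fps
];
var keys = toKeyframes(samples, 30);
```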


Shaky camera move:

I’m thinking about calling it handMade. Good name?

Keep your head on: Automating scaling in KinectToPin with Z-axis data

I’ve spent a lot of time recently creating the latest version of KinectToPin’s UI Panel for After Effects. It has a ton of great new features, and makes things a lot easier to use.

But now that that’s out, I’m working on something new that gets around one of the biggest remaining issues with rigging 2.5D Kinect characters: automating layer scaling based on Z-distance. It’s one of the most annoying things to deal with, and until now the best options were “stay in one depth plane” or “manually scale things up and down.” Ugh.

The guy on the left is what happens if you walk back and forth toward the camera and don’t account for it:

KinectToPin - AutoZ 1

KinectToPin - AutoZ 2

The little expression I came up with this morning turns the same character with the same mocap data into the guy on the right.

Keep in mind this is an experimental feature, and at the moment it only works for camera-facing characters. It won’t be added to the UI Panel until I’ve worked out the necessary layer space transforms and a couple of bugs. In the meantime, if you’re eager to try it out, here it is:

In the 3D template, set this as the “mocap” layer’s position expression:

// Grab the active camera; fall back to this layer if there isn't one.
mocap = thisLayer;
try { cam = thisComp.activeCamera; } catch (err) { cam = mocap; }
// Take the tracked torso point and move it from layer space to world space...
torso = mocap.effect("torso")("3D Point");
tW = mocap.toWorld(torso);
// ...then into the camera's view space. The last line is the expression's
// result, so the layer's position follows the torso in camera space.
fW = cam.fromWorld(tW);
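
The expression above gets the torso into camera space; it’s the depth component of that result that drives the scaling. The underlying math is the standard perspective relationship, sketched here in plain JavaScript (the function name and the 100% base scale are mine for illustration, not the actual KinectToPin expression):

```javascript
// Sketch of the perspective math behind depth-based auto-scaling:
// apparent scale is proportional to cameraZoom / zDistance.
// (Illustrative only; not the shipped KinectToPin code.)
function scaleForDepth(cameraZoom, zDistance, baseScale) {
  if (baseScale === undefined) baseScale = 100; // 100% when zDistance == zoom
  return baseScale * cameraZoom / zDistance;
}

// A character standing twice as far away as the camera's zoom distance
// renders at half scale:
var near = scaleForDepth(1000, 1000); // 100
var far  = scaleForDepth(1000, 2000); // 50
```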

I swear, it seems like the main thing I’ve been doing for the last year and a half is finding ways to make people’s heads stop flying off. This is yet another.


The new KinectToPin is finally here!

After a lot of work (and not a lot of sleep), the all-new KinectToPin has arrived. It’s Kinect motion capture for After Effects in one convenient package! Well, actually, two convenient packages: KinectToPin now includes both a standalone capturing app and an After Effects UI Panel plugin. It’s also about a million times easier to use, and has a ton of new features — automatic setup and rigging (no more coding), direct XML import, 3D features, audio playback during capture… I’m tempted to quote the RUCKUS NYC Kickstarter infomercial and say “And it just! Keeps! Going!”

It also has a snazzy new website all to itself: kinecttopin.fox-gieg.com. Download it and give it a try, I’d love to see what you can do with it!


Testing the new KinectToPin After Effects UI panel

Along with a bunch of other cool features, the next release of KinectToPin is going to include both 2D and 3D automatic template setup. Here are two quick experiments with the 3D version:

[Embedded video]

[Embedded video]

I really like the foreshortening. It was actually a bit of a happy accident, coding-wise, an unexpected side effect of the expression that ties the scale of the 2D puppet layers to the 3D camera’s position.

Actually Happening

“Machine Politics” — Actually Happening animated with Kinect motion capture

It’s finally here! Our first animated episode of Actually Happening!


The production process:

Instead of keyframing the characters by hand to match the existing audio track, “Machine Politics” was created using Kinect motion capture. I think the visuals add a lot to what was already one of my favorite bits we’ve ever done. There are a few spots where it gets a tiny bit uncanny valley, but there are also moments that really freak me out with how natural-looking they are. Kevin’s facial expressions in particular just… look like Kevin. It also looks like a real live panel game! And using the traditional panel show format lets us get around some of the serious limitations of this technique (i.e. characters can only face forward).

The skeletal tracking data was captured in Processing with KinectToPin running SimpleOpenNI, then applied to multi-layered puppets rigged in After Effects. Facial animation was done with a combination of automatic lip sync to audio waveforms and a couple of Motion Sketch nulls controlling smile-vs-frown and eyebrow height. I then switched between the 10 different camera angles (all using the same puppet precomps as sources) by bringing them into Premiere via Dynamic Link and creating nested multi-cam sequences.
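For a rough idea of how amplitude-based lip sync like this typically works (plain JavaScript; the thresholds and function are illustrative, not the actual project setup): sample the audio level per frame and map it to a mouth-shape index, which then drives which mouth layer is visible.

```javascript
// Hypothetical sketch of amplitude-driven lip sync: map an audio level
// (0-100, like the slider AE produces from Convert Audio to Keyframes)
// to a mouth-shape index. Threshold values are made up for illustration.
function mouthShape(audioLevel) {
  if (audioLevel < 5)  return 0; // closed
  if (audioLevel < 15) return 1; // slightly open
  if (audioLevel < 30) return 2; // open
  return 3;                      // wide open
}

var levels = [2, 12, 40, 8];          // per-frame audio levels
var shapes = levels.map(mouthShape);  // [0, 1, 3, 1]
```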

All that meant there was a lot of asking software to do things it wasn’t meant to do, and I spent a good bit of the animation process going “I can’t believe this is actually working…” There were a few hiccups, though, and lessons learned for the future: next time I’m going to do the sequence edit before I add the Kinect and lip sync data — there were close to a million keyframes in the project at that point and Premiere really started to choke. But now that everything’s rigged and ready to go, I could probably turn around a new episode in a single day. Which, for several minutes of full-color, full-motion animation, is insane.

Things are about to get interesting.

KinectToPin FAQ and Installation Guide


This FAQ’s a bit out of date. Check out the new KinectToPin website for the latest version of the software and how to use it — it’s dramatically easier now.


It’s been pretty incredible seeing KinectToPin generate interest all over the world, but I’ve also had a lot of feedback about how difficult and frustrating it is to get it working. One of my big priorities right now is to find ways to make that easier. But in the meantime, here’s some additional helpful information:


Is this 3D/can I use it with Maya etc.?
– No, it isn’t 3D (although the Z data is recorded to the XML, and it is open source, so, uh, you can go wild and make something 3D out of it); if you want to use your Kinect with a 3D app try Brekel Kinect (Windows only).

What hardware can I use?
– KinectToPin works with the standard Xbox Kinect, as well as the Xtion (a Kinect-style sensor by Asus), although I’ve not gotten the latter running successfully on my own machine. As far as I know, it hasn’t been tested with Kinect for Windows.

Help! I recorded a really long track and I can’t get it converted from XML!
– If you have an XML file that is more than a couple of minutes long, KinectToPin may crash when you try to turn it into AE keyframes. Nick has an older converter-only tool for Processing called FlaePin that may work where KinectToPin fails.
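One workaround worth sketching (plain JavaScript; the frame list and chunk size are hypothetical, not part of KinectToPin) is to split a long capture into smaller pieces and convert each piece separately:

```javascript
// Hypothetical sketch: split a long list of mocap frames into fixed-size
// chunks so each chunk can be converted to keyframes on its own.
function chunkFrames(frames, framesPerChunk) {
  var chunks = [];
  for (var i = 0; i < frames.length; i += framesPerChunk) {
    chunks.push(frames.slice(i, i + framesPerChunk));
  }
  return chunks;
}

// e.g. a 5-minute capture at 30 fps, split into 1-minute chunks:
var frames = new Array(5 * 60 * 30).fill(0); // 9000 stand-in frames
var chunks = chunkFrames(frames, 60 * 30);   // 5 chunks of 1800 frames
```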

Help! After Effects is giving me grayed-out puppet pins!
– This is a known bug with the Puppet Tool. You need to create dummy pins for all fifteen points before you paste in your tracking data or you’ll get these weird unusable pins.

Help! Microsoft’s Kinect drivers keep installing themselves automatically and taking over for OpenNI!
– You can fix this in Device Manager. Follow David Menard’s instructions here.



The folks behind SimpleOpenNI have created some handy software bundles that will help get you up and running a lot faster. Go to this site and find your relevant link(s) in the “Downloads” menu on the left-hand side of the page. You’ll still need to download Processing separately, as well as KinectToPin itself. Follow the instructions in Part 2 of my tutorial series to get those configured.

In hopes of simplifying things a bit more, I’m also compiling a list of the different configurations people have managed to get working. If you’d like to add your setup, post in the comments using this format:

Capture Hardware:
OpenNI/NITE version (or bundle source):
SensorKinect version:
OSCeleton version:
Recording with SimpleOpenNI or OSCeleton?
After Effects version:


Kinect MoCap Animation in After Effects

Tutorial links: Part 1: Getting Started | Part 2: Motion Capture with KinectToPin | Part 3: Building the Puppet Rigging Template | Part 4: Rigging a Digital Puppet

More info: A better way to control the puppet’s head | FAQ and Installation Guide

Project files: Click here to download the After Effects project (CS5+).

Quick note: the text is just transcripts of the videos, so you can read or watch as you prefer.


Shh! A sneak peek at what I’ve been working on…

So what exactly am I doing with all the mocap stuff I’ve been working on with KinectToPin + After Effects? Well… add in expression-controlled facial animation and using Dynamic Link to live-switch unrendered AE comps via Premiere’s multicam setup (I am kind of freaked out that this seems to Just Work), and it looks like we’re about to have an animated Actually Happening. Shhh! :-)


I’ve been figuring this out as I go, but once I have all the elements rigged it should be almost trivial to make new episodes. Also, I built the set in PHOTOSHOP, which is ridiculous. I don’t have any proper 3D software on my laptop, so the table is all Repoussé shapes extruded from rounded rectangles.