
“Machine Politics” — Actually Happening animated with Kinect motion capture

It’s finally here! Our first animated episode of Actually Happening!


The production process:

Instead of keyframing the characters by hand to match the existing audio track, “Machine Politics” was created using Kinect motion capture. I think the visuals add a lot to what was already one of my favorite bits we’ve ever done. There are a few spots where it dips a tiny bit into the uncanny valley, but there are also moments that really freak me out with how natural they look. Kevin’s facial expressions in particular just… look like Kevin. It also looks like a real live panel game! And using the traditional panel show format lets us get around some of the serious limitations of this technique (i.e. the characters can only face forward).

The skeletal tracking data was captured in Processing with KinectToPin running SimpleOpenNI, then applied to multi-layered puppets rigged in After Effects. Facial animation was done with a combination of automatic lip sync to audio waveforms and a couple of Motion Sketch nulls controlling smile-vs-frown and eyebrow height. I then switched between the 10 different camera angles (all using the same puppet precomps as sources) by bringing them into Premiere via Dynamic Link and creating nested multi-cam sequences.
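For the curious, here’s roughly what the capture side can look like. This is a minimal Processing sketch written for illustration, not KinectToPin’s actual source: it assumes the old OpenNI calibration-pose API in SimpleOpenNI and a single tracked user (ID 1), and it only draws the head joint, whereas KinectToPin records the full joint set and converts it into After Effects keyframe data.

```java
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  // ask OpenNI for full-body skeletal tracking
  context.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);

  // assumption: one performer, so user ID 1
  if (context.isTrackingSkeleton(1)) {
    PVector head = new PVector();
    context.getJointPositionSkeleton(1, SimpleOpenNI.SKEL_HEAD, head);

    // joints come back in real-world millimeters;
    // project them into 2D screen space for the puppet
    PVector head2d = new PVector();
    context.convertRealWorldToProjective(head, head2d);
    fill(255, 0, 0);
    ellipse(head2d.x, head2d.y, 20, 20);
  }
}

// calibration callbacks (old OpenNI required the "psi" pose)
void onNewUser(int userId) {
  context.startPoseDetection("Psi", userId);
}

void onStartPose(String pose, int userId) {
  context.stopPoseDetection(userId);
  context.requestCalibrationSkeleton(userId, true);
}

void onEndCalibration(int userId, boolean successful) {
  if (successful) {
    context.startTrackingSkeleton(userId);
  }
}
```

The same getJointPositionSkeleton() call works for every other joint constant (hands, elbows, knees, and so on), and that per-frame joint data is essentially what ends up driving the puppet rig.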

All that meant there was a lot of asking software to do things it wasn’t meant to do, and I spent a good bit of the animation process going “I can’t believe this is actually working…” There were a few hiccups, though, and lessons for next time: I’m going to do the sequence edit before I add the Kinect and lip sync data, because there were close to a million keyframes in the project at that point and Premiere really started to choke. But now that everything’s rigged and ready to go, I could probably turn around a new episode in a single day. Which, for several minutes of full-color, full-motion animation, is insane.

Things are about to get interesting.
