Here's a cool project where Blender is used to remote-control a rig and make it follow a pre-programmed path.
Part of the beauty of Blender is its willingness to be hacked and adapted with Python, right from within its own UI. Need to export an object’s animation to some arbitrary text format? Just pop over to Blender’s Scripting workspace and away you go. For those of us new to Python, there’s even a live console so you can test snippets of code immediately, or just find out what methods a particular object supports.
Last year, I built a motion control rig (a real one, out here in meat-space) with 5 axes of motion (well, 6 if you count focussing as well), but needed a way to actually design and program the motions. I didn’t want to be limited to simple “move here, then here, then here” linear motions - I wanted full-on compound moves, sequences of moves, tracking shots, arbitrary easing… and it would be really nice to be able to visualise what I was doing on a screen first, before actually getting the rig moving. Exactly the sort of thing Blender does at its heart.
So if you’d be kind enough to put on your safety goggles and stand behind the yellow line, I present: my homemade mo-co rig.
A bit of background on the way it integrates with Blender:
The motion control server software runs on a Mac (it’s written in Swift) and connects to the various motor drivers on the rig. In “Live mode”, it lets you drag the sliders on screen to move the rig around in real time, doing some cunning filtering to keep acceleration and deceleration within the physical capabilities of the axes, and stopping you from running into the end-stops. It can store a sequence of rig positions and run between them, but for more complex stuff like compound moves - moving a camera while panning, tilting and focussing to keep a particular subject centred - creating a sensible user interface would have taken 3 times as long as building the rig in the first place.
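The server itself is written in Swift and its code isn’t shown here, but the filtering idea is worth a sketch. Here’s a minimal, illustrative Python version of one way to do it (the function name, parameters, and dead-simple clamping strategy are my assumptions, not the server’s actual implementation): each update, clamp the velocity needed to reach the target to the axis’s speed limit, then clamp the velocity *change* to the acceleration limit. A real version would also need a stopping-distance check so the axis decelerates before hitting an end-stop.

```python
def limit_motion(target, pos, vel, v_max, a_max, dt):
    """One filter step: move `pos` toward `target` without exceeding
    the axis's maximum speed (v_max) or acceleration (a_max).
    Returns the new (position, velocity).

    Illustrative sketch only - not the actual server's algorithm."""
    # Velocity we'd need to reach the target in a single step...
    desired_vel = (target - pos) / dt
    # ...clamped to the axis's speed limit
    desired_vel = max(-v_max, min(v_max, desired_vel))
    # Clamp the change in velocity so acceleration stays within a_max
    max_dv = a_max * dt
    dv = max(-max_dv, min(max_dv, desired_vel - vel))
    vel += dv
    return pos + vel * dt, vel
```

Run once per control tick for each axis, this keeps the rig’s motion smooth no matter how violently you drag the on-screen sliders.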
But Blender already has all the animation tools you could dream of: keyframing of multiple parameters, graph editing, easing, constraints (including the essential Track To constraint). So why reinvent the wheel?
I created a virtual version of my rig in Blender and a simple export script that steps through my animation and spits out an ASCII file. Each line of the file represents one frame, and lists the 6 axis positions for that point in time. My server software can load that file up, reset the rig to the first frame positions, then “play” through the file. Job done.
The one other feature I realised I needed pretty quickly was some way to just send the virtual rig’s current position to the real rig immediately: when I set up objects in front of my real rig, I’d have to get out the measuring tape and measure where they were in real life, so I could create their virtual equivalents in my Blender scene. But I needed a quick way to confirm I’d got the measurements right - no point in setting up an animation that has the camera tracking an object if they didn’t line up in real life.
Thankfully Python lets you communicate over UDP, which means you can have Blender send information to another app immediately. So I created a little script that sends a UDP packet containing the virtual rig’s current positions straight to the server software, moving the real rig to match. It gives me a quick way to check the camera’s pointing in the right direction, and that the framing is going to look the way I want, without having to actually run through the whole animation.
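The sending side really is only a few lines of standard-library Python. Here’s a minimal sketch using the `socket` module; the host, port, and space-separated ASCII payload are my assumptions for illustration - the real packet format is whatever my server software expects.

```python
import socket

def send_rig_positions(positions, host="127.0.0.1", port=9000):
    """Fire the virtual rig's current axis positions at the mo-co
    server in a single UDP datagram. Host, port, and payload format
    here are illustrative assumptions."""
    payload = " ".join(f"{p:.4f}" for p in positions).encode("ascii")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

UDP is a good fit here precisely because it’s fire-and-forget: no connection handshake to stall Blender’s UI, and a dropped packet just means you nudge the slider again.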
There’s more information about all this on my blog, along with links to all the (horrible, hacky) code on GitHub. Be warned - this is not a simple project to replicate - it’s a combination of hardware, firmware (on the Arduino/Teensy motor drivers), software (the mo-co server) and scripting (Blender), and even I seem to have forgotten how I got parts of it to work… but I hope it inspires people to have a go. And it’s all possible thanks to Blender.