Will Jackson, Director at Engineered Arts Limited, writes:
RoboThespian started life in 2005 as a rudimentary ‘robotic puppet’, and over years of intensive experimentation has developed into a sophisticated and complex creation, drawing on the talents of dedicated engineers, designers and organisers working to continually improve software, electronics, mechanical design and conceptual development.
It is at once both an artistic endeavour and an advanced engineering project.
As of February 2010, Engineered Arts Ltd employs 8 people who spend the majority of their working days involved with the perfection of an ‘acting machine’.
Why use Blender?
In the early days we programmed robot movements using graphs (like IPO curves) – it was difficult and not at all intuitive. Try animating a character in FK only, with no image of what you’re doing on screen, to get a feel for how hard this is. We couldn’t really work ‘off line’ at all, which meant having a real robot hooked up to see what you were doing.
With a background in 3D animation I knew there were easier ways, but we didn’t want to get into using a closed-source licensed application. First, because we couldn’t easily extend the code for our special purpose; second, because we wanted to include the programming software for our users at no cost; and third, because we couldn’t justify the time and expense of writing the application from scratch.
About three years ago a friend showed me a Blender animation project, and I realised immediately that we had the answer to our dreams.
It has taken some years and a fair bit of learning to perfect the RoboThespian–Blender link, but it now works pretty well.
How we do it
Our virtual RoboThespian model is rigged for IK and FK, with a control to mix the two methods if desired. As not all of the robot’s features have bones (eye graphics, face colours etc.), we also have some floating slider and dial controls that can be animated for those features. A Python script creates an array of all the necessary data and sends it over a TCP socket connection to our back-end software.
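The general shape of that last step – gathering per-frame channel values and pushing them down a TCP socket – can be sketched in a few lines of Python. This is a minimal illustration, not the actual Engineered Arts script or wire protocol; the channel names and the length-prefixed JSON framing are our own assumptions.

```python
import json
import struct

def pack_frame(channels):
    """Serialize one animation frame (channel name -> value) as a
    length-prefixed JSON payload, ready to write to a TCP socket.
    A 4-byte big-endian length header lets the receiver split the
    stream back into frames."""
    body = json.dumps(channels).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def send_frame(sock, channels):
    """Write one packed frame to an already-connected socket."""
    sock.sendall(pack_frame(channels))

# Hypothetical channels: a bone angle plus two non-bone slider controls.
frame = {"head_yaw": 0.25, "eye_graphic": 3, "face_colour": 0.8}
payload = pack_frame(frame)
```

In practice a script like this would run inside Blender (via `bpy`), sampling the rig once per frame before sending.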
The movement and position data are stored in a MySQL database and can be replayed by the robot at any time – so once a routine is programmed, Blender is not needed to make the robot work.
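The record-then-replay idea looks roughly like this. The sketch below uses SQLite (from the Python standard library) as a stand-in for their MySQL database, and the table layout and function names are hypothetical:

```python
import json
import sqlite3

# One row per frame: which routine it belongs to, its timestamp,
# and the pose stored as JSON.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE frames (routine TEXT, t REAL, pose TEXT)")

def record_frame(routine, t, pose):
    """Store one captured frame for later playback."""
    conn.execute("INSERT INTO frames VALUES (?, ?, ?)",
                 (routine, t, json.dumps(pose)))

def replay(routine):
    """Yield (time, pose) pairs in order, ready to stream to the robot."""
    rows = conn.execute(
        "SELECT t, pose FROM frames WHERE routine = ? ORDER BY t",
        (routine,))
    for t, pose in rows:
        yield t, json.loads(pose)

record_frame("wave", 0.0, {"arm_lift": 0.0})
record_frame("wave", 0.5, {"arm_lift": 1.0})
```

Once the frames are in the database, playback is just a query – no Blender required, which matches the workflow described above.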
We have also developed a way of ‘blending’ multiple robot motion routines after they are recorded. For example, the robot’s mood and movements can change on the fly by mixing and modifying preset behaviours. We are also working on a virtual RoboThespian model that can be driven in reverse, like a motion capture rig.
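One simple way to mix two pre-recorded behaviours is a per-channel weighted blend of their poses. The article doesn’t say how Engineered Arts implement their blending, so treat this as an illustrative sketch with made-up channel names:

```python
def blend_pose(pose_a, pose_b, weight):
    """Linear blend of two poses sharing the same channels.
    weight = 0.0 gives pose_a, 1.0 gives pose_b; intermediate
    weights mix the two, e.g. shifting mood on the fly."""
    return {name: (1.0 - weight) * pose_a[name] + weight * pose_b[name]
            for name in pose_a}

calm    = {"head_yaw": 0.0, "arm_lift": 0.1}
excited = {"head_yaw": 0.6, "arm_lift": 0.9}
mixed = blend_pose(calm, excited, 0.25)  # mostly calm, a little excited
```

Running the blend frame by frame over two recorded routines produces a new routine without re-animating anything in Blender.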
We had real trouble getting the relevant joint data into the right format: what we need is the relative Euler angle resulting from the combination of the IK and FK inputs – not the raw bone rotations. This turned out to be as simple as multiplying by the inverses of the quaternion matrices – doh!
Many thanks to Arne Laub, Jon Topf and Glen Pike for the months of work they put into the project, and to Cerys Marks for the great RoboThespian animations she creates.
What’s coming next?
One of our current projects involves three RoboThespian robots working together on stage, with theatre lighting and projected video mixed in.
We have a Blender model to control the whole setup, including virtual models of MAC 250 moving-head lights. We can animate the whole show virtually, and it plays back on the real hardware in just the same way.