

11 Comments

  1. What's the actual point of this news post? Has a new plugin been released, or is this just general information?

    I'm not too lazy to read the referenced websites, but I'd like some more information in the news post itself. For example: what are the results so far in real-time capturing? How far along is development? Maybe a short word from the developer?

  2. I agree with the previous comment, and I doubt it's Blender. To me, it doesn't even look like a 3D render: too stiff, plasticky, awkward.
    But you gotta love Weird Al's "eBay" song :D

  3. Hi guys,

    I'm the developer, and this post caught me a little by surprise. I have not been actively working on Panda Puppet since early April because, unfortunately, there was a sudden death in my family, and I've been out of the country and away from my studio since then, unable to work on the project.

    The idea behind the plug-in is to enable real-time animation/Machinima in Blender's game engine, so that animated characters can be performed in real time with an external control device like a joystick or Wii Remote. Performances can be recorded in real time like conventional Machinima, or recorded as IPO data that can be tweaked, edited, and then rendered out like conventional 3D animation. Basically, it speeds up the character animation process and makes it a little more like doing puppetry in 3D.

    Panda Puppet is currently in what I would call a pre-release alpha stage. In other words, it has basic functionality, but it's not yet practical enough to release for people to use. The real-time recording works more or less fine, but there are still problems with recording to IPO. I will be back home in a week or two and will resume development of Panda Puppet shortly after.

    If you'd like an overview of the project, check the Blender Artists thread or read the relevant posts on my Machin-X: Digital Puppetry blog - http://machin-x.blogspot.com/search/label/Panda%20Puppet - neither is completely up to date, but I'll be correcting that when I'm back to work on this in a few weeks.

    I hope this clears up some of the apparent confusion!
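
    The record-then-tweak workflow described above can be sketched in plain Python. This is purely illustrative: none of the class or method names below come from Panda Puppet or the Blender API; they are hypothetical stand-ins for the concept of capturing a live performance as editable keyframe data.

    ```python
    # Hypothetical sketch of real-time performance recording, where each
    # frame's control-device value is stored as an editable keyframe,
    # much like recording to IPO curves. Not actual Blender/Panda Puppet API.

    class PerformanceRecorder:
        """Samples a control value once per frame and stores it as
        (frame, value) keyframes that can be tweaked afterwards."""

        def __init__(self):
            self.keyframes = []  # list of (frame, value) pairs

        def record(self, frame, value):
            # Capture the live puppeteered value for this frame.
            self.keyframes.append((frame, value))

        def tweak(self, frame, new_value):
            # After the performance, individual keys can be edited,
            # as with conventional keyframed animation.
            self.keyframes = [(f, new_value if f == frame else v)
                              for f, v in self.keyframes]


    # Simulate a short real-time performance driven by one joystick axis.
    recorder = PerformanceRecorder()
    joystick_axis = [0.0, 0.2, 0.5, 0.4]  # fake per-frame input samples
    for frame, value in enumerate(joystick_axis):
        recorder.record(frame, value)

    recorder.tweak(2, 0.45)  # clean up one frame before rendering
    ```

    The point of the two-step design is that the performance stays "live" while recording, but the result is ordinary keyframe data that can be edited frame by frame before the final render.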

  4. Hi Andrew,

    this sounds really great! I think it could be very interesting for interactive video installations. It would be even more fantastic to have other interface possibilities: directing the puppet (or anything else) using various input data (for example sound frequencies, temperature, light, ...) from unconventional interfaces, beyond the mouse and joystick.

    I wish you great success with your ideas!

    Carlinhos

  5. I agree, Carlinhos, and that's planned for. I haven't attempted it (yet), but what I'm doing should be easily compatible with the I-CubeX sensors, among others; drivers just have to be written in GlovePIE. There are all kinds of exciting possibilities for this, I hope.
