Puppetry and Blender: real-time capture while manipulating your character in Blender. Keith Lango wrote about it on his site last March, and you can read the Blender Artists thread.
Here's an example of a real time 3D puppet on YouTube though I don't believe it's done with Blender.
The Andrew From Mexico blog has a lot of information as well.
11 Comments
You forgot the "ht" in the link to the thread.
It's not:
tp://blenderartists.org/forum/showthread.php?t=91606&highlight=puppet
but:
http://blenderartists.org/forum/showthread.php?t=91606&highlight=puppet
AniCator,
Thanks.
Fixed it.
What's the actual point of this news post? Is a new plugin being released, or is this just general information?
I'm not too lazy to read the referenced websites, but I'd like some more information in the news post itself. For example, what are the results so far in real-time capturing? How far along is the development? Maybe a short word from the developer?
Yeah, this one's rather odd.
Rather exciting, I must say.
looks interesting!!
-epat. :)
I agree with the topic; I doubt it's Blender. To me, it doesn't even look like a 3D render: too stiff, plasticky, awkward.
But you gotta love Weird Al's "eBay" song :D
Hi guys,
I'm the developer, and this post caught me a little by surprise. I have not been actively working on the project since early April: unfortunately, there was a sudden death in my family, and I've been out of the country and away from my studio since then, unable to work on Panda Puppet.
The idea behind the plug-in is to create real-time animation/Machinima in Blender's game engine, so that animated characters can be performed in real time with an external control device like a joystick or Wii Remote. Performances can be recorded in real time like conventional Machinima, or recorded as IPO data that can be tweaked, edited and then rendered out like conventional 3D animation. Basically, it speeds up the character animation process and makes it a little more like doing puppetry in 3D.
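To make the record-as-IPO idea concrete, here is a minimal, purely illustrative Python sketch (not Panda Puppet's actual code, and no Blender API involved): live control samples are stored per channel as (frame, value) keys, roughly what Blender's IPO curves hold, and can be read back later with linear interpolation for tweaking or rendering. The channel name and values are made up.

```python
# Illustrative sketch only -- not Panda Puppet's actual code.
# Sample a live control value each frame, store the samples as
# (frame, value) keys per channel, and evaluate them later with
# linear interpolation, like a very simple IPO curve.

class ChannelRecorder:
    def __init__(self):
        self.keys = {}  # channel name -> list of (frame, value) pairs

    def record(self, channel, frame, value):
        """Store one live sample for a channel, e.g. a joystick axis."""
        self.keys.setdefault(channel, []).append((frame, float(value)))

    def evaluate(self, channel, frame):
        """Read the recorded curve back, interpolating between keys."""
        keys = self.keys[channel]
        if frame <= keys[0][0]:
            return keys[0][1]
        for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
            if f0 <= frame <= f1:
                t = (frame - f0) / (f1 - f0)
                return v0 + t * (v1 - v0)
        return keys[-1][1]

rec = ChannelRecorder()
# Pretend these came from a joystick axis sampled on frames 1, 5 and 9.
rec.record("head_tilt", 1, 0.0)
rec.record("head_tilt", 5, 1.0)
rec.record("head_tilt", 9, 0.0)
print(rec.evaluate("head_tilt", 3))  # halfway between frames 1 and 5 -> 0.5
```

In a real pipeline the interpolated keys would then be edited and cleaned up by an animator, which is the "tweak the IPO data" step described above.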
Panda Puppet is currently in what I would call a pre-release alpha stage. In other words, it has basic functionality, but it's not yet at a point where it's practical enough to release for people to use. The real-time recording works more or less fine, but there are still problems with recording to IPO. I will be back home in a week or two and will resume development of Panda Puppet shortly after.
If you'd like to get an overview of the project check the Blender Artists thread or read the relevant posts on my Machin-X: Digital Puppetry blog - http://machin-x.blogspot.com/search/label/Panda%20Puppet - neither is completely up to date, but I'll be correcting that when I am back to work on this in a few weeks.
I hope this clears up some of the apparent confusion!
Hi Andrew,
This sounds really great! I think it could be very interesting to use in interactive video installations. It would be even more fantastic to have other interface possibilities: driving the puppet (or anything else) with various kinds of input data (for example sound frequencies, temperature, light, ...) from unconventional interfaces rather than a mouse or joystick.
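The suggestion above boils down to mapping any scalar sensor reading onto a puppet control channel. Here is a hypothetical sketch of that idea in plain Python; the sensor names and ranges are invented for illustration and have nothing to do with any specific hardware:

```python
# Hypothetical sketch: map any scalar sensor reading (sound level,
# temperature, light...) onto a puppet control channel by clamping and
# normalizing the sensor's range, then rescaling it to the channel's range.
# All names and ranges below are made up for illustration.

def make_mapping(sensor_min, sensor_max, ctrl_min, ctrl_max):
    """Return a function that converts sensor readings to control values."""
    def convert(reading):
        # Clamp the reading, then normalize it to 0..1 over the sensor range.
        reading = max(sensor_min, min(sensor_max, reading))
        t = (reading - sensor_min) / (sensor_max - sensor_min)
        return ctrl_min + t * (ctrl_max - ctrl_min)
    return convert

# e.g. a microphone level of 0..100 driving a jaw rotation of 0..45 degrees
jaw_from_volume = make_mapping(0.0, 100.0, 0.0, 45.0)
print(jaw_from_volume(50.0))   # -> 22.5
print(jaw_from_volume(120.0))  # clamped -> 45.0
```

With a mapping like this, swapping a joystick for a microphone or a light sensor only changes where the readings come from, not how the puppet is driven.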
I wish you great success with your ideas!
Carlinhos
This text has a good tutorial on how to code your own 3D interface system, as well as camera-based extrapolative modeling. They used to have complete animated videos for a K-12 course in CS, but all I can find is this mirror...
http://remedials.org/BalliesScript.txt and as html http://remedials.org/#11
I'll hint that it took us less than an hour to code, and anyone can wrap a camera feed with Python! ;)
I agree, Carlinhos, and that's planned for. I have not attempted it (yet), but what I am doing should be easily compatible with the I-CubeX sensors, among others. Drivers just have to be written in GlovePIE. There are all kinds of exciting possibilities for this, I hope.