Motion capture with an Xbox Kinect

We've reported on some cool Kinect hacks before, and here's a new one: real-time motion capture.

About the Author

Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

23 Comments

  1. It's possible to do this for free with OpenNI. I'm working on a rather hacky Blender solution at the moment, not too far off! I don't have a Kinect though, so I'm working from OpenNI sample data. I'll post on BA when I have something that actually works ;)

  2. meialua, please let me know if I can help with testing. I've been hacking away with libfreenect and OpenNI for the last few weeks. The Python wrapper in libfreenect is fantastic (see the sketch after this comment), but it's NITE that has the proprietary skeletal capture, so OpenNI seems to be the way to go (at least for the moment).

    Trouble is, I'm completely new to C++, so I'm struggling to build anything that will actually compile, let alone output something useful for use in Blender (i.e. a Python wrapper)!

    BTW, a user called gamix is probably working on the same thing here: http://groups.google.com/group/openni-dev/browse_thread/thread/4d3a49241a566bf9
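
    For reference, this is roughly all the libfreenect Python wrapper gives you on its own - raw depth frames, no skeleton. A minimal sketch, assuming the freenect module from libfreenect's Python bindings is installed and a Kinect is plugged in:

        import freenect
        import numpy as np

        # Grab a single depth frame from the first Kinect (blocking call).
        # Returns an 11-bit depth image as a numpy array plus a timestamp.
        depth, timestamp = freenect.sync_get_depth()

        print("depth frame shape:", depth.shape, "dtype:", depth.dtype)
        print("closest raw depth reading:", int(np.min(depth)))

        # This is only raw depth data - the skeletal joints come from NITE,
        # which is why OpenNI is needed for actual mocap.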

  3. If this or anything like it is released open source, I will buy an effin' Kinect! This would make for some awesome scenes in the Blender animation world!

  4. I'll run out and buy a Kinect when an open-source, Blender-compatible mocap program is released.

    MS should capitalise on this, since I'd say they have the software already in existence (granted, it'd probably be Windows only... lame). If they released free software for the PC which captured Kinect data into mocap files, then every indie game/video maker who works in 3D would run out and buy one. With this I could block out the animations for a web series in a day!

  5. Cool, would love any help or pointers - here's the progress I've made so far (really should make a BA thread for this ;) but hey...

    1) Managed to compile and run the OpenNI / NITE sample applications from source (not too hard, in fact).
    2) Integrated a C++ sockets library into the OpenNI skeleton tracking sample, with a view to creating a basic server that streams the skeletal data out over a socket.

    And that's about it so far - I'm getting a few errors at this point that I'm not sure how to overcome (although it all seems to be down to how I've integrated the C++ socket stuff - it's a lot harder than the Python socket implementation :( ).

    The eventual plan is to have a Python script in Blender listening on that socket for the skeleton data and positioning bones according to that data - or even to write a BVH file completely separately from Blender for later import (in fact this might be the easiest thing to do rather than live capture). There's a rough sketch of the listener idea after this comment.

    I'll start a thread for this when I have time instead of cluttering up things here! (Sorry BN crew...)
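
    A minimal sketch of what that Blender-side listener might look like, assuming a made-up wire format of one joint per line ("name x y z") sent over TCP - the port, armature name and joint names below are placeholders, not anything OpenNI / NITE actually defines:

        import socket
        import bpy  # Blender's Python API - run this from inside Blender

        HOST, PORT = "127.0.0.1", 9999   # placeholder address of the C++ OpenNI server
        ARMATURE = "Armature"            # placeholder armature name

        pose_bones = bpy.data.objects[ARMATURE].pose.bones

        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect((HOST, PORT))

        # Assumed wire format: one joint per line, e.g. b"head 0.12 1.40 0.33\n"
        buf = b""
        while True:
            data = sock.recv(4096)
            if not data:
                break
            buf += data
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                name, x, y, z = line.decode().split()
                bone = pose_bones.get(name)
                if bone is not None:
                    # Crude: drive the pose bone location directly from the joint position.
                    bone.location = (float(x), float(y), float(z))

        sock.close()

    In practice you'd want to run this from a timer or modal operator rather than a blocking loop, so Blender's UI stays responsive.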

  6. Looks like you're taking the same approach as the guys working on KinEmote - broadcasting the capture data on a server address.

    Is it more difficult to wrap the existing OpenNI functions in Python than it is to use your chosen method? Forgive the newbie question, but I'm also looking at using the Kinect as a controller in the Blender game engine, and for that idea to work I'd need access to a configurable gesture recognition library, i.e. direct access to OpenNI via Python.

  7. Well - if a proper Python wrapper exposing the whole OpenNI / NITE suite does materialise, then yes, there will be no need for a 'server-like' implementation. That approach just seemed a bit more achievable for me, since I'm not experienced in generating Python wrappers for C++.

    That said, I'm sure it would be fairly easy to do using SWIG / SIP / Cython etc., as there isn't a massive number of functions to expose in the library; it's just that I don't know how to do it myself (and don't really have the spare time to learn it at the moment). There's a minimal sketch of the idea after this comment.
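
    For what it's worth, if the tracking code were compiled into a small shared library with a few extern "C" entry points, plain ctypes would be enough for a quick-and-dirty wrapper, with no SWIG / SIP / Cython build step at all. Everything below (the library name, function names and signatures) is hypothetical, just to show the shape of it:

        import ctypes

        # Hypothetical shared library built from the OpenNI / NITE sample code,
        # exposing a couple of extern "C" functions.
        lib = ctypes.CDLL("./libskeleton_bridge.so")

        # int sb_init(void);
        lib.sb_init.restype = ctypes.c_int

        # int sb_get_joint(const char *name, float *x, float *y, float *z);
        lib.sb_get_joint.argtypes = [ctypes.c_char_p,
                                     ctypes.POINTER(ctypes.c_float),
                                     ctypes.POINTER(ctypes.c_float),
                                     ctypes.POINTER(ctypes.c_float)]
        lib.sb_get_joint.restype = ctypes.c_int

        if lib.sb_init() == 0:
            x, y, z = ctypes.c_float(), ctypes.c_float(), ctypes.c_float()
            if lib.sb_get_joint(b"head", ctypes.byref(x), ctypes.byref(y), ctypes.byref(z)) == 0:
                print("head at", x.value, y.value, z.value)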

  8. Thanks for the response - understood, it is completely outside of my own experience (and in the months it would take me to learn how to program in C++ and generate the wrappers, a far superior library would probably have been developed!)

    The offer of help testing still stands, by the way. I'll keep an eye on the BA forums for your announcement.

  9. With the OSCeleton lib you can easily send the data out as OSC from the Kinect OpenNI wrapper, and grab it in any OSC-compliant software...
    I have it working in Pure Data, but Blender should be really easy too with e.g. pyliblo (OSC for Python) - see the sketch after this comment.
    I'll try this in the coming weeks, but if anyone wants to have a go before then, go ahead ;)
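
    A minimal pyliblo sketch of that receiving end - the port number and the /joint message layout (joint name, user id, x, y, z) are assumptions based on OSCeleton's commonly used defaults, so double-check them against your OSCeleton version:

        import liblo  # pyliblo

        PORT = 7110  # OSCeleton's usual default port - adjust if yours differs

        def joint_cb(path, args):
            # Assumed OSCeleton /joint message: joint name, user id, x, y, z
            name, user, x, y, z = args
            print("user %d: %s at (%.3f, %.3f, %.3f)" % (user, name, x, y, z))

        server = liblo.Server(PORT)
        server.add_method("/joint", "sifff", joint_cb)

        print("listening for OSCeleton data on port", PORT)
        while True:
            server.recv(100)  # poll with a 100 ms timeout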
