Maybe this can inspire someone?
About the Author

Bart Veldhuizen
I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.
34 Comments
Impressive! Any links to the source on this? Git or otherwise?
WOW!
Just my thought: Would it be possible to connect three or four Kinects to 'fill' the black 'shadows', and thus create a full 3D environment in realtime?
Blenderificus,
The source + more videos are on his website:
http://idav.ucdavis.edu/~okreylos/ResDev/Kinect
There are also some more videos on his YouTube channel: http://www.youtube.com/user/okreylos
I can think of a whole lot of 3D related things the Kinect could be useful for, but honestly, more than anything I can't wait for Minority Report style computer control!
Also see this YouTube channel:
http://www.youtube.com/user/KinectHacks#g/u
They seem to be re-uploading all the cool Kinect hack videos from around YouTube.
@Erik - Unfortunately the signals that the Kinects transmit (non-visible spectrum light) interfere with each other, as each unit can't tell which signal belongs to it and which belongs to another unit. You could get away with using multiple units one at a time on a stationary subject, though (i.e. have one in front, one on either side, one above and one below, and then take snapshots with each, one after the other, for a full 3D scan). It wouldn't work properly with moving subjects, such as mocap, though.
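Just to make that one-at-a-time idea concrete, here is a rough, untested Python sketch. It assumes the libfreenect Python bindings (the "freenect" module) expose sync_get_depth with a device index; the device count and ordering are placeholders.

# Grab one depth snapshot per Kinect, one device at a time.
import freenect
import numpy as np

NUM_KINECTS = 4  # e.g. front, left, right, top (placeholder count)

snapshots = []
for index in range(NUM_KINECTS):
    depth, _timestamp = freenect.sync_get_depth(index=index)  # raw 11-bit depth image
    snapshots.append(np.array(depth))
    freenect.sync_stop()  # close the streams before moving on to the next unit

# Note: this only avoids the IR interference if the idle units' projectors are
# actually off; otherwise their dot patterns still overlap.
for i, d in enumerate(snapshots):
    print("Kinect %d: %s, raw depth %d..%d" % (i, d.shape, d.min(), d.max()))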
He released some software:
http://www.youtube.com/watch?v=N9dyEyub0CE
That's very exciting, so how long until we have a motion capture system in Blender? :0)
Wow, this Kinect thingy seems to be very useful. I still wonder why MS does not want it to be used on PCs...
How about using Shapeways to print a pose-able armature for a 3D model, then using the Kinect to detect its pose and transfer it to an armature in Blender?
Excellent! But.... now the big one I'm waiting for... Can we mocap with the kinect???
If anyone sees that, please do let me know, cause as soon as that's possible, I'm running out to the store to get me one!
Mocap with this would be pretty sweet, as that's basically what it's designed for.
Writing/porting the algorithms it uses would be a lot of work though, probably.
@macouno, afaik the recognition of the bones of the human body is done inside the Xbox, with some closed-source software. I don't know if there is any way to extract that kind of data from it, maybe in the form of a homebrew game or something? If it is possible to recognize stuff with XNA Game Studio, there might be a way to extract it somehow? Just thinking out loud here :)
I have been looking into this; here are some links I have found for motion capture.
http://i61www.ira.uka.de/users/knoop/VooDoo2/doc/html/index.html
I need to do a bit more research, but someone used blob detection and OpenCV to get some effect:
http://www.youtube.com/watch?v=Rq6wcPPxqEc
Oh shit! Mind blowing :) First I thought it was just an overlaid effect, but when he started to rotate the room's space... Incredible! It feels like you're a ghost :)
There is development on the openFrameworks front as well:
http://www.openframeworks.cc/forum/viewtopic.php?p=24948#p24948
by Theo: http://vimeo.com/16734124
Hopefully we will soon have nice ways to import point clouds for 3D scanning, or a live feed for augmented reality in the BGE. MoCap would of course be very cool... but that is wishful thinking.
@Blendiac
I was also thinking of this interference with other Kinect units. I wonder, in a multiple Kinect setup, if you could replace the IR emitters so that each would have its own non-overlapping wavelength. The IR filters for the camera sensors would need to be replaced as well. I wonder how far into the infrared spectrum the cameras are capable of reaching.
Converting point clouds to meshes is possible in Blender 2.49, I think. I know that MeshLab is able to use several methods to turn point clouds into meshes. Here are several links about that.
http://insight3d.sourceforge.net/ (can create mesh and texture from photos) Doesn't work great for me, though.
http://nico-digitalarchaeology.blogspot.com/ Workflow for getting meshes from photos
And last but not least: libmv is now able to create a point cloud (.ply) from photos, though it is not that intuitive.
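If you want to poke at one of those .ply point clouds yourself, here is a minimal sketch (my own, not taken from any of the tools above) that reads the vertex positions from an ASCII .ply file in plain Python; the file name is hypothetical.

# Read x, y, z vertex positions from an ASCII .ply point cloud.
def read_ply_points(path):
    points = []
    with open(path) as f:
        num_vertices = 0
        # Parse the header to find how many vertices to expect.
        while True:
            line = f.readline()
            if not line:
                break  # malformed file: no end_header found
            line = line.strip()
            if line.startswith("element vertex"):
                num_vertices = int(line.split()[-1])
            elif line == "end_header":
                break
        # Each vertex line starts with x, y, z; extra properties are ignored.
        for _ in range(num_vertices):
            values = f.readline().split()
            points.append((float(values[0]), float(values[1]), float(values[2])))
    return points

points = read_ply_points("cloud.ply")
print(len(points), "points; first:", points[0] if points else None)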
I wonder if you could set up an HD feed next to the Kinect and combine the depth map and RGB with a filtered (edge-detected?) HD feed to get a nicer key.
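As a toy example of that idea, assuming you already have a depth map and an RGB frame captured at the same moment (the near/far values and the stand-in frames below are made up), a simple depth-window key could look roughly like this:

import numpy as np

def depth_key(rgb, depth, near=400, far=1200):
    # Keep only pixels whose raw depth value falls inside the window.
    mask = (depth > near) & (depth < far)
    keyed = rgb.copy()
    keyed[~mask] = 0  # everything outside the depth window goes to black
    return keyed, mask

# Hypothetical stand-in frames; real ones would come from the Kinect (640x480).
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
depth = np.full((480, 640), 2047, dtype=np.uint16)
keyed, mask = depth_key(rgb, depth)

Refining that mask with an edge-filtered HD feed would be the next step.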
What does the Kinect (Hardware) do other than run two webcams?
@Scott
I believe MS stated that the 360 was just the beginning for the Kinect. I would be surprised if they didn't eventually add support for it on the PC.
Amaaaaazingggggg
Then you can "touch" virtual objects and do things!!
Or write on a virtual board...
@Thomas: The Kinect hardware is actually an infrared projector that projects a pattern of points, an infrared camera that sees the points in order to calculate the depth map, and a regular webcam for the image/texture...
see: http://www.youtube.com/watch?v=nvvQJxgykcU
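For the curious, here is a rough sketch (not Kreylos's actual code) of how that depth map plus the RGB image can be turned into a coloured point cloud. It assumes the libfreenect Python bindings for capture, uses placeholder pinhole intrinsics for the depth camera, and ignores the offset between the depth and RGB cameras.

# Back-project the Kinect depth map into a coloured 3D point cloud.
import freenect
import numpy as np

depth, _ = freenect.sync_get_depth()   # 480x640 array of raw 11-bit depth values
rgb, _ = freenect.sync_get_video()     # 480x640x3 RGB image
freenect.sync_stop()

fx = fy = 580.0          # placeholder focal length of the depth camera, in pixels
cx, cy = 320.0, 240.0    # placeholder principal point (image centre)

v, u = np.mgrid[0:480, 0:640]          # pixel row/column indices
z = depth.astype(np.float32)           # raw units; a real app would convert to metres
x = (u - cx) * z / fx                  # standard pinhole back-projection
y = (v - cy) * z / fy

points = np.dstack((x, y, z)).reshape(-1, 3)
colours = rgb.reshape(-1, 3)           # naive pairing of depth and RGB pixels
print(points.shape, colours.shape)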
PLOT TWIST!
The room is the 3d model! :O
Now that's a reason to buy a Kinect... I happen to have an X360 but I'd rather use the PC. Anyhow, this has some potential and I would very much support Microsoft by buying this piece of hardware. ...Now we just need to think of some cases where we would need realtime 3D input on such a large scale...
To use two or more in the same area, I wonder if you could use polarising filters on the IR sender and receiver. On the first Kinect, have the polarising filters on send and receive at the same orientation; on the second, have them at a different rotation. The only question is whether the filters would attenuate the IR beam too much.
Pawnage!
another video: http://vimeo.com/17075378
:-)
This doesn't really qualify as a Blender spotting, but here you go:
http://techland.com/2010/11/26/top-five-uses-for-your-kinect-besides-gaming/
Where can I buy this device?
Walking in the presence of giants here. Cool thinking all around!
You can try PhotoFly.
http://labs.autodesk.com/utilities/photo_scene_editor/
With this program, you can take 2D pictures of an object and it will make a 3D model of it automatically!!
Look at some videos on YouTube; it is just amazing!
I would love to have software I could buy that allows a person to collect data from two or three Kinects at the same time and combine the data into a full 3D (or, over time, 4D) capture.