
Real-time 3D scanning with the Kinect controller


Maybe this can inspire someone?

About the Author

Avatar image for Bart Veldhuizen
Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

34 Comments

  1. WOW!

    Just my thought: would it be possible to connect three or four Kinects to 'fill in' the black 'shadows', and thus create a full 3D environment in real time?

  2. @Erik - Unfortunately, the signals that Kinects transmit (non-visible-spectrum light) interfere with each other, as each unit can't tell which signal belongs to it and which belongs to another unit. You could get away with using multiple units one at a time on a stationary subject, though (i.e. have one in front, one on either side, one above and one below, then take snapshots with each, one after the other, for a full 3D scan). It wouldn't work properly with moving subjects, such as mocap, though.

  3. How about using Shapeways to print a poseable armature for a 3D model, then using the Kinect to detect its pose and transfer it to an armature in Blender?

  4. Excellent! But... now for the big one I'm waiting for: can we mocap with the Kinect?

    If anyone sees that, please do let me know, because as soon as that's possible, I'm running out to the store to get one!

  5. Mocap with this would be pretty sweet, as that's basically what it's designed for.
    Writing/porting the algorithms it uses would be a lot of work though, probably.

  6. @macouno, AFAIK the recognition of the bones of the human body is done inside the Xbox, with some closed-source software. I don't know if there is any way to extract that kind of data from it - maybe in the form of a homebrew game or something? If it's possible to recognize stuff with XNA Game Studio, there might be a way to extract it somehow? Just thinking out loud here :)

  7. Oh! Mind-blowing :) At first I thought it was just an overlaid effect, but then he started to rotate the room's space... Incredible! It feels like you're a ghost :)

  8. @Blendiac
    I was also thinking of this interference with other Kinect units. I wonder, in a multiple Kinect setup, if you could replace the IR emitters so that each would have its own non-overlapping wavelength. The IR filters for the camera sensors would need to be replaced as well. I wonder how far into the infrared spectrum the cameras are capable of reaching.

  9. Converting point clouds to meshes is possible in Blender 2.49, I think. I know that MeshLab can use several methods to turn point clouds into meshes. Here are a few links about that:
    http://insight3d.sourceforge.net/ (can create a mesh and texture from photos) - doesn't work great for me
    http://nico-digitalarchaeology.blogspot.com/ - a workflow for getting meshes from photos
    And last but not least: libmv is now able to create a point cloud (.ply) from photos, though it's not that intuitive.
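For anyone curious what those .ply point clouds actually contain before they're meshed: here's a minimal ASCII-PLY vertex reader in Python. This is only a sketch — it assumes a simple header whose vertex element lists x, y, z as its first three properties, and the sample file text is made up for illustration:

```python
def read_ply_vertices(lines):
    """Parse vertex positions from a minimal ASCII PLY file.

    Assumes the vertex element's first three properties are
    float x, y, z -- enough for simple point-cloud .ply files.
    """
    it = iter(lines)
    assert next(it).strip() == "ply"  # magic line
    vertex_count = 0
    for line in it:
        line = line.strip()
        if line.startswith("element vertex"):
            vertex_count = int(line.split()[-1])
        elif line == "end_header":
            break
    # After the header: one vertex per line
    verts = []
    for _ in range(vertex_count):
        x, y, z = map(float, next(it).split()[:3])
        verts.append((x, y, z))
    return verts

# A tiny made-up two-point cloud, just to show the format
sample = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
end_header
0.0 0.0 0.0
1.0 2.0 3.0
""".splitlines()

print(read_ply_vertices(sample))  # [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)]
```

From a list of (x, y, z) tuples like this, a script could create point-only geometry in Blender, which is then what MeshLab's reconstruction filters turn into an actual surface.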

  10. I wonder if you could set up an HD feed next to the Kinect and combine the depth map and RGB with an (edge-)filtered HD feed to get a nicer key.
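The idea above — a coarse key from the depth map, refined by edges from a sharper HD feed — could be sketched like this. This is a toy one-scanline example with made-up numbers and a crude gradient in place of a real edge filter, just to show the principle:

```python
def depth_key(depth_row, near_mm=1500):
    """Coarse foreground mask from the Kinect depth map:
    1 = foreground (closer than near_mm), 0 = background."""
    return [1 if d < near_mm else 0 for d in depth_row]

def edge_mask(lum_row, thresh=30):
    """Mark pixels where the HD luminance jumps sharply
    (a crude stand-in for a real edge filter)."""
    return [0] + [1 if abs(lum_row[i] - lum_row[i - 1]) >= thresh else 0
                  for i in range(1, len(lum_row))]

def refine(depth_m, edge_m):
    """Only accept a depth-mask transition where the HD feed also
    shows an edge; otherwise extend the previous region, so the matte
    border follows the sharp HD image, not the blocky depth map."""
    out = list(depth_m)
    for i in range(1, len(out)):
        if depth_m[i] != depth_m[i - 1] and not edge_m[i]:
            out[i] = out[i - 1]
    return out

# One scanline where the depth edge lands one pixel too early,
# but the HD luminance edge is in the right place:
depth_row = [800, 2000, 2100, 2200]   # depth in mm
lum_row = [200, 198, 60, 58]          # HD luminance values
print(refine(depth_key(depth_row), edge_mask(lum_row)))  # [1, 1, 0, 0]
```

The coarse key alone would cut at pixel 1; the combined key moves the cut to pixel 2, where the HD feed shows the actual edge.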

  11. @Scott
    I believe MS stated that the 360 was just the beginning for the Kinect. I would be surprised if they didn't eventually add support for the PC.

  12. Now that's a reason to buy a Kinect... I happen to have an Xbox 360, but I'd rather use the PC. Anyhow, this has some potential, and I would very much support Microsoft by buying this piece of hardware. ...Now we just need to think of some cases where we would need real-time 3D input on such a large scale...

  13. To use two or more in the same area, I wonder if you could use polarising filters on the IR sender and receiver. On the first Kinect, have the polarising filters on send and receive at the same orientation; on the second, have them at a different rotation. The only question is whether the filters would attenuate the IR beam too much.
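A back-of-the-envelope check on that attenuation question, using Malus's law for ideal polarizers (real scene reflections can depolarize the light, so treat this as a best case):

```python
import math

def malus(intensity, theta_deg):
    """Malus's law: intensity transmitted by an ideal polarizer at
    angle theta_deg to the light's polarization axis:
    I = I0 * cos^2(theta)."""
    return intensity * math.cos(math.radians(theta_deg)) ** 2

# Unpolarized IR loses half its power at the emitter's own polarizer:
own_beam = 0.5
# A matched receiver filter (0 degrees) keeps that half...
print(malus(own_beam, 0))    # 0.5
# ...while the other unit's receiver, crossed at 90 degrees,
# rejects it almost completely:
print(malus(own_beam, 90))
```

So in the ideal case each Kinect keeps about 50% of its own signal while the crossed unit's light is blocked — roughly one stop of loss, which might be acceptable, though depolarization off matte surfaces would reduce the separation in practice.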

  14. I would love to have buyable software that allows a person to collect data from two or three Kinects at the same time and assemble it into a full 3D (or time-varying 4D) capture.
