http://vimeo.com/25964679
The Blender Loop Station (Bloop) is a rapid performance animation tool. It is implemented as an add-on for the 3D modeling and animation suite Blender.
AG Digitale Medien writes:
Bloop lets the user animate a 3D character in a short amount of time without needing much experience or technological knowledge. It combines motion capture, digital puppetry and natural language control. The result is a form of interaction which does not interrupt the user's workflow: spoken commands control the system's settings and actions while the user keeps acting out the animation, without having to run back to the keyboard after every recorded move. We use a loop station metaphor as known from live music performances, where a musician controls the loop station through physical interaction while recording sounds. We switch the modal channels of the musical loop station to create a loop station for animations.
Features
In one workflow you can
- Create new mappings (user feature to character feature) via gestures
- Calibrate these mappings
- Quickly record animations
- Layer recordings for different mappings
- Record animations with more than one user acting on one character
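Several comments below refer to a KinectToOsc helper that streams tracking data into Blender as OSC messages. As a rough, hypothetical illustration of that transport (not Bloop's actual code, and the `/joint/head` address pattern is made up), here is a minimal encoder/decoder for OSC 1.0 messages carrying float32 arguments:

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode_osc(address: str, *args: float) -> bytes:
    """Encode an OSC 1.0 message with float32 ('f') arguments."""
    msg = _pad(address.encode())
    msg += _pad(("," + "f" * len(args)).encode())  # type tag string, e.g. ",fff"
    for a in args:
        msg += struct.pack(">f", a)  # big-endian float32
    return msg

def decode_osc(msg: bytes):
    """Decode the address pattern and float32 arguments from an OSC message."""
    def read_str(buf, i):
        end = buf.index(b"\x00", i)
        s = buf[i:end].decode()
        i = end + 1
        while i % 4:  # skip padding to the next 4-byte boundary
            i += 1
        return s, i

    address, i = read_str(msg, 0)
    typetag, i = read_str(msg, i)
    args = []
    for t in typetag[1:]:  # skip the leading ","
        if t == "f":
            (v,) = struct.unpack_from(">f", msg, i)
            args.append(v)
            i += 4
    return address, args
```

A joint update such as `encode_osc("/joint/head", 0.5, 1.25, -2.0)` round-trips cleanly through `decode_osc`, which is all a Blender-side receiver would need to drive a bone.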
29 Comments
Wow!! I've seen some similar technology, but using AR.
The end result doesn't look like much. In fact it might leave you with more to do than starting from scratch.
I was thinking the same thing. Has potential, but at the moment it looks like you don't really get a very good end result. BUT it could also be the character rig they're using. Perhaps it's not a very dynamic rig/character. Or I'm making excuses. haha
Good start, but could be improved.
Might be useful for blocking out keyframes, you know: bake, select your key poses, delete the rest, start tweaking from there. Never been a fan of motion capture in general though.
On the other hand, rumors are popping up about a next-gen Kinect precise enough to read lips and see finger movements. It's just going to get better.
Very good news indeed; obtaining the Microsoft Kinect SDK might prove to be a licensing challenge, however. OpenNI (http://www.openni.org/) has a project underway that supports the Kinect and other mocap technologies. There are a few companies who have started using it in their products. For those interested, I posted about it here:
http://forum.reallusion.com/Topic95198-281-1.aspx
Why not use libfreenect? Microsoft can go hang its licensing conditions.
Agreed and precisely the point; btw, libfreenect (http://openkinect.org/wiki/Main_Page) is part of the Open NI project. ;)
very cool stuff!!!
The video is hilarious. :D
I thought exactly the same. ;)
Awesome!
"Geh mal kacken oder so" ("go take a dump or something") :D epic
Cool, but too bad it requires a Kinect. That's money I don't have. :)
I wonder if they could program it to use two webcams at a fixed distance, since that's basically all the Kinect is. They wouldn't have to be super great, just enough to catch the contrast of joints. I suppose there's some established programming so it can recognize the hierarchy and relative positions of those joints :/ It'd be nice if someone could set something up so two low-to-average-quality webcams could act in its place.
That's not all the Kinect is. Its main feature, which would be hard to reproduce, is an infrared projector, which projects an array of infrared dots into the room.
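For the two-webcam idea above: a calibrated stereo pair recovers depth from disparity (how far a point shifts between the two images) rather than from a projected dot pattern. A minimal sketch of the pinhole stereo relation Z = f·B/d follows; the focal length and baseline in the example are made-up illustrative numbers, not real webcam parameters:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal shift of the same point between images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("the point must appear shifted between the two cameras")
    return focal_px * baseline_m / disparity_px

# Example with assumed values: 600 px focal length, 10 cm baseline,
# 30 px disparity -> the point is about 2 m away.
print(depth_from_disparity(600.0, 0.1, 30.0))
```

The catch, and why the Kinect projects its own dots, is that disparity matching needs texture: on a blank wall or a plain t-shirt there is nothing for two passive webcams to match.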
:< Oh well, darn, guess I'm gonna have to buy me an Xbox then, lol.
http://kinectforwindows.org/
Apparently, Windows specific hardware is coming soon.
Wait though (correct me if I'm wrong, but), isn't an infrared projector what the Wii thing uses? I think I'm wrong, but I thought that's what I heard.
You are wrong. The Wii has an infrared camera in the WiiMote which detects infrared LEDs in the bar thing which you have to put by your TV. So all the WiiMote's sensor sees are two sets of LEDs, and it can use their rotation and the distance between them to detect roughly where it is.
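That two-blob scheme can be sketched numerically: the roll angle falls out of the direction between the two LED image points, and the distance from their apparent separation. The focal length and sensor-bar LED spacing below are illustrative assumptions, not the WiiMote's real calibration:

```python
import math

def wiimote_pose(p1, p2, focal_px=1320.0, led_sep_m=0.20):
    """Estimate roll (radians) and distance (meters) from the two
    sensor-bar LED blobs seen by the WiiMote's IR camera.

    p1, p2     -- (x, y) pixel positions of the two LED clusters
    focal_px   -- assumed camera focal length, in pixels
    led_sep_m  -- assumed physical spacing of the sensor-bar LEDs
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    sep_px = math.hypot(dx, dy)           # apparent LED separation in pixels
    roll = math.atan2(dy, dx)             # tilt of the line between the blobs
    distance = focal_px * led_sep_m / sep_px  # same pinhole relation: Z = f*L/s
    return roll, distance
```

With these assumed parameters, two level blobs 132 px apart would put the WiiMote around 2 m from the bar; a tilted pair of blobs yields the roll directly, which is why the pointer stays upright when you twist the remote.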
Love it, this rocks! Really awesome!
Hey Santa, I need an MS Kinect, can't wait to try this funky cake ;)
Props to Bremen :)
That's so cool and useful. It's funny--I was thinking about something like this just a few days ago. I hope they come out with something like this using only digital cameras and webcams--I don't have a Kinect.
Well done! I would use it as a tool to sketch out the animation and as a starting point, especially for timing.
I would nail the important keyframes, tweak them, delete the rest and then do the inbetweens.
Hope to see improvements soon.
I could not make it work under Windows 7 and Blender 2.6.
I love ASUS computers...
Buggy as hell. I don't think the animation you get this way still resembles what was actually captured. At this stage of development it's more of an interesting approach than something we should be really excited about.
Look at this, it has effects made in Blender: http://www.youtube.com/watch?v=zbYR2DWCE8c&feature=player_embedded#!
Is it possible to upload KinectToOsc binaries for 64-bit Windows? I'd like to test the system. I even bought a Kinect for this, but I couldn't run KinectToOsc.exe on my Windows 7 64-bit.
They are having the same problem everyone has when they only use one camera: not enough tracking. iPi is starting to use two Kinects, but Kinects are so expensive. Blender should be looking at how iPi does its motion tracking with six PlayStation Eyes (much cheaper). http://www.youtube.com/watch?v=msRtIZX529Q&context=C3fe8ebeADOEgsToPDskLTyxImI6pr-R5SfFeN4u3T
Hi, I have a problem. When I click KinectToOSC, it just brings up a window that says KinectToOSC has stopped working. I've tried it on Windows 7 32-bit and 64-bit with Blender 2.59 and installed all the software (Kinect SDK, Visual Studio). Any suggestions?
Thanks!