Marco Rapino is doing the coolest experiments with Blender. Last time, he linked his N95's accelerometer to Blender. Today, he shows how to do real-time laser or object tracking using nothing more than a webcam, a laser pointer and a pen.
I already wrote to you a week ago about how to use the N95's accelerometer in Blender. This time I wanted to show you how to create an object and laser tracking system and use it in the Blender Game Engine. Again, as underlined in my previous post, the aim is to show the endless possibilities that Blender has, this time by creating a motion capture system (coded in Python) that can be used as an input in the BGE :) In this video you can see how it works; if you want more information or want to download the sources, available in a couple of days, you can check it out on my site.
Happy hacking with Blender!
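Marco's sources aren't posted yet, so as a rough illustration of the idea only, here is a minimal, hypothetical sketch of the core step: find the centroid of the red-dominant pixels in a frame and normalise it to [0, 1]. The nested list of (r, g, b) tuples stands in for what PIL's Image.getdata() would give you from a webcam; the threshold values are made up.

```python
# Minimal sketch (not Marco's actual code): centroid of red pixels,
# normalised so the BGE side is independent of webcam resolution.

def track_red(frame, threshold=200):
    """Return the normalised (x, y) centroid of red-dominant pixels, or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r > threshold and g < 100 and b < 100:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no laser dot in this frame
    h, w = len(frame), len(frame[0])
    return sum(xs) / len(xs) / (w - 1), sum(ys) / len(ys) / (h - 1)

# Tiny 4x4 synthetic frame with one red dot at pixel (2, 1)
black = (0, 0, 0)
frame = [[black] * 4 for _ in range(4)]
frame[1][2] = (255, 0, 0)
print(track_red(frame))  # -> (0.666..., 0.333...)
```

Normalised coordinates make it trivial to map the dot onto any object or camera rig inside the game engine, whatever the capture resolution.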
And here is another application of the camera tracking script: http://vimeo.com/5314331
Gooooood Lord Mike, I didn't know somebody had already made something with that! And you definitely had a brilliant idea using that red box to move the camera! God, I love open source; see what comes out of sharing knowledge! I learnt how to track objects from another guy and adapted it to the BGE. Then you did the same with me and created another way to use what I did. Let's see what comes next. I don't want to be an MS rival (see E3 in L.A. with the new Xbox controller), that would be ridiculous :D, but if we can get even 10-20% of what they do using open source tools and our brains, just for the pleasure of inventing and sharing more than for money, then that makes my day!
Great idea Marco..
Why not interface the wii controller?
Thanks Saverio, because they already did it :) Check it out here:
Anyway, now I'm curious to try the Wiimote again, since the new Wiimote Plus should provide z-axis rotation (gyroscope), which is really needed for a full tracking system, for example head tracking. Let's see; I love exploring alternative ways of using Blender. For example, did you know that you can blow on objects in Blender just using a microphone? :D Soon I'll put the script on my blog ;)
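Marco hasn't posted that microphone script yet, but the idea presumably reduces to mapping mic loudness to a push force. A hedged sketch under that assumption, with a synthetic sample buffer standing in for the sound-card input:

```python
# Hypothetical sketch of "blowing on objects with a microphone":
# compute the RMS loudness of an audio buffer and map it to a force
# the game engine could apply. The buffers here are synthetic.
import math

def blow_force(samples, max_force=10.0):
    """Map the RMS amplitude of samples in [-1, 1] to a force value."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(rms, 1.0) * max_force

silence = [0.0] * 64
gust = [math.sin(i / 3.0) for i in range(64)]  # fake "blowing" waveform
print(blow_force(silence))  # -> 0.0
print(blow_force(gust))     # roughly 7: a solid push
```

In a real setup the buffer would come from the sound card each logic tick, and the force would be fed to the object's physics.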
I think a 3D controller like the Wiimote is interesting for 3D modeling and animation too..
for example to sculpt, or to record the camera position in real time..
That looks very promising. Did you think about adding the ARToolKit's capabilities? That would be neat. Regards, Christian.
that's nice! Good job Marco!
I've still got a few questions:
What's your fps in the game engine?
How many fps does your webcam do and what's its resolution?
Where do you see best chances of improving latency, fps and accuracy?
thx for sharing!
Well, that's a possibility. Even though I have to admit I haven't used ARToolKit that much, it looks amazing. Thanks for asking; I hadn't thought about it, and I'll see what can be done, since it's multi-platform just like Blender, so it looks like an excellent combination :)
The fps is around 45-50 in the BGE. There's some slow down due to the socket connection, but not that much.
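The socket link Marco mentions isn't published, but a low-latency way to wire a tracker to a BGE script is a local UDP socket: the tracker fires one small packet per frame, and the game side polls it each logic tick. A hypothetical sketch (port choice and message format are assumptions, not from the original script):

```python
# Hypothetical tracker -> BGE link over loopback UDP. Dropping the
# odd packet just skips one frame, which suits real-time tracking.
import socket

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))   # any free port; a real script would fix one
recv.settimeout(1.0)
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"0.42,0.66", addr)               # tracker side: one frame's (x, y)

x, y = map(float, recv.recv(64).split(b","))  # BGE side: parse this tick
print(x, y)  # -> 0.42 0.66
send.close()
recv.close()
```

Each packet is a handful of bytes, which is consistent with the socket hop costing only a few fps.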
My webcam runs at 400x300 and 25fps in the example. I recommend not using a high resolution, because that would slow down the calculations on the image matrix; increasing the fps also means more CPU load, since you have more frames to compute per second :)
Well, porting what I did to a compiled C/C++ binary would speed up the BlenderTrack script, and using OpenCV instead of a homemade function plus PIL would improve accuracy. I couldn't use OpenCV here because my webcam was not supported (damn it!), but I'm definitely planning to buy a new one that is supported, also because I want to port this to OSX and Linux. On Linux I already have a good result using PyGTK with GStreamer, but having one OpenCV-based app for all operating systems beats writing three different applications, one per platform :)
Could your system expand to include a live, cleaner (than webcam) feed from a video camera?
If so, how close is live, direct import of video into the nodes or sequencer?
Going off at a tangent, how close are we to using Blender's VSE scopes to preview a live video feed, sort of a really basic Adobe OnLocation, for video acquisition?
Alternatively, does anyone know a decent open source video scopes app to sample a live feed while videoing?
Yes, with some small rework you can use video as input rather than a live webcam stream, if I got what you mean :)
Is that something you would consider doing, firewire input as a live stream?
If the live feed input (firewire + USB) code was a separate module in Blender, it could in theory be used in other areas of Blender? As a composite node input? As a sequencer input maybe?
Yes, since it's a separate module it doesn't have to be used with Blender at all. It can be used any way you wish, since you get a PIL-format data stream, so yes, compositing and sequencing too :)
At the moment I'm not considering this, because I want to focus on the BGE and on creating new ways of interacting with Blender and games. However, the code is structured in a way that can easily be reused by any other app!
Glad to be able to be your inspiration :)
I found a related news entry here:
I remember that I took a look at it, but it was not working in realtime.
This community really needs such an active participation which improves the capabilities of Blender.
Keep up the good work !
Impressive stuff. I was working on something similar a couple of months back using OpenCV in the game engine, see:
I couldn't quite get the right balance between accuracy and responsiveness though.
This just amazes me. You are a genius. Thanks, for showing this.
Kudos for the inventive spirit, guys... Really inspiring stuff!
Cheers guys :)
Yes, finding that balance was the hardest part, and it is still really sensitive to light and shadows sometimes.
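Neither script is posted, but one common way to trade responsiveness against jitter (an assumption about the approach, not what either script actually does) is to exponentially smooth the raw centroid: alpha near 1 follows the target quickly but keeps the noise, alpha near 0 damps the noise but lags. A minimal sketch:

```python
def smooth(points, alpha=0.5):
    """Exponential moving average over a stream of (x, y) points."""
    sx, sy = points[0]
    out = [(sx, sy)]
    for x, y in points[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# A jittery centroid track around (0.5, 0.5): a low alpha damps the jumps.
raw = [(0.5, 0.5), (0.9, 0.1), (0.1, 0.9), (0.9, 0.1)]
print(smooth(raw, alpha=0.2)[-1])  # stays near (0.5, 0.5)
```

Tuning alpha per lighting condition is one knob for the light/shadow sensitivity mentioned above, at the cost of a little latency.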
I just checked the video, and yes, it wasn't in real time, even though it looks interesting anyway!
Let's keep inventing guys, Blender needs our creativity :D
Project Natal? Who needs it XD
Wow, all the things you can do with Blender :)
Hmm, I think it could be used with what they call "augmented reality"...
More than one of us would love that :)
The OpenCV video also looks great...
Thanks for sharing, great great stuff. Can't wait to see more of your work in the future.
Marco, actually my inspiration came from a colleague who is obsessed with HCI (human-computer interaction), and I always wanted to try to integrate computer vision with Blender. Your code helped me get started with it. I considered OpenCV, but it's too complicated for me. Good stuff!
Thanks Miguel, I'll definitely keep you guys updated!
Cool, Mike, that you have such inspiring colleagues :D Don't be scared by OpenCV, it looks worse than it really is :)
Your ideas on using rather uncommon tools to loop information into Blender are just awesome :)
I really enjoy watching those videos and I'm eager to try it out myself one day or another.
Great work, cool inventions that you made there!!
A super program, I just got it running successfully.
My son and I want to use it for 2D position tracking of model ships;
if it's of interest, I will report on the modifications.
As we are just starting to familiarize ourselves with Python:
does anybody have an idea how to simplify and speed up the program for 2D?
I am trying to identify and track several objects, say red, green, yellow and blue LEDs.
Thanks for sharing
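For the multi-LED question above, one simple (hypothetical, not from the original script) approach: classify each bright pixel by its nearest reference colour, then take one centroid per colour. A pure-Python sketch on a tiny synthetic frame standing in for the webcam image:

```python
# Hedged sketch: 2D tracking of several coloured LEDs at once.
LEDS = {"red": (255, 0, 0), "green": (0, 255, 0),
        "yellow": (255, 255, 0), "blue": (0, 0, 255)}

def nearest_led(pixel):
    """Name of the reference colour closest to this (r, g, b) pixel."""
    return min(LEDS, key=lambda n: sum((a - b) ** 2
                                       for a, b in zip(pixel, LEDS[n])))

def track_leds(frame, min_brightness=200):
    """Return {colour_name: (x, y) centroid} for pixels bright enough."""
    hits = {}
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if max(px) >= min_brightness:  # ignore the dark background
                hits.setdefault(nearest_led(px), []).append((x, y))
    return {name: (sum(p[0] for p in pts) / len(pts),
                   sum(p[1] for p in pts) / len(pts))
            for name, pts in hits.items()}

frame = [[(0, 0, 0)] * 5 for _ in range(5)]
frame[1][1] = (250, 10, 10)   # red-ish LED
frame[3][4] = (10, 10, 240)   # blue-ish LED
print(track_leds(frame))  # -> {'red': (1.0, 1.0), 'blue': (4.0, 3.0)}
```

Since the ships move on a plane, the two centroid coordinates are already the 2D position; no depth estimation is needed, which keeps it fast.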
OpenCV brought the speed-up of the tracking process I was looking for. For those who are interested, please refer to http://www.jperla.com/blog/2007/09/26/capturing-frames-from-a-webcam-on-linux/
Otherwise very similar.
The music in the video made my day :) Who is it by? If you made it yourself, would you mind telling me what software was involved? Thanks.