It's only been a few weeks since Google Summer of Code 2011 started, and the projects have already resulted in pure awesomeness.
Especially the so-called "Tomato branch", in which Sergey Sharybin is developing camera tracking for Blender, has already produced some nice and shiny features! He is integrating algorithms from libmv, a motion tracking library, into Blender. And while we do not have 3D camera tracking yet, we already have a pretty decent 2D tracking system. It is really usable and quite stable, and with just a few tiny markers you can achieve impressive results!
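For the curious, the core idea behind 2D point tracking can be sketched in a few lines: the pixels around a marker in one frame are compared against nearby candidate positions in the next frame, and the best match becomes the new marker position. The sketch below is purely illustrative (plain Python, invented function names, normalized cross-correlation as the match score) and is not libmv's actual implementation.

```python
# Illustrative sketch of 2D point tracking via template matching.
# Frames are 2D lists of grayscale values, indexed frame[y][x].

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized patches."""
    n = len(patch_a)
    mean_a = sum(patch_a) / n
    mean_b = sum(patch_b) / n
    num = sum((a - mean_a) * (b - mean_b) for a, b in zip(patch_a, patch_b))
    den_a = sum((a - mean_a) ** 2 for a in patch_a) ** 0.5
    den_b = sum((b - mean_b) ** 2 for b in patch_b) ** 0.5
    if den_a == 0 or den_b == 0:
        return 0.0  # flat patch: no texture to correlate against
    return num / (den_a * den_b)

def extract(frame, x, y, size):
    """Flatten a size-by-size patch of the frame with its corner at (x, y)."""
    return [frame[y + dy][x + dx] for dy in range(size) for dx in range(size)]

def track(frame_a, frame_b, x, y, size=3, radius=2):
    """Find the position in frame_b that best matches the patch at (x, y)."""
    template = extract(frame_a, x, y, size)
    best = (-2.0, x, y)  # NCC lies in [-1, 1], so -2 loses to any real score
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            score = ncc(template, extract(frame_b, x + dx, y + dy, size))
            best = max(best, (score, x + dx, y + dy))
    return best[1], best[2]
```

A real tracker handles sub-pixel accuracy, marker drift, and much larger search areas, but the matching idea is the same.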
Here are two tutorials if you are interested in seeing what's already possible.
Four corner pin tracking with Blender
And, most impressive, image stabilization of shaky footage in Blender, a method developed by Francois Tarlier.
These are just examples of course, and the workflow can and will be heavily improved.
But it gives a good idea of what a powerful tool is being developed here.
You can find more information on the project's wiki page and at Graphicall.org.
30 Comments
This is pure awesomeness! I've already tried it out, and the examples from Sebastian and Francois are great!
Can't wait for the 3D tracking tools!
And here is another project done with 2d tracking / rotoscoping.
http://vimeo.com/25910107
Just want to add that Francois Tarlier already posted an update about a much easier method for stabilization:
http://www.vimeo.com/25785504
And Sergey is coding so fast... there's already lots of cool new features. :)
Yeah lol, the method in the tutorials is partly outdated already - Sergey is so fast!
I am impressed with their work... I think that Matthias deserves some credit too, as does Keir, who is working on the 3D tracking coming soon.
Agreed
Apart from Sergey's really good work, we need to say a huge "thank you" to
- Keir and
- Matthias
from the libmv project, who designed and implemented the algorithms that made this all possible.
Amazing work guys. Keep it up!
This is great! Currently the only thing holding me back from converting to a Linux operating system is that Adobe After Effects won't run on Ubuntu. It would be fantastic if Blender could replace AE. I wonder if area tracking (like what is done with the program Mocha) is also on the Blender agenda? I find area tracking is often superior to point tracking.
This is so exciting!!
Congrats to all involved =D
-Lee
Awesome.
Going off on a Tangent :)
One question. When you are done rendering the composite how do you match the sound with the movie?
I understand how he gets the composited video, but not how he gets the sound from the Sintel movie.
Do you go back and use the video editor and add it that way?
The reason I ask is that I want to add some muzzle flash and smoke to an actor holding a gun.
I don't understand how you add sound at the right moment when compositing, or how you re-add the sound to a scene if you first break it into individual images.
Thanks,
Greg
Wow, I'm really looking forward to this!
\m/ Awesome! \m/
Great work GSoC!
Daniel: Blender will not just get area tracking but a real matchmove solution :)
Greg: I guess using the Video Sequence Editor (VSE) in Blender is the way to go!
And found this :
http://vimeo.com/25277775
Someone could post something like a template .blend. That way it's only a matter of changing the video and redoing the tracking...
Really awesome work. I will use it for sure.
@Gottfried:
I guess it would be much easier to use a transform strip in the VSE and feed position, scale and rotate from the tracker values. Setting up a camera and the plane with a movie texture sounds a bit overengineered to me ;)
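To make the suggestion above concrete: two tracked points per frame are enough to derive the translate, rotate, and scale values a transform strip needs. Here is a back-of-the-envelope sketch in plain Python (not Blender's API; the function name and calling convention are invented for illustration). In Blender you would read the marker coordinates from the movie clip and keyframe the strip's channels with these values.

```python
import math

def strip_transform(ref_a, ref_b, cur_a, cur_b):
    """Translation, rotation (degrees), and scale that map a reference pair
    of tracked points onto the current frame's pair. Stabilization would
    apply the inverse of this transform to the footage."""
    ref_d = (ref_b[0] - ref_a[0], ref_b[1] - ref_a[1])
    cur_d = (cur_b[0] - cur_a[0], cur_b[1] - cur_a[1])

    # Scale: ratio of the distances between the two tracked points.
    scale = math.hypot(*cur_d) / math.hypot(*ref_d)
    # Rotation: change in the angle of the segment joining the points.
    rotation = math.degrees(math.atan2(cur_d[1], cur_d[0]) -
                            math.atan2(ref_d[1], ref_d[0]))
    # Translation: motion of the segment's midpoint.
    mid_ref = ((ref_a[0] + ref_b[0]) / 2, (ref_a[1] + ref_b[1]) / 2)
    mid_cur = ((cur_a[0] + cur_b[0]) / 2, (cur_a[1] + cur_b[1]) / 2)
    translation = (mid_cur[0] - mid_ref[0], mid_cur[1] - mid_ref[1])
    return translation, rotation, scale
```

A single tracked point gives you translation only; the second point is what makes rotation and scale recoverable.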
is any kind of 3D motion tracking planned? I haven't been following as closely as I would like to.
This is awesome stuff. Very pleased that it is integrated into Blender! I will try it very soon.
Congratulations
Wow that's amazing work :D
@Matt Heimlich
Keir recently added basic 3D motion tracking to libmv, and it is currently being integrated into Blender by Sergey.
Will this code (or libmv) include automatic detection of lens parameters? This will be critical for compositing physical and rendered elements. The demo above uses an orthographic camera, but if I understand correctly, the straight edges of the rendered/composited video plane will never quite match the edges of the real source footage due to lens distortion (unless the original footage has had distortion correction applied previously).
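The mismatch described here comes from radial lens distortion. A simple way to see it is the polynomial radial model commonly solved for during camera calibration; the sketch below is illustrative only (the coefficients k1 and k2 are made-up values, not anything libmv computes).

```python
def distort(x, y, k1, k2):
    """Apply polynomial radial distortion to normalized image coordinates
    (x, y), where (0, 0) is the optical center of the lens."""
    r2 = x * x + y * y
    factor = 1 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```

With a negative k1 (barrel distortion), two points on the same vertical line in the undistorted image, say (0.5, 0.0) and (0.5, 0.5), end up at different x coordinates after distortion, since the farther point is pulled inward more. That is exactly why a perfectly straight rendered edge can't line up with the footage unless the distortion is corrected or matched.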
Still very young in its development, but having a solid 2D/3D camera tracker in Blender is an ace idea.
I would prefer that the execution of the tracker (both 2D/3D) take place in a node in the compositor. As it stands now, it's way too cumbersome.
Still exciting stuff though,
--
@Jeff B
To start there won't be automatic detection of lens parameters. We are building a tool to make it possible to calibrate your camera very accurately, but it will require that you print out a calibration pattern and take some photos of it with the camera you shoot with.
We understand that having to calibrate your camera is inconvenient, and we are working to remove this limitation, but it won't be added for a while.
Thanks for everyone's enthusiasm! Things are moving fast on both the libmv & Blender side; I'm excited to ship something that you guys can use in production.
One of my dreams come true - the others being global peace and
bakeable GI right within 2.5. Not much hope for the former, but
for the latter, methinks Cycles should be able to grant it quite soon ..
As always : kudos and thanks to the devs ;) ! You guys rock ..
Cheers,
______
JG
Really nice work!
I tried the other link, but sometimes it gives an error, so kindly send the corresponding link for the Ubuntu 10.04 32-bit build.
This looks like an amazing new feature for Blender, and having seen a few videos of it, it looks much better than Icarus, so I would like to thank everyone involved in its development! I am very excited about the potential of this software, but, unfortunately, I have been experiencing a problem where both automatically and manually placed features don't track past a few frames of the video. To clarify, I have a 1080p HD video with minimal motion blur and motion that can easily be tracked in Icarus. Also, judging by the speed and accuracy of the track on the first few frames, this seems like an error. If anyone is having a similar problem or might know the answer, I would really appreciate a solution. Thanks again to everyone involved in the project and whoever might answer my question!
Here is an open source camera tracker that learns from its mistakes in real time: http://info.ee.surrey.ac.uk/Personal/Z.Kalal/
Maybe that feature could be added to Blender's camera tracker.
What is the Blender Tomato branch? I don't see anything about it on the Blender website, and I am just wondering what it is - is it legit?
It was the codename for the motion tracker project, which is now part of the main release.