Demo of tracking workflow in Blender

Sebastian König gives a great 7-minute demo of the upcoming camera tracking functionality being developed in a Google Summer of Code project.

Sebastian writes:

Here's a quick demo of the tracking workflow in Blender, as it was this morning. But it has already gotten better since then. :D Hard to keep up with development. This is becoming a serious alternative to other (commercial) trackers. When GSoC is over, we'll have a mature tracker right there.

About the Author

Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

66 Comments

  1. Rick de Wolf

    This is great! You only need some more VFX tools right inside Blender, and a bit of patience for the GSoC to finish. After that, there's no need for any commercial programs anymore!

  2. Aw, c'mon! What's a great 3D package without integrated camera tracking functionality? ;) BLENDER IS AWESOME!

  3. This is totally awesome! I'm looking forward to replacing After Effects with Blender and then telling all my friends about it. Blender has a bright future.

  4. We really need this ASAP! It's like, WOW, I could do unbelievable things with that!
    These guys are gods... they rock!

  5. This looks very promising. I guess when GSoC is over, Blender will be one of the greatest graphics packages in the world!

  6. @Sebastian
    If you read this: please consider making a tutorial about how 3D objects can cast shadows onto the 2D footage (like Suzanne casting a shadow on the box)...
    I couldn't quite follow that part of the video :P

    And the demo was truly awesome by the way ;)

  7. Magiciandude

    @TLOZ:
    He modeled the box in 3D purely for the purpose of getting the shadow data from Suzanne...then he split the scene into render layers...one with the box with the shadow data and one with just Suzanne. He then multiplied the shadows over the footage and placed Suzanne over that in the compositor.

    These tracking features are looking incredible! Soon I might be able to replace Nuke with Blender :D
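
    To make the node chain described above concrete, here's a rough sketch in Blender's Python API (my own placeholder setup using a recent bpy API, not Sebastian's actual file or the 2011 Tomato build; the footage would be loaded into the Image node and each Render Layers node pointed at its own layer):

        import bpy

        scene = bpy.context.scene
        scene.use_nodes = True
        tree = scene.node_tree
        tree.nodes.clear()

        footage = tree.nodes.new("CompositorNodeImage")    # the tracked plate
        shadows = tree.nodes.new("CompositorNodeRLayers")  # layer: box catching Suzanne's shadow
        suzanne = tree.nodes.new("CompositorNodeRLayers")  # layer: Suzanne only
        # set shadows.layer / suzanne.layer to the names of your render layers

        multiply = tree.nodes.new("CompositorNodeMixRGB")  # darken the plate with the shadow pass
        multiply.blend_type = 'MULTIPLY'
        over = tree.nodes.new("CompositorNodeAlphaOver")   # place Suzanne on top
        comp = tree.nodes.new("CompositorNodeComposite")

        links = tree.links
        links.new(footage.outputs['Image'], multiply.inputs[1])
        links.new(shadows.outputs['Image'], multiply.inputs[2])
        links.new(multiply.outputs['Image'], over.inputs[1])
        links.new(suzanne.outputs['Image'], over.inputs[2])
        links.new(over.outputs['Image'], comp.inputs['Image'])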

  8. One question which baffles me: how do I calculate the correct camera settings (the real ones and Blender's internal ones)? I know the 38mm-equivalent focal length of my footage is 29.8mm when fully zoomed out.

    How do I feed this information...

    1. into the tracker (there are two fields where I can enter a length value and I'm a little confused about this ;-)
    2. how do I calculate the value for the actual Blender camera after tracking (Sebastian calculates something like 32/(lower value from 1.)*(upper value from 1.))

    In addition to this: this is really great work.

  9. @TLOZ:
    Exactly as Magiciandude says. And if you want to know more about that, I have done a 5 hour video tutorial just about that. :) http://cmivfx.com/tutorials/view/255/Blender+3D+Compositing

    @Mannheimer:
    As of later today, the Blender camera focal length will be calculated automatically.
    The formula I used was based on the fact that currently Blender's backplate size is 32mm by default. So if your camera has a different sensor size (which is most certainly the case), then you have to compensate for that: 32/(your sensor size) * (your focal length). But as I've said, that should be automatic now.
    In the camera data panel you enter your sensor width and the focal length. (The Canon 550D/T2i has a sensor width of 22.3mm, afaik.)
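
    As a worked example of that formula (a quick sketch with illustrative numbers only, not taken from the video):

        sensor_width = 22.3        # mm, e.g. a Canon 550D/T2i sensor
        real_focal_length = 29.8   # mm, the focal length the shot was filmed at
        blender_focal = 32.0 / sensor_width * real_focal_length
        print(round(blender_focal, 1))  # -> 42.8, the value for the old default 32mm backplate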

  10. You guys are just awesome...

    Because of Blender (and especially its renewed GUI) I dropped 3ds Max six months ago.

    And it is just an installation file under 50 MB.

    And it will keep getting better and better...

    Open source is the future.

    Great!

  11. @sebastian_k

    Now it's clear to me :)
    Thanks for that quick answer and also for sharing this video.

    Greetings from Konstanz
    (far from home ;-) )

  12. Let's hope this project will make it into Blender and not get lost like so many previous projects.

    This looks very professional, and I agree with sebastian_k.

  13. Very cool! Really coming together. Curious about automatic tracking and tools to help with motion blur, rolling shutter and high frequency noise. I'm sure all that will be there in the future. Plus survey data tools, lens distortion workflow and automatic object tracking.

    Great work! I'm very impressed.

  14. Sergey, you are a genius. Ton you are a genius. Libmv team, I thank you for all of the previous work you did on Libmv. You are also geniuses. Blender is so good for a free program most people don't believe me when I tell them it is free. :D Keep strong! God Bless.

  15. This is truly amazing. I can't wait for that timelapse to be turned into a tutorial.
    Maybe the guys over at BlenderCookie should get the tomato branch!

  16. Yes, it looks great and is scary, but without a manual and a really easy user's guide it won't get used. I don't understand what I've already got and can't use it. I forget what the buttons mean, become more and more easily confused, and give up. The whole creative process stalls. If you can think all those ideas through, I hope there are no confusing options or inexplicable dead ends. It all looks a bit gothic, and users will turn away because it can't provide what it claimed to them. Nothing stalls creative thinking like being told "you can't do it" and the machine stopping.

  17. Your work is awesome! I'm also a GSoC student (GIMP) and seeing your project I'm amazed by the amount of work you managed to push before the midterm. Keep up the magnificent work ;) Very useful indeed, I needed stuff like that.

  18. Each time I look at new Blender news and features, I have to use the same standard word: WOW!

    This will really be a great improvement. Currently I use a separate program for camera tracking, but I use a very old version (2.43) of Blender for matchmoving, because the exporter script is old. As I have not done tracking for a while, I must admit that I haven't checked recently whether a new exporter is available, but having this feature inside Blender will solve many compatibility problems!

    Great work guys ! Congratulations, and a big Thank You !

  19. BlenderManiac

    My Blender crashed often when I manually tracked a 20-second 640x480 movie.
    My AE crashed often when I made my 60-second 1280x720 slow-mo project.
    Nothing is perfect, but I never believed there would be a 2D/3D tracker for Blender, especially inside Blender.
    But some day, if I have kids, they will love vegetables, especially "TOMATO"s.
    Thanks to everyone working on that project, and to those who share this with us.

  20. Sebastian, why are you disabling all the comments on your vids?
    If you hadn't done that, I would have spammed you with how good you are.

  21. I just installed the latest Tomato build from graphicall.org and it's quite tricky to get the trackers to stick. You need surfaces with quite a bit of texture, apparently. I tried a bunch of different clips and I'm having some trouble. Also, it only wants to play the clip when you have just added a new tracker, and it doesn't want to jump back to the beginning. I'm sure it's because I don't know what I'm doing and I'm not using the same build as the one in the timelapse. But it's clear that it's going somewhere.

    Update: I was being stupid. The play button in the tracking tool panel does not actually play. You click it and then play the clip in the timeline to check the result.

  22. So, back with an update. I really should be sleeping right now, but I got a successful camera track!
    I'm quite pleased with myself, but even more with the Tomato team at GSoC! Thanks guys!

  23. The most impressive thing I've seen on Blender, which continues to amaze. I have the branch checked out. Incredible work.

    The video is also a great tutorial.

  24. As Vegeta once said in the painfully-hilarious parody DBZ Abridged (created by Team Four Star):

    "F-------------------------------------------------------CK!"

    Kudos to the Blender developers who apparently train their awesomeness under 100 times normal gravity.

  25. I'm the main libmv developer. I'm glad everyone is so excited! I just wanted to add that we are going to make many of the things in this video automatic with time (e.g. keyframe selection, tracker selection, etc.), but we are getting the core stuff working first.

    @Flowers:

    We have lots of plans, but since there isn't much coding time, we're prioritizing the important stuff (like making sure basic solving works) first.

    About your specific points, I have some further questions:

    - motion blur: No plans yet. Do you mean handling shots with high motion blur? It's a tricky problem; we haven't thought about it yet.
    - rolling shutter: Yes, we're aware of this and have plans to compensate for it, but this won't come for a while.
    - high frequency noise: What do you mean by this?
    - survey data tools: What are survey data tools?
    - lens distortion workflow: Yes, this is important and we will have to support this. It's on the list.
    - automatic object tracking: What do you mean by this? Note that a moving camera with a static scene is the same as a moving object with a static camera.

    I'll keep paying attention to this thread, so if other people have questions I'll respond.

    Keir

  26. Absolutely incredible stuff! Not only the tracking but how fast it is implemented in Blender!

    Can it track moving objects in a static scene? I was wondering: if there were a video of a person doing something and the camera were moving slightly, could adding tracking points to specific parts of the body serve as a base for bones? So basically very cheap motion capture.

  27. Freaking awesome!

    Haven't really played with tracking since school; I remember manually measuring points on the floor/wall and entering coordinates into Max. This is sooo much better.

    This is turning out to be a great summer for blender :)

  28. @Hubberthus

    A rigid object moving in a static scene is the same as a static scene with a moving camera, from the perspective of libmv, so yes, that should work (and should work today). However, if the object is deforming, like a person, then it won't work.
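
    A toy illustration of that equivalence (my own numbers, using Blender's mathutils with 2.8+ operator syntax; this is not libmv code): viewing a rigidly moved point with a fixed camera gives the same camera-space coordinates as viewing the original point with a camera whose world-to-camera matrix absorbs the object's motion.

        from mathutils import Matrix, Vector

        world_to_cam = Matrix.Translation((0.0, 0.0, -5.0))                # fixed camera
        T = Matrix.Rotation(0.3, 4, 'Z') @ Matrix.Translation((1, 0, 0))   # rigid object motion
        X = Vector((0.5, 0.2, 1.0))                                        # a scene point

        moved_point_fixed_cam = world_to_cam @ (T @ X)
        fixed_point_moved_cam = (world_to_cam @ T) @ X
        assert (moved_point_fixed_cam - fixed_point_moved_cam).length < 1e-6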

  29. @Keir

    About the motion tracking: it would be awesome if one could define track groups and then have the camera solver operate on a specific group. That way, one could get a camera solution from one set of trackers (placed on static objects) and another for motion tracking.

    Also, I have a question. I managed to get a camera solution from a set of about 15 tracks; however, once one of the tracks gets lost, the tracking stops. If I want a camera solution for longer, am I just supposed to add keyframes?

    Thanks!

  30. @Fish,

    OK, so if a track gets lost, I can fix it. But what if the tracker is gone from the field of view or gets obscured?

  31. Just to make things clear:
    is this a Blender interface to a commercial/proprietary plugin (e.g. SynthEyes), or are you developing completely free (as in freedom) camera tracking software for use within Blender?

  32. Hello! Could someone build the latest Tomato branch for Win 32, please? There is only a Win 64 version that is updated frequently. I would love to use even this test build for production (I'm in the middle of a small movie project, and camera tracking inside Blender would be a great addition for me).

  33. @asd: it's completely free inside Blender, based on the free and open-source libmv.

    @PhysicsGuy: then there is no hope... so both keyframes should have 10 markers...

  34. Hi, Congratulations again.

    As I haven't yet tried any build with this amazing feature, I'd like to ask a question:

    The trackers are converted into a cloud of dots. Are these dots vertices of a Blender object? I don't say "mesh" because there are no edges or faces, but it could be considered a mesh as well.

    I ask that because the SynthEyes export script generates a cloud of "Empties", which is not handy at all for re-creating the 3D geometry of things. So I use a script made by a friend (David Bertin, AKA "GFA MAD") to convert the bunch of Empties into a vertex cloud belonging to a single Blender object.

    Vertices can be connected by edges and faces and they are ready to use.
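
    For what it's worth, here is a rough sketch of that Empties-to-vertex-cloud idea (my own minimal version for a recent bpy API, not GFA MAD's actual script). With the exported Empties selected, it builds one mesh object whose vertices sit at the Empties' positions:

        import bpy

        empties = [ob for ob in bpy.context.selected_objects if ob.type == 'EMPTY']
        mesh = bpy.data.meshes.new("TrackerCloud")
        mesh.from_pydata([ob.matrix_world.translation[:] for ob in empties], [], [])  # vertices only
        cloud = bpy.data.objects.new("TrackerCloud", mesh)
        bpy.context.collection.objects.link(cloud)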

  35. Jonatas Kerr de Oliveira

    Awesome! Congratz to all the developers (tomato and libmv)

    That's just what I'm waiting for! I'm doing a children's music project integrating video and 3D animation...

    The following information is my attempt to help in some way...

    Here are some important features for automatic 3D camera tracking:
    * Fully automated one-click track and solve - the software automatically selects which points must be tracked, tracks them while ignoring bad feature points, solves the camera, and generates the camera in the 3D view plus the corresponding mesh points.
    * Masks to isolate moving objects - a moving person can ruin the camera solve.
    * Lens distortion correction.
    * What is becoming ever more important is the ability to deal with stereoscopic data
    (smart tracking of scenes with more than one camera showing the same subject; PFTrack and SynthEyes can do automatic correspondence of features between the two pieces of footage and promise half the tracking time for this kind of work).

    * What would be great is the automatic creation of a textured mesh from a tracked video. But that's just a personal dream... hehehe

    Here's some reference for what most tracking programs can do:
    http://www.ssontech.com/synsumm.htm
    http://www.thepixelfarm.co.uk/product.php?productId=PFTrack&content=Features

    If there's any need for testing, please contact me... (jonataskerr .at. yahoo .dot. com .dot. br). I've used PFTrack and Voodoo in some projects... maybe I can test with some footage I've already used in other software...

    And thanks again! You guys are great!

  36. hello, fellow blenderheads!

    I would like to thank the "Tomato" team for the amazing work they are doing with the tracking system.
    I tried it yesterday with r38329 for Win64. I tried to follow the video and everything went well. After solving, when I pressed Alt+A to see the Blender camera movement, the camera was acting crazy, turning quickly from one side to another and going everywhere within the 3D view window... I'm sure that was my fault, but I can't seem to find out why.

    Another question I have is about the "Reconstruction" Keyframe 1:1 and 2:2. I don't understand what it is for.

    The last question is about the Camera Data. I don't know what data I should put in the sensor width and in the focal length. I don't own a video camera; I own an HP "Cybershot" digital photo camera that I use to make basic home videos. I was hoping to use it to shoot some footage and experiment more with this build. On the front of the camera are some numbers: "2,8-5,8/5,35-21,4". Is one of these numbers the data that I need to put in the Camera Data tabs? I know people like Mannheimer have already asked questions related to this, but I really don't know anything about focal lengths or sensor sizes... sorry.

    If someone could help me, I would much appreciate it.

    once again, thank you team! Great work.

  37. I wanna make a pterodactyl flying out from my monitor and smacking into the window...

    I feel like a kid with legos, mixed with a professional using legitimate tools.
    Blender seems to be constantly aiding this feeling,
    what do you call that feeling? oh yeah,...awesome.

    *thanks!*

  38. @roubal: the latest revision has a "convert bundles to mesh" button, and it converts the bundles to a vertex cloud :-)

    @damrs.3d: for reconstruction you need 2 keyframes with enough "difference" in depth, so there is enough info for 3D reconstruction.
    The latest Tomato builds need 2 numbers for the camera:
    sensor size in mm
    focal length

    You can get those from the tech specs, I guess.

    hth
    http://vimeo.com/26388535
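
    A minimal sketch of entering those two numbers on the Blender camera via Python (recent bpy API; the values below are examples only - check your own camera's tech specs):

        import bpy

        cam = bpy.context.scene.camera.data
        cam.sensor_width = 22.3   # mm, physical sensor width (e.g. a Canon 550D/T2i)
        cam.lens = 29.8           # mm, the real focal length the clip was shot at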

  39. Thank you, FISH.
    I spent all afternoon tweaking the values in the tabs and stuff until I found out how (almost) everything worked.

    and

    thank you SEBASTIAN_K for the tutorial! When there is someone explaining what happens and why, everything becomes easier to understand.
    Thank you guys!!

  40. I'm going to have to give this a blast some time soon. Break out the DSLR, shoot some wobbly footage, and then track it and add a dragon or a giant spider.

  41. @Keir Mierle

    survey tools

    Hello,

    If I remember correctly, survey tools are for integrating measurements of real scene properties into the camera solve.

    For example: those 3 tracking points on the left wall share the same height, etc.

    Or maybe you have a box, a building, or some other geometry in your scene with known dimensions; you can track all the visible corners and then give every single track the known data.

    And all that exact data will be used to refine the estimated camera solve.
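
    For what it's worth, a toy sketch (my own illustration, nothing from libmv) of how such a "same height" survey constraint could enter a solver as an extra residual term:

        def same_height_residuals(points):
            """points: reconstructed 3D positions of tracks known to lie at one height."""
            zs = [p[2] for p in points]
            mean_z = sum(zs) / len(zs)
            return [z - mean_z for z in zs]  # deviations from a shared height, to be minimized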

  42. I tried Tomato today and I was impressed by how well everything went! This is much easier to use than many commercial programs! Thank you, developers!!!
