Tomato tracker Sergey Sharybin shows a glimpse of a new feature for the tracker: object tracking!
that is really cool!
this is the best birthday gift!
I wonder if there will ever be geometry based tracking in Blender, similar to the way pftrack does it?
Happy birthday, blazraidr !
My girlfriend has birthday today as well. Funny.
And that tracking is really, REALLY cool!
I'm the primary libmv developer. While geometry tracking won't get added for a while, tracking based on geometry is on my list. Before it can get added, the least squares module in libmv needs to get replaced, which won't happen for a few months. Also, a fair amount of Blender UI work is necessary. How important is geometry tracking in your experience?
First of all, thanks for the awesome libmv! Geometry-based tracking is very important for object tracking. Most object tracking tasks involve tracking a marked object with known parameters (geometry).
I mean, if one does a lot of object tracking, being able to provide the solver with geometry gives a cleaner and more precise solve.
Well, in my experience testing geometry tracking, it provided a quick and solid track without the need for several tracking markers. I think it will allow quicker object tracks to be made, since right now it is quite tedious to individually track and manage the track markers in Blender. They quite often jitter off their point of interest and need to be re-positioned several times throughout the course of the track. An "automatic" button, similar to SynthEyes, would suffice, though I am not aware of the technical side of things required for this, so I am not sure if it is easy or hard to implement.
While I am extremely impressed by the development of Blender over the past years, if geometry tracking were developed and built into Blender, it would become one of only two(?) applications to have a geometry tracking feature, with the bonus that it is free. I wouldn't say it is absolutely required in Blender, but if you can develop it, then why not? It would also open the eyes of several industries to give Blender a chance, getting past the free and open-source stereotype.
These are just my thoughts at the moment.
Thanks for your detailed comment. What you say makes sense: geometry tracking is important for reliable object tracking, since for object tracking there are generally fewer trackers for the solver to work with.
In conclusion, geometry tracking is on the list but won't be feasible for several months since we need to replace the refinement solver in libmv with something more general first.
can this be used for full body motion capture .... not real time of course
This is not true, unfortunately. Structure from motion, the algorithm libmv uses for Blender's motion tracking, relies on rigid motion. This means that non-deforming objects such as cars, trucks, buildings, and airplanes track well. Humans are not rigid, so tracking doesn't work.
Handling motion capture requires a separate set of routines, and at least 1 witness camera in addition to the primary camera.
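The rigid-motion assumption mentioned above can be stated concretely: under rigid motion, pairwise distances between tracked points never change. A minimal sketch of such a check (illustrative only, not part of libmv or Blender):

```python
import numpy as np

def is_rigid(points_per_frame, tol=1e-3):
    """Check whether a tracked point cloud moves rigidly.

    Under rigid motion (rotation + translation), the pairwise
    distances between points stay constant from frame to frame.
    points_per_frame: list of (N, 3) arrays, one per frame.
    """
    ref = np.asarray(points_per_frame[0])
    # Pairwise distance matrix for the reference frame.
    ref_d = np.linalg.norm(ref[:, None, :] - ref[None, :, :], axis=-1)
    for pts in points_per_frame[1:]:
        pts = np.asarray(pts)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        if np.max(np.abs(d - ref_d)) > tol:
            return False  # distances changed -> the object deformed
    return True
```

A car rotating and translating through a shot passes this test; a person whose limbs move relative to each other fails it, which is why humans break the solver.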
Out of curiosity: how is occlusion of markers handled in motion capture?
It depends how the system is implemented, but I imagine occluded markers don't get constraints added for the frames they are occluded, which may lead to a less accurate solution.
The software for Motion Analysis cameras handles discontinued tracks in several ways. For rigid bodies the missing positions can be calculated; for non-rigid ones you select the missing segment and click a button that tries to interpolate the coordinates. There are different algorithms behind the buttons, and slight automation was possible.
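The interpolation step described above could look roughly like this (a hypothetical helper, not the Motion Analysis software; real systems fit splines or apply rigid-body constraints rather than straight lines):

```python
import numpy as np

def fill_gap(xs, ys, valid):
    """Linearly interpolate 2D marker coordinates across frames
    where the marker was occluded (valid[i] is False).

    xs, ys: per-frame marker coordinates (occluded values ignored).
    valid:  per-frame boolean, True where the marker was seen.
    """
    frames = np.arange(len(xs))
    good = np.asarray(valid, dtype=bool)
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    # np.interp fills each missing frame from its visible neighbours.
    xs_filled = np.interp(frames, frames[good], xs[good])
    ys_filled = np.interp(frames, frames[good], ys[good])
    return xs_filled, ys_filled
```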
How about attaching rigid markers on a person's legs/arms/etc and tracking them separately?
You'd have to have lots of markers on each limb. It's unlikely you could get a good track this way.
Oooooh, this is going to be so much fun!
This is awesome!!! Do you think it will be ready before the final release of 2.61?
I'm trying to convince Sergey to get this in for 2.61!
Freakin' awesome!
Yep, giorgio, this is freakin' awesome ;-)
Awesome!!!!!! When will it be in any build of Blender?
Oh my god, Sergey you are just awesome!
Yes! I have been waiting for object tracking since I first saw that Blender was getting camera tracking. Can't wait till it is out for use.
This is very cool
Awesome job. :D Yeap. This was needed. If there's time and interest, I think this might be something to consider too now that Blender has motion tracking. Though multiple cameras would be almost a necessity for quality captures. http://www.youtube.com/watch?v=vn8aLUB7koQ
Multiple camera support isn't in yet and won't make it for 2.61.
There's no hurry really. XD
I thought multi-camera support wasn't even planned, to tell the truth. I don't know half of what's planned, though I tend to read the news here pretty often. If you make it support multiple cameras, however, MoCap support (or however it's typed) could be a real treat for some people. I can't afford any camera rigs myself, so I'll just settle for some basic facial-following techniques from one angle.
Facial controls might actually work already with just one camera. If the dots can move independently a bit and still keep track of the position of my head, I think I might use the object movement to compensate my head movement and get normalised facial controls. Well, one can hope. I'll dive into the python more deeply on the motion tracking side a bit later. Think this might work or at least be worth some experimenting?
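The head-motion compensation idea above could be prototyped roughly like this (hypothetical sketch; it assumes you can export per-frame marker positions plus a solved rigid head rotation/translation, e.g. via Blender's Python API, and the names here are made up):

```python
import numpy as np

def normalize_markers(markers, head_R, head_t):
    """Remove solved rigid head motion from facial marker positions.

    Each marker is re-expressed in head-local coordinates, so only
    the facial deformation remains as a "normalized" control signal.
    markers: list of (N, 3) world-space marker arrays, one per frame.
    head_R:  per-frame 3x3 head rotation matrices (from the solve).
    head_t:  per-frame head translation vectors (from the solve).
    """
    out = []
    for pts, R, t in zip(markers, head_R, head_t):
        # Invert the rigid transform: p_local = R^T (p_world - t).
        # For row vectors that is (p - t) @ R.
        out.append((np.asarray(pts) - t) @ R)
    return out
```

If the head transform is accurate, a marker that only rides along with the head becomes stationary in the output, while genuine facial movement survives.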
Yep. This is going to make not upgrading my SynthEyes license a good choice. This is so AMAZING!
To be fair, the tracking support in Blender is far from what SynthEyes offers. The basics work but there are many other parts of SynthEyes that Blender doesn't handle now. I'd prefer to not oversell the Blender motion tracker, since it is not as good as the commercial tools yet.
Could you or Sergey give some hints about the tracking settings used, like focal length, undistortion, etc.? I've been trying to track the same plates but I can't get a good solution. All the tracks have high solve error even though they look rock solid on the footage, and the camera constraint is jumping all over the viewport like a frog in a frying pan!
Also, the tracker says it needs 8 markers on each keyframe in order to solve the camera (I'm assuming those have to be the same markers). How does it handle frames where the keyed markers all go out of view?
Object tracking is tricky because the trackers take up so much less of the frame, which makes solving less stable. The best guidance I can give is that you should try a couple of different settings for the two keyframes, and try to get the initial camera intrinsics as close as possible. There isn't any autocalibration yet, only intrinsic refinement, so you still have to have a decent guess as to the focal length, center of projection, etc.
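To see why the initial intrinsics guess matters so much, here is a rough sketch of how a wrong focal length inflates reprojection (solve) error even when the 2D tracks themselves are perfect. This uses a bare pinhole model with no distortion and is purely illustrative, not libmv's actual error metric:

```python
import numpy as np

def reprojection_error(points3d, points2d, focal, center):
    """Mean pixel error of projecting known 3D points with a
    simple pinhole camera (focal length in pixels, no distortion).

    A wrong focal-length guess produces a large error here even
    when the 2D tracks match the true projections exactly.
    """
    err = 0.0
    for (x, y, z), uv in zip(points3d, points2d):
        proj = np.array([focal * x / z + center[0],
                         focal * y / z + center[1]])
        err += np.linalg.norm(proj - np.asarray(uv))
    return err / len(points3d)
```

With tracks generated by a 1000 px focal length, evaluating the same tracks with a 900 px guess yields errors of several pixels per marker, which is the "rock-solid tracks, high solve error" symptom described above.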
This is so awesome. I agree, though, that we need some kind of body/face tracker thing, like an object tracker that can deform its mesh.
It's on the list of things to add, but this won't happen for months.
These really are impressive developments, greatly extending Blender's capabilities.
So...we can officially use Blender to create lightsabers in the next Star Wars movie? :)
I can't believe how quickly the good news keeps on coming from you guys! Thanks so much!
To Sergey and Keir, thank you guys so much for your work! My brain has been going crazy with the new creative possibilities that you have opened up to us all. I did my first camera track yesterday (faux-object tracking) with a webcam and a plate covered in markers. It was amazing to see it all work so well!
How did that work? I thought the tracker didn't have any compensation tools for working with rolling shutter footage.
Compressed footage (let's say good-quality YouTube stuff) or deinterlaced footage from an interlaced camera gave me a lot of trouble. It surprises me that webcam footage works.
The website http://www.hollywoodcamerawork.us/ is a really good starting point to get good quality footage to practice this stuff. You can for instance download a lot of green screen plates to practice compositing and camera tracking. Every set of plates has suggestions for exercises with it and sometimes even finished CG renderings that you can use in the composite. I messed around with a lot of bad footage before finding these guys!
Really interesting feature! However, if I understand what I see correctly, the markers (tracker targets) have to be inside the volume of the 3D object that will be added (like this big grey weapon). If I want to add a realistic small weapon or another small object, I will have to erase the markers and reconstruct the character frame by frame at the marker locations (by texture cloning/painting). Right?
That's correct. There are two competing constraints for object tracking: (1) having sufficiently many trackers that solving the object motion is reliable and accurate and (2) not having so many markers that important parts of the scene are occluded.
Currently there is no facility in Blender to aid in the task of reconstructing occluded stuff, besides perhaps the UV project code.
Is it possible to develop this Warp Stabilizer feature with the help of the tracker feature in Blender?
Sergey, the camera tracker was amazing. So much more than I expected, and now you are giving us more goodies! Thank you so much! I am going out to my shop to make some props so I will be ready.
Hi, I have an issue. With the latest release I can't do tracking anymore. The movie will not play. It worked in an earlier version. Does anyone have an answer?