After I published the 'Digital Makeup' video by Sebastian König this week, people were quick to point out that this approach didn't work for expressions. Well, no more: Sebastian figured it out.
Sebastian writes:
Another test of facial motion capture with Blender. I didn't tweak the materials or lighting much, so those could be improved. But the basic concept works, I think, even though it's a rather tedious workflow. Still, it's quite good and can make very advanced stuff possible.
There is a little bit of sliding, which would have to be improved too, either with more markers, or by adding some manual shape keys.
But as a start I think this technique is really helpful.
That head tracking device made tracking a LOT easier, because it adds a lot of perspective, but of course it would have to be painted out in a real production. Or hidden with hair/helmet/horns/wig or whatever. :)
All in all this took me 4 hours.
18 Comments
Dude! You coulda washed your face before shooting this video!
Dunno if you've noticed, but you've lost an eyeball. For a guy who works with 3D, it's not very pro....
I'm kinda disappointed by such carelessness...
not sure if joking or actually stupid.. ;D
I cannot believe what you can achieve with Blender. You have to show me how it works;)^^xDD.. My VFX Blender shot will be released in a couple of weeks^^ then everyone can see how much more is possible
:) Sebastian was clearly looking for his lost eyeball in this vid. Was it stuck to his 'high tech' tracking device? Do we all have to give up an eyeball or some of our limbs to do proper tracking? And not wash our faces for a few days?
LOOKING GOOD! Awesome work!
I reckon a crown would have been better than the tiara.
what equipment do you use for shooting?
I think that in another video on his Vimeo page, Sebastian mentions that he uses a Canon 550D.
Me want!
Well that's both outstanding and disgusting at the same time! People like you make me want to throw away my digital equipment and go be a farmer. Cause I'll never be the rockstar you folks are.
Put a light on the tracking device and you could do the tracking in the dark. Very good.
It's all fun and games until someone loses an eye
Is it possible to mo-cap somebody (or multiple persons) with this method or a similar method and then use the data to animate within Blender?
But how do you handle hair, for example bangs over the forehead? Should I make a shell to paint the hair on?
Impressive, but I certainly wouldn't waste time doing something as simple as dirt digitally. The eyeball removal obviously CAN'T be done any other way. Before you set your mind on doing your characters this way, you'd better do a test that's a lot closer and has a much greater variety of facial expressions. Any mismatch in movement between the model and the real actor will become much more obvious in close-ups. I don't think it's really an option to just avoid close-ups; they're a fundamental part of the language of film, and it'll be really obvious if they're missing.
Now open your mouth!
Is libmv fast enough to do real-time tracking?
Depends on the tracker size and search area size, I guess. I don't think there is currently a program that does that; Blender's workflow certainly doesn't allow for it.
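To give a feel for why those two sizes dominate the cost: a brute-force pattern match correlates the marker template against every candidate position in the search window. The following back-of-envelope sketch illustrates that scaling; all the numbers (patch sizes, marker count, the assumed throughput) are made up for illustration and are not libmv internals:

```python
# Illustrative cost model for brute-force 2D template matching,
# the core operation in pattern-based marker tracking.
# All sizes and rates below are assumptions, not measured libmv figures.

def match_ops(tmpl: int, search: int) -> int:
    """Multiply-adds to correlate a tmpl x tmpl patch
    over every position in a search x search window."""
    positions = (search - tmpl + 1) ** 2
    return positions * tmpl * tmpl

# Hypothetical setup: a 21 px marker, a 71 px search window, 10 markers.
per_marker = match_ops(21, 71)
per_frame = per_marker * 10

# At an assumed 1e9 multiply-adds per second:
fps = 1e9 / per_frame
print(f"{per_frame} ops/frame, ~{fps:.0f} frames/s")
```

Doubling the search window roughly quadruples the work per marker, which is why real-time feasibility hinges so heavily on keeping both the pattern and search areas small.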
Nice