
Mango: Digital Makeup, Part 2: Face Deform Test


After I published the 'Digital Makeup' video by Sebastian König this week, people were quick to point out that this approach didn't work for expressions. Well, no more - Sebastian figured it out.

Sebastian writes:

Another test of facial motion capture with Blender. I didn't tweak the materials or lighting much, so that could be improved. But the basic concept works, I think, even though it's a rather tedious workflow. Still, it's quite good and can make very advanced stuff possible.

There is a little bit of sliding, which would have to be improved too, either with more markers, or by adding some manual shape keys.

But as a start I think this technique is really helpful.

That head tracking device made tracking a LOT easier, because it adds a lot of perspective, but of course it would have to be painted out in a real production. Or hidden with hair/helmet/horns/wig or whatever. :)

All in all, this took me 4 hours.

About the Author

Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

18 Comments

  1. Dude! You coulda washed your face before shooting this video!

    Dunno if you've noticed, but you've lost an eyeball. For a guy who works with 3D, it's not very pro....

    I'm kinda disappointed by such carelessness...

  2. Alexander Weide on

    I cannot believe what you can achieve with Blender. You have to show me how it works ;)^^xDD.. My VFX Blender shot will be released in a couple of weeks^^ then everyone can see how much more is possible

  3. :) Sebastian was clearly looking for his lost eyeball in this vid. Was it stuck to his 'high tech' tracking device? Do we all have to give up an eyeball or some of our limbs to do proper tracking? And not wash our faces for a few days?

    LOOKING GOOD! Awesome work!

  4. Well, that's both outstanding and disgusting at the same time! People like you make me want to throw away my digital equipment and go be a farmer. 'Cause I'll never be the rockstar you folks are.

  5. Is it possible to mo-cap somebody (or multiple persons) with this method or a similar method and then use the data to animate within Blender?

  6. Impressive, but I certainly wouldn't waste time doing something as simple as dirt digitally. The eyeball removal obviously CAN'T be done any other way. Before you set your mind on doing your characters this way, you'd better do a test that's a lot closer and has a much greater variety of facial expressions. Any mismatch in movement between the model and the real actor will become much more obvious in close-ups. I don't think it's really an option to just avoid close-ups; they're a fundamental part of the language of film, and it'll be really obvious if they're missing.

    • Depends on the tracker size and search area size, I guess. I don't think there is currently a program that does that. Certainly, the Blender workflow doesn't allow for it.
