
A simple way to create lip-sync animations

James Campbell outlines an effective method for creating lip-sync animations.

James Campbell writes:

Anyone who has worked on 3D talking characters knows that it is no simple task. So why not make the mouth completely 2D?

This can simply be added as a movie texture map to the head of your character. Creating the mouth animation can be done in a number of ways. The easiest method is to use an existing 2D animation program like Anime Studio or Toon Boom Studio, as they have an automatic lip-sync feature. You just draw the basic mouth shapes (usually about 10 drawings), run the automatic lip sync, and it is done in seconds.

In this book promotion video, Anime Studio was used. Next, export a QuickTime movie using the Animation codec, as it has an option for including the alpha channel. Now simply apply the movie as a texture map in Blender. In the texture panel, make sure the Auto Refresh button is ticked and the lip sync will update in the viewport as you scrub the timeline. Also note that sometimes you may have to press the "Match Movie Length" button, otherwise the animation may not display for its full duration.

Other alternatives for creating the lip sync that are totally free are Papagayo, Yolo and JLipSync. An add-on exists that can import the exported data files from these programs into Blender. Here is the link.
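For a sense of what those exported data files contain, here is a minimal sketch of reading Papagayo's Moho switch export (the format Yolo's export above also targets): a `MohoSwitch1` header line followed by one `frame phoneme` pair per line. This is a hand-rolled reader for illustration, not the import add-on itself.

```python
# Minimal sketch: parse a Papagayo/Moho switch (.dat) export into
# (frame, phoneme) pairs. Assumes the usual layout: a "MohoSwitch1"
# header, then one "<frame> <phoneme>" pair per non-empty line.
def parse_moho(text):
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    if not lines or lines[0] != "MohoSwitch1":
        raise ValueError("not a Moho switch file")
    keys = []
    for ln in lines[1:]:
        frame, phoneme = ln.split()
        keys.append((int(frame), phoneme))
    return keys

# Example input, as Papagayo would write it:
sample = """MohoSwitch1
1 rest
5 AI
9 MBP
14 rest
"""
print(parse_moho(sample))  # [(1, 'rest'), (5, 'AI'), (9, 'MBP'), (14, 'rest')]
```

Each pair says "from this frame on, show this mouth shape", which maps naturally onto keyframed shape keys in Blender.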

This data can be applied to a 3D mouth with phoneme shape keys, or you could make a long strip of geometry mapped with the phonemes and use a shape key to centre the correct phoneme on the screen, then render this out as a movie file.
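The strip idea can be sketched with a little arithmetic: if the mouth drawings sit side by side on one strip, each a fixed width apart, then sliding the strip sideways by the right amount puts the wanted phoneme in front of the camera. The phoneme list and the `strip_offset` helper below are hypothetical names for illustration, not part of the add-on.

```python
# Hypothetical sketch of the "strip of phonemes" trick: mouth drawings
# laid side by side, each `width` units apart; slide the whole strip so
# the current frame's phoneme sits at the origin (in front of the camera).
PHONEME_ORDER = ["rest", "AI", "E", "O", "U", "MBP", "FV", "L", "WQ", "etc"]

def strip_offset(phoneme, width=1.0):
    """X offset that centres `phoneme` at the origin."""
    index = PHONEME_ORDER.index(phoneme)
    return -index * width

# Combined with parsed lip-sync data, each (frame, phoneme) pair becomes
# a (frame, x_offset) keyframe for the strip (or a shape-key value):
keys = [(1, "rest"), (5, "AI"), (9, "MBP")]
offsets = [(frame, strip_offset(ph)) for frame, ph in keys]
print(offsets)  # [(1, 0.0), (5, -1.0), (9, -5.0)]
```

With constant (stepped) interpolation between those keyframes, the strip snaps from one mouth shape to the next exactly as a 2D lip sync should.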

  • FluxCapacitance

    Nice. I always wanted to integrate Anime Studio's functionality in this way with Blender. It's been sitting on the shelf for years.

  • Lawrence D’Oliveiro

    How about the digital analogue of Supermarionation!

    Anybody who remembers the old Gerry & Sylvia Anderson puppet productions like “Thunderbirds” will remember the mention of “Supermarionation” in the credits. This was an electromechanical system that automatically triggered the puppets’ mouths based on the audio signal from the soundtrack. Surely we could do something similar in Blender?

  • Eli Spizzichino

    Unfortunately, on 64-bit Linux they are not viable options:
    Papagayo has a _lm.so binary compiled for 32-bit, and there is no way to recreate it from the provided source.
    JLipSync's distributed jar is corrupted (I haven't tried to recompile it yet).
    Yolo (a very limited and simple program) muted the audio after the second playback, but it at least exported my simple sentence as Moho data.
    However, once imported with the add-on, it keyframed every frame, making the transition between shape keys too rapid, and the result was a mouth that barely moved.
    I've seen good results using the tracker with a recorded video; it seems the most rapid and smart way.

    • Benjamin Lau

      A version of Papagayo which doesn't depend on the original Lost Marble _lm.so is available here: http://code.google.com/p/papagayo/

      • Eli Spizzichino

        thanks I was not aware of it.

        I've tried it; the text input panel is collapsed but you can still paste into it. That said, the phoneme breakdown is a little too generic, but it's a starting point.

        Thanks for letting me know

  • The FRED FONG

    Amazing

  • Bruno

    "make a long strip of geometry mapped with the phonemes and have the shape key centre on the screen the correct phoneme"

    I'm trying it this way, but I couldn't figure out the "shape key centre on the screen the correct phoneme" part. Could someone explain it a little more?

    Thanks!
