Lip Syncing in Unity?

Discussion in 'Formats & External Tools' started by Modifier, May 13, 2011.

  1. Modifier

    Modifier

    Joined:
    Sep 13, 2009
    Posts:
    41
    Hey all,

    Is there any sort of real-time lip syncing tool that anyone can recommend? I would like to pass an audio file to my character and have the character's facial expressions and lips move.

    Thank you.
     
  2. ViperCode_

    ViperCode_

    Joined:
    Apr 11, 2011
    Posts:
    19
    People can barely do this manually. A program that would take audio, transcribe it to text, work out the phonemes of the words, and then mouth them correctly seems unlikely.

    (Just a note: I could be wrong about this, so don't base your decision solely on my answer.)
     
    Last edited: May 14, 2011
  3. ron333

    ron333

    Joined:
    Jan 3, 2009
    Posts:
    47
    You can do lip syncing and facial expressions in Blender. Use MetaMorph to bring the Blender animations into Unity.
    MetaMorph is available on the Unity Asset Store.
     
  4. Modifier

    Modifier

    Joined:
    Sep 13, 2009
    Posts:
    41
    What I mean by real-time is that a user types in a text box, hits ENTER, and the text is turned into audio within milliseconds. Sorry if my first post was not completely clear.

    @ViperCode - Actually, the text is turned into audio, which is already working fine. Then that audio is sent to the character for lip syncing. Many 2D chat apps have been doing this in Flash for a while, but I'd like it to work in a 3D environment. FYI, the Flash apps are real-time.

    @ron333 - I need real-time. Sure, 3ds Max, Maya, Blender, and the like can do lip syncing, but I want to pass in any audio source and watch the lips move in real time.
     
  5. makan

    makan

    Joined:
    Jan 8, 2011
    Posts:
    342
  6. makan

    makan

    Joined:
    Jan 8, 2011
    Posts:
    342
    Or check this one out: smithmicro.com
     
    Last edited: May 16, 2011
  7. ChaosWWW

    ChaosWWW

    Joined:
    Nov 25, 2009
    Posts:
    470
  8. GisleAune

    GisleAune

    Joined:
    May 16, 2011
    Posts:
    88
    I think FaceFX is what BioWare used for Dragon Age: Origins. I have played around in the game's mod toolset, and the lip-sync sequences are not rendered in real time; i.e., not what the OP is looking for.
     
  9. foolish-frost

    foolish-frost

    Joined:
    Mar 18, 2009
    Posts:
    169
    I researched this for days. It does not exist in any usable form I could find.

    At some point, you are going to need to take the audio and translate it into mouth shapes. It MAY be possible to do this from recordings, but in most cases it's puppet talking: the mouth reacting to volume levels.

    None of this is live. At best, most tools have a file that is synced to the sound file and move the mouth based on it. As far as I know, Unity does not have the ability to read sound data from a wave file.

    If you do find a solution, let us know. We're still looking.
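The "puppet talking" approach described above can at least be sketched in Unity: read the currently playing samples each frame and open a jaw bone in proportion to the volume. `AudioSource.GetOutputData` does exist in Unity's API; the field names, the jaw transform, and the tuning values here are assumptions about your rig, not a finished solution.

```csharp
using UnityEngine;

// Volume-driven "puppet talk": the jaw opens with the current output level.
// Sketch only; the jaw bone, angles, and smoothing are illustrative guesses.
public class PuppetMouth : MonoBehaviour
{
    public AudioSource source;      // the voice clip being played
    public Transform jaw;           // jaw bone to rotate
    public float maxJawAngle = 20f; // degrees at full volume
    public float smoothing = 10f;   // higher = snappier response

    private readonly float[] samples = new float[256];
    private float level;

    void Update()
    {
        // Grab the most recent output samples and compute RMS volume.
        source.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Smooth the level so the jaw doesn't jitter, then rotate the bone.
        level = Mathf.Lerp(level, rms, Time.deltaTime * smoothing);
        jaw.localRotation = Quaternion.Euler(level * maxJawAngle, 0f, 0f);
    }
}
```

This won't produce correct mouth shapes, only a mouth that flaps with the audio, which matches the "reacting to volume levels" caveat above.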
     
  10. Modifier

    Modifier

    Joined:
    Sep 13, 2009
    Posts:
    41
    Thanks meh11, I'm gonna spend a bit of time tomorrow and read the info on that site carefully. As with FaceFX, they admitted they could not yet produce what I want to accomplish.
     
  11. yogibear

    yogibear

    Joined:
    Jan 10, 2009
    Posts:
    26
    I was able to create a quick and dirty version of Talking Tom using certain tools. Here's what I did:
    1. Record and save audio using the FMOD plugin.
    2. Convert the audio WAV file to a set of phonemes (using Quick n Dirty Phoneme Extractor; Google it).
    3. Map the phonemes to any viseme set you can find on the net.
    4. Render the 3D characters to 2D frames; I got quite realistic results.

    Hope this helps

    -Raj
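Step 3 of the pipeline above, mapping phonemes to visemes, is just a lookup table. Here is a minimal sketch in C#; the phoneme labels are CMU-style and the viseme names are made up for illustration, since the post doesn't say which viseme set was used.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical phoneme-to-viseme lookup, as in step 3 above.
// The reduced viseme set (open, wide, round, closed, fv, l) is illustrative.
public static class VisemeMap
{
    private static readonly Dictionary<string, string> PhonemeToViseme =
        new Dictionary<string, string>
    {
        { "AA", "open" },   { "AE", "open" },   { "AH", "open" },
        { "IY", "wide" },   { "IH", "wide" },   { "EH", "wide" },
        { "UW", "round" },  { "OW", "round" },  { "AO", "round" },
        { "F",  "fv" },     { "V",  "fv" },
        { "M",  "closed" }, { "B",  "closed" }, { "P",  "closed" },
        { "L",  "l" },
    };

    public static string Lookup(string phoneme)
    {
        // Unknown phonemes fall back to a neutral mouth shape.
        return PhonemeToViseme.TryGetValue(phoneme, out var v) ? v : "neutral";
    }
}
```

Once each timestamped phoneme has a viseme, step 4 is just swapping frames (or blendshapes) on that schedule.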
     
  12. AnomalusUndrdog

    AnomalusUndrdog

    Joined:
    Jul 3, 2009
    Posts:
    1,551
    That's really quite a tall order. I've always wanted that in RPGs: when I type my custom character name, the NPCs with voice acting should say my character's name, but so far, even in triple-A games like Dragon Age, that doesn't happen.
     
  13. DaveA

    DaveA

    Joined:
    Apr 15, 2009
    Posts:
    310
  14. i3DTutorials

    i3DTutorials

    Joined:
    Aug 26, 2010
    Posts:
    564
    Face Robot in Softimage; done deal.
     
  15. maewionn

    maewionn

    Joined:
    Jan 18, 2016
    Posts:
    37
  16. markvi

    markvi

    Joined:
    Oct 31, 2016
    Posts:
    118
    I haven't tried it, but this is a plugin for importing Rhubarb lip sync into a Timeline track. Rhubarb is only for 2D animation. You could probably hack something together that maps mouth shapes to blendshape weights, but it's not a trivial thing.

    The Unity Face Capture app for iOS captures lip sync, but ARKit isn't great at it.

    Oculus has a lipsync solution for Unity.

    Salsa Lipsync on the Asset Store is rated quite highly.

    This appears to be an open source lipsync project that analyzes audio in realtime using Burst.
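The "maps mouth shapes to blendshape weights" idea mentioned above comes down to driving `SkinnedMeshRenderer.SetBlendShapeWeight` (a real Unity API) from whatever your lip-sync data produces. A minimal sketch, where the blendshape index and the weight source are assumptions about your particular mesh and tooling:

```csharp
using UnityEngine;

// Sketch of driving a single blendshape from a lip-sync weight.
// The mouthOpenIndex and where `weight` comes from depend on your rig.
public class VisemeBlend : MonoBehaviour
{
    public SkinnedMeshRenderer face;
    public int mouthOpenIndex = 0;       // index of the "mouth open" blendshape
    [Range(0f, 1f)] public float weight; // set this each frame from lip-sync data

    void LateUpdate()
    {
        // Unity blendshape weights run 0-100, so rescale the 0-1 input.
        face.SetBlendShapeWeight(mouthOpenIndex, weight * 100f);
    }
}
```

A full viseme rig would have one such weight per mouth shape and crossfade between them as the phoneme timeline advances.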