
Facetracking starter kit [Released]

Discussion in 'Assets and Asset Store' started by Synedge, Nov 21, 2012.

  1. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hi there!

    We just released a framework on the Asset Store that will let you integrate the Kinect(tm) Facetracking technology directly in Unity.
    Using the package you will be able to:
    - track the user's head position and rotation (x, y, z coordinates relative to the device);
    - track a set of 6 (facial) animation units (read more);
    - receive the rgb and depth camera streams directly from the device.
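    As a rough sketch of how such data might be consumed in Unity (the component and callback names below are hypothetical illustrations, not the package's actual API - check the included source for the real hookup):

    ```csharp
    // Hypothetical consumer of the kit's tracking data. The callback name and
    // parameter layout are illustrative assumptions; the package ships its own
    // binding component with the actual API.
    using UnityEngine;

    public class HeadDriven : MonoBehaviour
    {
        // Called with the tracked head pose (device-relative coordinates,
        // as described in the feature list above).
        public void OnHeadPose(Vector3 position, Vector3 rotationEuler)
        {
            transform.localPosition = position;
            transform.localEulerAngles = rotationEuler;
        }
    }
    ```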

    To demonstrate the technology, we used the facetracking data to animate a high-poly mask of an Oni in real time. The complete source code for the Kinect(tm) integration is also included in the package.

    Check out our demo!
    Asset store link: http://u3d.as/content/synedge/facetracking-starter-kit/3EM
     
    Last edited: Nov 22, 2012
  2. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Wow, this looks pretty impressive. I don't have a Kinect, yet - let alone have it set up to work with my Mac ... but I certainly have bookmarked this and will buy as soon as I have a game where I could use it.

    One thing I noticed with your YouTube video: It doesn't have a link to the asset store or this forum posting. You might consider adding that so people who run into your demo video can easily navigate to the forum / asset store ;-)
     
  3. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Thanks for the feedback and suggestion!
    I added those to the YouTube video.
     
  4. I am da bawss

    I am da bawss

    Joined:
    Jun 2, 2011
    Posts:
    2,574
    Looks pretty awesome! But I don't have a Kinect...
    So is it possible to make a webcam/mobile cam (front and back) version? I think you can reach out to more users this way.
     
    Last edited: Nov 23, 2012
  5. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    It is a good point, yes, and we are considering exploring this possibility.
    The motivation behind using the Kinect, though, is that the facetracking quality is much better, as it uses both the depth and skeleton tracking information.
     
  6. VIC20

    VIC20

    Joined:
    Jan 19, 2008
    Posts:
    2,687
    I'm also interested in mobile support - mostly for head tracking, the face stuff is awesome but I see much more use for the head tracking.
     
  7. I am da bawss

    I am da bawss

    Joined:
    Jun 2, 2011
    Posts:
    2,574
    I remember a few years back there was this face tracking tech I saw called "FaceAPI":

    http://www.seeingmachines.com/product/faceapi/

    When I saw your post I thought you guys were the same guys... :)
    EDIT: Apparently, face tracking has become popular over the past few years... there is even an open source one called "FaceTrackNoIR":
    http://facetracknoir.sourceforge.net/

    So I think a webcam version isn't impossible. The problem is mobile, which is what I am most interested in - even just rudimentary head tracking would be good enough, and hopefully more sophisticated face tracking will become available for mobile later.

    EDIT 2: Looks like even on mobile it isn't impossible - it's already been done!

     
    Last edited: Nov 23, 2012
  8. xandeck

    xandeck

    Joined:
    Apr 2, 2009
    Posts:
    563
    Really nice, following this :)
     
  9. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Regarding FaceAPI, there's at least one entry on feedback.unity3d.com:

    http://feedback.unity3d.com/unity/all-categories/1/hot/active/augmented-reality-ar

    IIRC, there was also another one that was more specific (I may have created it) - but since the Feedback site moved to the new system, I got disconnected from all the ideas I had voted on or created... so it's rather difficult to find that one (if it exists at all - my memory of it is fairly faint).
     
  10. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    FaceAPI was a strong contender when we were researching the technology out there.
    For us, the drawback was the licensing terms (each person buying our package would need to go to FaceAPI and get a license on their own terms). Of course there is a free license, but that only allows detecting the head position, in a non-commercial application, and the whole source code must be released under GPLv3 (which by itself would conflict with the EULA of Unity's Asset Store).

    All in all, owning a Kinect seemed a much more reasonable alternative (from our point of view).
    We are also preparing to offer other NUI mechanisms/features on the Asset Store which would also require a Kinect (or similar tracking technology).

    OpenCV might also be a worthy candidate for those whose requirements are only to track the user's head position.
     
  11. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    A quick update: some users reported bugs and crashes while using the kit. We just submitted a new version to the Asset Store that fixes those problems. It is currently under review and should soon be available.
     
  12. fineman

    fineman

    Joined:
    Mar 23, 2012
    Posts:
    19
    Hi - Nice work. Keen to try it. Am I right to assume that this uses the Microsoft Kinect for Windows SDK, so it will only run on Windows 7 or 8?
     
  13. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    You are correct: this uses the Kinect SDK and currently works on Windows 7 and 8. It has been fully tested with both the Kinect for Xbox 360 and the Kinect for Windows. Note, though, that installing the SDK is necessary to use the Kinect for Xbox, whereas the redistributables are enough for the Kinect for Windows.
     
  14. imtrobin

    imtrobin

    Joined:
    Nov 30, 2009
    Posts:
    1,548
    Can you post an exe where we can test the face tracking accuracy?
     
  15. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    The accuracy is pretty much the same as in the Kinect samples, which you can easily try at http://www.microsoft.com/en-us/kinectforwindows/.
    We use their detection algorithms, although we did add a little bit of magic to animate the mask more gracefully.
     
  16. jessica1986

    jessica1986

    Joined:
    Feb 7, 2012
    Posts:
    621
    I have a webcam - can't I check this with some realtime demo?
     
  17. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Right, this package requires a Kinect device to work.
     
  18. imtrobin

    imtrobin

    Joined:
    Nov 30, 2009
    Posts:
    1,548
    Yes, I'd like to see the output of the magic you have put in. If it's only the MS algorithm, I might as well use that. Do you have a mobile version, an APK?
     
  19. jpatinop80

    jpatinop80

    Joined:
    Jul 29, 2012
    Posts:
    55
    I need to know before buying whether it is compatible with Unity on Mac OS X... very impressive, by the way.
     
  20. msl_manni

    msl_manni

    Joined:
    Jul 5, 2011
    Posts:
    272
    I have a Kinect for Xbox 360 - will it work with your demo, or do you have to have a Kinect for Windows?
     
  21. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hi there.

    Yes, both the Kinect for Xbox 360 and the Kinect for Windows are supported.
    You do not need to install the SDK in the case of the Kinect for Windows, but it is required for the Xbox 360 version, as that device is not officially supported by Microsoft for deployment (they allow it to be used for development, but not for deployment - thus the need to install the SDK).

    As this software relies on the Microsoft Kinect API, it currently only works on Windows.
     
  22. TWELVE-STONES-GLOBAL

    TWELVE-STONES-GLOBAL

    Joined:
    Sep 28, 2012
    Posts:
    2
    Hello, I purchased the kit, and am able to see myself with the video and depth in the unity scene.
    Unfortunately, it's not yet connecting to the facial movements on the character.

    I've installed at the root level on a windows installation of unity4 and have SDK1.6 Kinect installed. :)
     
    Last edited: Jan 29, 2013
  23. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hi Jimergi,

    You are receiving the video and depth feed, great! This means the software is communicating properly.
    The fact that it is not tracking you could stem from a multitude of problems.
    Given that you installed the SDK, would you mind trying the Kinect samples and checking that you get tracked?

    Also, would you mind sending me your complete specs by email? (I can support you faster through email.)
     
  24. Xentar

    Xentar

    Joined:
    Apr 8, 2011
    Posts:
    23
    Synedge, I sent you a PM on this forum and on YouTube, but you never answered.
    Can this kit be used in 3.5, or does it use some feature of 4.0 that means it can't be ported?

    I own 3.5 Pro and don't intend to migrate to 4.0 yet.
     
  25. TWELVE-STONES-GLOBAL

    TWELVE-STONES-GLOBAL

    Joined:
    Sep 28, 2012
    Posts:
    2

    The software isn't seeing the feed anymore. I am able to run all of the Microsoft examples; it seems to be a file setup/location issue. What is your email address?
     
  26. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Just sent you a private message.
     
  27. ChaneDev

    ChaneDev

    Joined:
    Feb 12, 2013
    Posts:
    66
    Hi, thanks for the package, finally got it going. In case other people get stuck with the latest package, here is what I had to do:

    1. Unpack KinectDataTransmitter.
    2. Dig down through the unpacked directory to \KinectDataTransmitter\KinectDataTransmitter\bin\Release (looks like I got a double directory).
    3. Copy the contents of that directory (all the .dll files and the .exe file).
    4. Create a folder called "Kinect" in the project root directory (the root part is noted in the documentation, but not really the folder name part) and copy the contents of step 3 into this folder.
     
  28. ChaneDev

    ChaneDev

    Joined:
    Feb 12, 2013
    Posts:
    66
    Hi, I was wondering if I can control the Kinect pitch through your scripts? Are you planning on exposing more control over the Kinect in the future? Is there documentation on what I can currently do with the camera? I'm not advanced enough to compile the Kinect libs/code, but would love to drive some basic features from Unity.

    Thanks, having fun with the package!
     
  29. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hi!

    We currently do not support it but you could implement it very easily in the kinect bindings.
    As far as the Facetracking goes I don't think we are going to increase the features in the near future.
    We are currently developing support for the skeletal tracking and this should be released (in a separate package) sometime next month.

    I will keep you updated!
     
  30. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Actually, you should only need to unzip the KinectDataTransmitter in the right place... but it seems we made a mistake in the package creation.
    We submitted a new version, but in the meantime please use this workaround.

    Thank you for pointing it out.
     
  31. tonirojasm

    tonirojasm

    Joined:
    Apr 10, 2013
    Posts:
    6
    Is there any FaceTracking demo or something for testing? Thanks!
     
  32. Claudah

    Claudah

    Joined:
    Feb 20, 2013
    Posts:
    2
    Hi there,
    I am just about to buy your kit and I have several questions.
    My aim is to implement face tracking in my multiplayer game. The characters are designed in Autodesk software and they consist of rigs.
    I want them to be able to communicate with each other through voice chat, with the facial expressions of each of them visible through the networking within the VR.
    That's why I need the plugin, but I am not really sure whether that is going to work.
    My second question is whether you provide a video or PDF tutorial on how to install the plugin and map the code to characters different from the mask which is provided.
    I will be very grateful if you can clarify these issues for me,
    All the best
     
  33. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    @tonifuego: yes, you can of course download the "Kinect for Windows Developer Toolkit" which has at least 2 different facetracking demos.
    http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx

    @Claudah: I also recommend that you install this toolkit and check the demos.
    While we can do better than what is being demoed by using some additional logic, it is very good to understand the limitations of the technology.
    We are currently developing a solution for a client that has requirements similar to yours.
    Our approach was to use the starter kit as a base, but also add an additional layer of lip-sync based on the voice to better control the lips.
    That also leads to another topic. If you read the Kinect facetracking documentation, you will understand that the information you receive from the SDK is a couple of floating point values that tell you, for example, "the mouth is open at 75%" or "the eyebrows are raised at 60%".
    In the starter kit, we exaggerate some of these values (we re-map them) to create certain stylistic effects. But the truth is you will probably want to re-map these values anyway, as they may not correspond to your expectations.
    To better understand this topic, I recommend that you study in depth the documentation and samples that come with the Kinect SDK. Maybe try to modify one of them to match what you would like in your game.
    Once you understand that part, using the kit in Unity is the easiest part - it already works out of the box.
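    The re-mapping idea above could look something like this minimal sketch (the gain curve and clamp range are illustrative inventions, not the kit's actual mapping):

    ```csharp
    using UnityEngine;

    // Raw Kinect animation units are small coefficients (roughly -1 to +1);
    // exaggerating them makes expressions read better on a stylized mask.
    // The gain value here is made up -- tune it per animation unit.
    public static class AuRemap
    {
        public static float Exaggerate(float raw, float gain = 1.5f)
        {
            return Mathf.Clamp(raw * gain, -1f, 1f);
        }
    }
    ```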
     
  34. siberman

    siberman

    Joined:
    Oct 12, 2011
    Posts:
    31
    Hi there,

    I purchased your kit yesterday and I've had some good results once the tracking begins, but holding the track seems to be intermittent. I'm getting long lags between frame updates on the depth and RGB monitor planes, and an average of about 7 fps from the Kinect.
    It also seems that it won't pick up the face without the whole skeleton in frame.

    I'd really appreciate some tips on how to get around these issues.

    Thanks.
     
  35. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hi!

    I would recommend updating at a lower resolution or updating the texture using lower-level commands.
    As far as I could investigate, the Texture2D SetPixels/Apply calls were killing performance...
    This post might help you get started with that: http://forum.unity3d.com/threads/3888-Playing-QuickTime-to-a-texture/page2?p=29315#post29315
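    One way to do the lower-level update hinted at above is to reuse a single Color32 buffer and upload it in bulk with SetPixels32 + Apply(false); a sketch, where the frame size and 3-bytes-per-pixel layout are assumptions:

    ```csharp
    using UnityEngine;

    public class CameraFeedDisplay : MonoBehaviour
    {
        Texture2D tex;
        Color32[] buffer;

        void Start()
        {
            // Assumed 640x480 RGB frames from the transmitter.
            tex = new Texture2D(640, 480, TextureFormat.RGBA32, false);
            buffer = new Color32[640 * 480];
            GetComponent<Renderer>().material.mainTexture = tex;
        }

        // Called whenever a new raw RGB frame (3 bytes per pixel) arrives.
        public void OnRgbFrame(byte[] rgb)
        {
            for (int i = 0; i < buffer.Length; i++)
                buffer[i] = new Color32(rgb[i * 3], rgb[i * 3 + 1], rgb[i * 3 + 2], 255);
            tex.SetPixels32(buffer);   // one bulk upload instead of per-pixel writes
            tex.Apply(false);          // false: skip mipmap regeneration
        }
    }
    ```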

    While you don't need the *whole* skeleton, the upper body might be necessary, yes.
    This is the way Microsoft does its own tracking (I guess it uses the depth frame and the skeleton information to enhance its analysis and reduce the calculation necessary to detect certain features in the image).
    There is no workaround that I know of for this, but maybe I missed it in the documentation.
    Just to be sure, I would recommend starting to read it here:
    http://msdn.microsoft.com/en-us/library/jj130970.aspx
    (Also note the minimum specs for the tracking itself - add some more on top of that for the Unity part.)

    If you wish to tweak the camera rgb+depth resolution, you can take a look at the KinectDataTransmitter setup.

    Best regards,
    Eurico
     
  36. siberman

    siberman

    Joined:
    Oct 12, 2011
    Posts:
    31
    Thanks Synedge,

    I'm managing the lighting now and getting good results.

    A couple of other things,

    The tracking glitches out when there is more than one face in view - is it possible to pass face ID numbers or something for isolation purposes?

    I'm occasionally getting this error

    "The kinect sensor is changed to status: SensorNotGenuine"

    Not sure what this means; a restart fixes it, but I need this to run for extended periods of time.
    Any suggestions?

    I also added a delegate to the KinectBinder called TrackingLost, which fires when framerate == 0; this is the only mod I've made.

    Cheers.
     
  37. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Yes, of course. You could, for instance, extend the KinectDataTransmitter: go to the DataConverter project and open Converter.cs. Look for EncodeFaceTrackingData() and add the parameter. Take care to also change DecodeFaceTrackingData().


    None really; it seems weird. The only pieces of documentation I could find are these:
    http://msdn.microsoft.com/en-us/library/microsoft.kinect.kinectstatus.aspx
    http://msdn.microsoft.com/en-us/library/hh855357.aspx (the bottom part)

    Maybe you should consider contacting Microsoft directly...
     
  38. justin_iSO

    justin_iSO

    Joined:
    Jan 10, 2013
    Posts:
    18
    Is there a nice clean way to get SU (Shape Units) the same way you are getting the Animation Units?
     
  39. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    This is currently not implemented in the package.
    As the source code is available, you could just as easily extend the interface and send the data from the KinectDataTransmitter into the KinectBinder in Unity, in a similar fashion to how the AUs are being sent.

    The only problem is that you will also have to extend the Microsoft.Kinect.Toolkit.FaceTracking project, as the C# API for the Shape Units seems to have been forgotten. Here is a quick way to do it (but I did not have time to try it):
    - Open FaceTracker.cs from the Microsoft.Kinect.Toolkit.FaceTracking project.
    - Create a new method that basically wraps this line: faceTrackerInteropPtr.GetShapeUnits(...);
    Here is the method definition:

    Code (csharp):
        /// <summary>
        /// Returns shape units (SUs) that the face tracker is using. If the passed ppSUCoefs parameter is NULL, it returns the number of SUs used in the loaded face model.
        /// </summary>
        /// <param name="scale">A pointer to a head scale variable.</param>
        /// <param name="shapeUnitCoeffsPtr">A pointer to a float array of shape unit coefficients. The array must be large enough to contain all of the SUs for the loaded face model.</param>
        /// <param name="shapeUnitCount">Number of returned shape unit coefficients. This parameter is IN/OUT and must be initialized to the size of the *ppSUCoefs array when passed in.</param>
        /// <param name="haveConverged">true if shape unit coefficients converged to realistic values; otherwise, false (the SU coefficients are still converging).</param>
        /// If the method succeeds, the return value is S_OK. If the method fails, the return value can be FT_ERROR_UNINITIALIZED, E_INVALIDARG, E_POINTER.
        /// STDMETHOD(GetShapeUnits)(THIS_ FLOAT* pScale, FLOAT** ppSUCoefs, UINT* pSUCount, BOOL* pHaveConverged) PURE;
        void GetShapeUnits(out float scale, out IntPtr shapeUnitCoeffsPtr, [In, Out] ref uint shapeUnitCount, [MarshalAs(UnmanagedType.Bool)] out bool haveConverged);
    - Use the interface. Go to the KinectDataTransmitter project and open the FaceTracker.cs file, then go to the OnFrameReady method. It should be something like:
    Code (csharp):
        var shapeUnits = this._faceTracker.GetShapeUnits(/* or whatever you called the method you created above */);
        Converter.EncodeShapeUnitsData(shapeUnits);
    Then for the rest, you can almost copy-paste the way the animation units are being sent from the transmitter and read in the KinectBinder.
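    Once such a wrapper exists, the native float pointer can be copied into managed floats with Marshal.Copy. A hedged sketch - the count value and call pattern below are assumptions based on the interop comment above, so check them against the SDK:

    ```csharp
    using System;
    using System.Runtime.InteropServices;

    // Assumes faceTrackerInteropPtr exposes the GetShapeUnits wrapper sketched
    // above. suCount must match the loaded face model; 11 is illustrative.
    float scale;
    IntPtr suPtr;
    uint suCount = 11;
    bool converged;

    faceTrackerInteropPtr.GetShapeUnits(out scale, out suPtr, ref suCount, out converged);

    var shapeUnits = new float[suCount];
    Marshal.Copy(suPtr, shapeUnits, 0, (int)suCount); // native float* -> managed float[]
    ```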
     
  40. lucasc9x

    lucasc9x

    Joined:
    Jul 23, 2012
    Posts:
    1
    Hi, I'm an IT student and I've been testing a release of the Facetracking starter kit, but I'm having a problem that doesn't let the sample project work: just after the Kinect starts tracking me, the 3D model starts spinning and goes to a weird position in the scene. What could it possibly be?
    I also need to track and show the current tracking points on my face - is that possible with this asset?
     
  41. CrisCL

    CrisCL

    Joined:
    Mar 9, 2013
    Posts:
    14
    Hi!
    I want to buy the Facetracking starter kit in the Asset store, but I have some questions first.

    Can I do with this asset, the same effect made in the "Face Tracking 3D" official sample of the Kinect SDK?
    I mean, generate a 3D model of any face and put it with a bigger scale over a video tracking a head's position?

    (attached image: FaceTracking3D.png)

    I want to do the "jibjab" effect, but with a 3D model of any face scanned with the Kinect (just like the 3D model obtained in the "Face Tracking 3D" official sample), especially including the original face's rotation and gestures.

    Can I register the original gesture and reproduce it over another 3D face scanned?

    Can I do that with the Facetracking starter kit? Can you please guide me about it?
    I will appreciate any help about it...
    Thanks for your help!

    Greetings!

    Chris
     
  42. wingrider888

    wingrider888

    Joined:
    Oct 27, 2010
    Posts:
    38
  43. XeviaN360

    XeviaN360

    Joined:
    Jun 3, 2010
    Posts:
    181
    Hi, with Facetracking starter kit is it possible to detect a smile?
    Is it possible to track more than one person?


    Thank you
     
  44. Darshanpreet

    Darshanpreet

    Joined:
    Sep 26, 2013
    Posts:
    5
    Hi, I have bought and tried the starter kit. It is working great with the included character, i.e. the Oni. But there is no documentation on using our own face or character. Can you please help with this? Without that documentation it is more or less just a demo and does not help further development.

    Thanks
     
  45. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Hello! and sorry for the late reply.

    This bug has been solved in the latest version. The only reason I can possibly imagine for this to happen would be that you are tracking multiple persons at once?
    Is this what is happening?

    About the second part of your question, I am not sure if you are talking about the 2D points? If that is the case, you will have to implement it yourself (you can follow the same structure used to get the facetracking information into Unity; it should be pretty straightforward).
     
  46. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    While it is certainly possible to do, I do not currently have the time to guide you.
    Basically, you should extract the portion of the RGB camera feed that corresponds to the head and map it onto a custom mask. Make the custom mask the approximate size of the head (which you can get from the Kinect SDK, plus some calculations maybe) and then animate it with the information from the facetracking component.
    While our kit will surely help you start faster, it will not do all of this out of the box. You will need to program it yourself.
     
  47. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    It is possible to detect a smile if you program what a smile means (e.g. the information you get is that the user has the mouth open at 50% and the lips stretched at 75%).
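    As an illustration of "programming what a smile means", here is a toy heuristic over that kind of value. The parameter names and thresholds are invented and would need tuning against real tracking data:

    ```csharp
    // Toy smile detector over normalized facial coefficients. The names and
    // thresholds are illustrative, not the kit's actual animation-unit layout.
    public static class SmileHeuristic
    {
        public static bool IsSmiling(float lipStretch, float mouthOpen)
        {
            // A smile: lips strongly stretched, mouth not wide open.
            return lipStretch > 0.6f && mouthOpen < 0.5f;
        }
    }
    ```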

    It is possible to track more than one person, but our kit only sends information about the first tracked person. I guess you can easily extend it on that point yourself, as the source code is available.
     
  48. Synedge

    Synedge

    Joined:
    Nov 21, 2012
    Posts:
    21
    Strange, I thought we did include some information on this in the readme.
    What is included in the kit is just an example.

    You can do this in many different ways. The one we used is a simple pose animation technique where you have a base shape (neutral) and different poses (e.g. stretched lips, raised eyebrows).
    The Kinect SDK sends us information about how much the person is e.g. stretching their lips or raising their eyebrows, and then we simply animate the mask accordingly.

    You can use the same technique (i.e. create a pose for each animation unit on top of the base pose) or you can rig your mask and animate the bones based on the very same information.
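    In Unity, the pose-per-animation-unit approach described above can be expressed with blend shapes: one shape per animation unit, weighted each frame by the tracked coefficient. A sketch - it assumes the mesh's blend shapes are ordered to match the AU array, which is an assumption about your asset, not a rule of the kit:

    ```csharp
    using UnityEngine;

    [RequireComponent(typeof(SkinnedMeshRenderer))]
    public class AuBlendShapeDriver : MonoBehaviour
    {
        SkinnedMeshRenderer smr;

        void Start() { smr = GetComponent<SkinnedMeshRenderer>(); }

        // au[i] assumed normalized to [0, 1]; blend shape weights run 0-100.
        public void Apply(float[] au)
        {
            int count = Mathf.Min(au.Length, smr.sharedMesh.blendShapeCount);
            for (int i = 0; i < count; i++)
                smr.SetBlendShapeWeight(i, Mathf.Clamp01(au[i]) * 100f);
        }
    }
    ```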

    Best regards,
    Eurico
     
  49. tomihr2

    tomihr2

    Joined:
    Oct 25, 2010
    Posts:
    29
    Hello,

    Is your kit able to have the camera stream connected to the mask, like in those glasses-fitting applications?

    Thanks
     
  50. spoons

    spoons

    Joined:
    Mar 27, 2014
    Posts:
    1
    I believe I have everything set up properly; however, I keep getting the error "The kinect sensor is status changed to: SensorError." Any help? I'm using the Xbox 360 Kinect and have the SDK installed properly.