
[RELEASED] Dlib FaceLandmark Detector

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Jun 4, 2016.

  1. Alcyone

    Alcyone

    Joined:
    Nov 15, 2013
    Posts:
    13
    I am dragging a 3D modeled object into the Webcam AR sample, but it is hidden, or I only see some polygons. I have tried almost everything (changing materials, mesh filter, ...). Could you tell me how to add a complex 3D object to the human_head? Is any tutorial coming?
     
  2. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    LowPolyUnityChan(http://unity-chan.com/download/download.php?id=LowPolyUnityChan&v=1.0)

    change_dlib_object_0.png
    change_dlib_object_1.png
     
  3. MMO-Developer

    MMO-Developer

    Joined:
    Jul 2, 2012
    Posts:
    25
    How can I swap the face with a .jpg or .png? In the face swapper sample, VideoCaptureFaceSwapperSample, I want to swap the face with a jpg.
     
    Last edited: Sep 28, 2016
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Texture2DToMatSample can swap the face in one image.
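    If the swap source should be a static .jpg/.png rather than a second video face, one possible approach (a sketch, not code from the asset: it assumes the image is imported as a readable Texture2D, and uses OpenCV for Unity's Utils.texture2DToMat helper) is to convert the image to a Mat first and feed that to the swapper:

    Code (CSharp):
    // Sketch: turn a static image into a Mat usable as the face-swap source.
    // "faceImage" is a hypothetical readable Texture2D assigned in the Inspector.
    public Texture2D faceImage;

    Mat GetStaticFaceMat ()
    {
        Mat faceMat = new Mat (faceImage.height, faceImage.width, CvType.CV_8UC4);
        // OpenCV for Unity helper that copies the texture's pixels into the Mat.
        OpenCVForUnity.Utils.texture2DToMat (faceImage, faceMat);
        return faceMat;
    }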
     
  5. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    iOS 9 and 10 on iPhone 6S/7, Unity 5.4.0f3, latest OpenCV for Unity.
     
  6. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Is there an example for getting Vuforia and OpenCV for Unity working with the Dlib AR sample? How do you map the different cameras, etc.? I was able to get the GitHub repo's general non-AR game object example working, but I'm having some difficulty with mapping the game objects.
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    This is the first time such a bug has been reported.
    Could you email me a screenshot? store@enoxsoftware.com
     
  8. Otto_Oliveira

    Otto_Oliveira

    Joined:
    Jul 24, 2014
    Posts:
    33
    Hi,
    how can I change the center of the webcam image? Take a look at the images:
    zero.png Original webcam texture.
    up.png Y axis translated; that's the desired result, but if you just translate the `quad` game object, the 'head' model stays at the center of the screen.
     
  9. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Just checking
     
  10. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    For now, there is no example for getting Vuforia and OpenCV for Unity working with the Dlib AR sample.
     
  11. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Please add "offsetY".
    Code (CSharp):
        int max_d = Mathf.Max (webCamTextureMat.rows (), webCamTextureMat.cols ());
        camMatrix = new Mat (3, 3, CvType.CV_64FC1);
        camMatrix.put (0, 0, max_d);
        camMatrix.put (0, 1, 0);
        camMatrix.put (0, 2, webCamTextureMat.cols () / 2.0f);
        camMatrix.put (1, 0, 0);
        camMatrix.put (1, 1, max_d);
        camMatrix.put (1, 2, webCamTextureMat.rows () / 2.0f + offsetY);
        camMatrix.put (2, 0, 0);
        camMatrix.put (2, 1, 0);
        camMatrix.put (2, 2, 1.0f);
     
    Otto_Oliveira likes this.
  12. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Do you know if it is possible to get this example working with Vuforia, or to get the AR samples in OpenCVForUnity working with Vuforia?
     
  13. Otto_Oliveira

    Otto_Oliveira

    Joined:
    Jul 24, 2014
    Posts:
    33
    Thanks! It worked perfectly!
     
  14. Otto_Oliveira

    Otto_Oliveira

    Joined:
    Jul 24, 2014
    Posts:
    33
    @EnoxSoftware do you have any idea why the FPS increases when you touch the device screen on Android? By the way, it's very slow on that platform, just like the Mac Editor. On iOS it runs reasonably.
     
  15. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  16. Deleted User

    Deleted User

    Guest

  17. Deleted User

    Deleted User

    Guest

    And I have a last question: is it possible to create a mesh from script with the 68 facial landmark points and apply an alpha texture?

    Cheers,
    L.
     
    ina likes this.
  18. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    DlibFaceLandmarkDetector does not support the CNN-based face detector.

    I think it is possible.
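
    A minimal sketch of the mesh idea (my own sketch, not from the asset; the triangle indices below are placeholders, since the 68 landmarks would need a proper face triangulation, and the alpha material is assumed to use a transparent shader):

    Code (CSharp):
    // Sketch: build a flat mesh from detected landmark points.
    // "points" would come from faceLandmarkDetector.DetectLandmark ().
    Mesh BuildLandmarkMesh (List<Vector2> points, Material alphaMaterial)
    {
        Mesh mesh = new Mesh ();
        Vector3[] vertices = new Vector3[points.Count];
        Vector2[] uv = new Vector2[points.Count];
        for (int i = 0; i < points.Count; i++) {
            vertices [i] = new Vector3 (points [i].x, points [i].y, 0);
            uv [i] = new Vector2 (points [i].x / Screen.width, 1f - points [i].y / Screen.height);
        }
        mesh.vertices = vertices;
        mesh.uv = uv;
        // Placeholder triangulation: a real implementation would triangulate
        // all 68 points (e.g. with a precomputed Delaunay index list).
        mesh.triangles = new int[] { 0, 1, 2 };
        mesh.RecalculateNormals ();

        GameObject go = new GameObject ("LandmarkMesh");
        go.AddComponent<MeshFilter> ().mesh = mesh;
        go.AddComponent<MeshRenderer> ().material = alphaMaterial;
        return mesh;
    }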
     
  19. Curious

    Curious

    Joined:
    Nov 19, 2009
    Posts:
    334
    How hard would it be to train a Dlib dataset, or use an already trained one, for, say, detecting and tracking people?
    Is there any trained people-tracking dataset that can be used with your asset?
     
  20. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
    Do you have any tutorial, or know of one, that will show me how to replace the mouth, nose, eyes, ears?
     
    ina likes this.
  21. Curious

    Curious

    Joined:
    Nov 19, 2009
    Posts:
    334
    Is it GPU accelerated? (CUDA and cuDNN support if available?)
     
  22. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  23. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  24. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  25. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Deleted User likes this.
  26. Deleted User

    Deleted User

    Guest

    @EnoxSoftware ,

    Do you think that cropping the webcam matrix preview depending on the orientation of the device (say, 15% from the left and right sides in landscape mode, or 15% off the top/bottom otherwise) around the webcam input, while still displaying the full-screen texture live, would help performance a little?

    I mention it because it seems that MSQRD, the app, is doing so. I guess Snapchat does the same, as I have seen lots of computer vision frameworks leveraging selective search, margin cropping, or pyramid-level parameters for performance improvements.

     
    ina likes this.
  27. Deleted User

    Deleted User

    Guest

    EnoxSoftware likes this.
  28. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
    I was wondering if you have any write-up or tutorial on how to replace the (objects) in the face tracking sample. Let's say I want to put on a Santa hat, Santa nose, and beard. How can I place these items into the ARObject and assign them in Quad?
     
    Last edited: Oct 25, 2016
  29. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    As you say, if you crop the Mat to a smaller size, performance will improve.
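    A sketch of the cropping idea (illustrative only; it assumes the detection loop from the AR samples, and shifts the detected rects back into full-frame coordinates afterwards):

    Code (CSharp):
    // Sketch: detect on a 15% left/right-cropped view of the frame.
    // The Mat ROI constructor shares memory with rgbaMat, so no copy is made.
    int marginX = (int)(rgbaMat.cols () * 0.15f);
    OpenCVForUnity.Rect roi = new OpenCVForUnity.Rect (marginX, 0, rgbaMat.cols () - marginX * 2, rgbaMat.rows ());
    Mat croppedMat = new Mat (rgbaMat, roi);

    OpenCVForUnityUtils.SetImage (faceLandmarkDetector, croppedMat);
    List<UnityEngine.Rect> detectResult = faceLandmarkDetector.Detect ();

    // Rects are in cropped coordinates; offset them back to the full frame.
    for (int i = 0; i < detectResult.Count; i++) {
        UnityEngine.Rect r = detectResult [i];
        detectResult [i] = new UnityEngine.Rect (r.x + marginX, r.y, r.width, r.height);
    }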
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  31. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
  32. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
    Why is all the 3D stuff I put into ARObject so tiny? What's the scale I should be using?
     
  33. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
    One last question: when I export to iOS, the screen is not full screen; it's like 288x320. How do I make the export full screen?
     
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    You do not need to edit the script.
    1. Place the object you want to display as a child of ARObjects.
    2. In order to be displayed on the ARCamera, please set the layer of the object to "UI".
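    If the prop you add has child objects, each child's layer matters too; a small helper (my own sketch, not part of the asset) can set the "UI" layer on the whole hierarchy:

    Code (CSharp):
    // Sketch: recursively assign a layer so every child renders on the ARCamera.
    static void SetLayerRecursively (GameObject obj, string layerName)
    {
        int layer = LayerMask.NameToLayer (layerName);
        foreach (Transform t in obj.GetComponentsInChildren<Transform> (true)) {
            t.gameObject.layer = layer;
        }
    }
    // Usage (with a hypothetical prop): SetLayerRecursively (santaHat, "UI");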
     
    minimalmally likes this.
  35. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    You need to change "if (widthScale < heightScale)" to "if (widthScale > heightScale)".
    Code (CSharp):
        if (widthScale > heightScale) {
            Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
            imageSizeScale = (float)Screen.height / (float)Screen.width;
        } else {
            Camera.main.orthographicSize = height / 2;
        }
    Code (CSharp):
        if (widthScale > heightScale) {
            ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
        } else {
            ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
        }
    dlibfacelandmarkdetector_fullscreen.PNG
     
    minimalmally likes this.
  36. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12

    Hi, I tried that and got the following errors:

    Assets/DlibFaceLandmarkDetectorWithOpenCVSample/VideoCaptureARSample/VideoCaptureARSample.cs(235,64): error CS0103: The name `fovx' does not exist in the current context

    Assets/DlibFaceLandmarkDetectorWithOpenCVSample/VideoCaptureARSample/VideoCaptureARSample.cs(235,64): error CS0103: The name `fovy' does not exist in the current context
     


  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    You need to update to Dlib FaceLandmark Detector v1.0.2.
    https://www.assetstore.unity3d.com/en/#!/content/64314
     
  38. Gunhi

    Gunhi

    Joined:
    Apr 18, 2012
    Posts:
    300
    Just bought and played around with your asset. There are a few things:
    - The Optimized Sample only works with OpenCV for Unity. We need a sample that works with FaceLandmark alone, because the included sample slows the device down badly.
    - Also, please include AR samples such as EysLaser and MouthLaser.
    Thanks
     
    EnoxSoftware likes this.
  39. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12
    When you hold the camera still and you're not moving, the ARObjects shake a bit; it's very noisy. Is there any way to not update the ARObject when there are only very tiny changes?
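    One common way to damp this kind of jitter (a sketch, not from the asset or this thread) is to smooth or dead-zone the pose before applying it to the object:

    Code (CSharp):
    // Sketch: lerp toward each new pose and ignore sub-threshold movement.
    // The "smoothing" and "deadZone" values are illustrative and need tuning.
    public float smoothing = 0.5f;   // 0 = no smoothing, 1 = frozen
    public float deadZone = 0.005f;  // ignore movements smaller than this

    void ApplySmoothedPose (Transform target, Vector3 newPos, Quaternion newRot)
    {
        if ((newPos - target.localPosition).magnitude < deadZone)
            return; // treat tiny changes as tracking noise
        target.localPosition = Vector3.Lerp (target.localPosition, newPos, 1f - smoothing);
        target.localRotation = Quaternion.Slerp (target.localRotation, newRot, 1f - smoothing);
    }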
     
  40. minimalmally

    minimalmally

    Joined:
    Oct 21, 2016
    Posts:
    12


  41. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    if (widthScale < heightScale) {
    to
    if (widthScale > heightScale) {

    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;
    using DlibFaceLandmarkDetector;

    namespace DlibFaceLandmarkDetectorSample
    {
        /// <summary>
        /// Face tracker AR from WebCamTexture Sample.
        /// This sample was referring to http://www.morethantechnical.com/2012/10/17/head-pose-estimation-with-opencv-opengl-revisited-w-code/
        /// and use effect asset from http://ktk-kumamoto.hatenablog.com/entry/2014/09/14/092400
        /// </summary>
        [RequireComponent(typeof(WebCamTextureToMatHelper))]
        public class WebCamTextureARSample : MonoBehaviour
        {
            /// <summary>
            /// The should draw face points.
            /// </summary>
            public bool shouldDrawFacePoints;

            /// <summary>
            /// The should draw axes.
            /// </summary>
            public bool shouldDrawAxes;

            /// <summary>
            /// The should draw head.
            /// </summary>
            public bool shouldDrawHead;

            /// <summary>
            /// The should draw effects.
            /// </summary>
            public bool shouldDrawEffects;

            /// <summary>
            /// The axes.
            /// </summary>
            public GameObject axes;

            /// <summary>
            /// The head.
            /// </summary>
            public GameObject head;

            /// <summary>
            /// The right eye.
            /// </summary>
            public GameObject rightEye;

            /// <summary>
            /// The left eye.
            /// </summary>
            public GameObject leftEye;

            /// <summary>
            /// The mouth.
            /// </summary>
            public GameObject mouth;

            /// <summary>
            /// The mouth particle system.
            /// </summary>
            ParticleSystem[] mouthParticleSystem;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The face landmark detector.
            /// </summary>
            FaceLandmarkDetector faceLandmarkDetector;

            /// <summary>
            /// The AR camera.
            /// </summary>
            public Camera ARCamera;

            /// <summary>
            /// The cam matrix.
            /// </summary>
            Mat camMatrix;

            /// <summary>
            /// The dist coeffs.
            /// </summary>
            MatOfDouble distCoeffs;

            /// <summary>
            /// The invert Y.
            /// </summary>
            Matrix4x4 invertYM;

            /// <summary>
            /// The transformation m.
            /// </summary>
            Matrix4x4 transformationM = new Matrix4x4 ();

            /// <summary>
            /// The invert Z.
            /// </summary>
            Matrix4x4 invertZM;

            /// <summary>
            /// The ar m.
            /// </summary>
            Matrix4x4 ARM;

            /// <summary>
            /// The ar game object.
            /// </summary>
            public GameObject ARGameObject;

            /// <summary>
            /// The should move AR camera.
            /// </summary>
            public bool shouldMoveARCamera;

            /// <summary>
            /// The 3d face object points.
            /// </summary>
            MatOfPoint3f objectPoints;

            /// <summary>
            /// The image points.
            /// </summary>
            MatOfPoint2f imagePoints;

            /// <summary>
            /// The rvec.
            /// </summary>
            Mat rvec;

            /// <summary>
            /// The tvec.
            /// </summary>
            Mat tvec;

            /// <summary>
            /// The rot m.
            /// </summary>
            Mat rotM;

            /// <summary>
            /// The web cam texture to mat helper.
            /// </summary>
            WebCamTextureToMatHelper webCamTextureToMatHelper;

            // Use this for initialization
            void Start ()
            {
                //set 3d face object points.
                objectPoints = new MatOfPoint3f (
                    new Point3 (-31, 72, 86),//l eye
                    new Point3 (31, 72, 86),//r eye
                    new Point3 (0, 40, 114),//nose
                    new Point3 (-20, 15, 90),//l mouth
                    new Point3 (20, 15, 90),//r mouth
                    new Point3 (-69, 76, -2),//l ear
                    new Point3 (69, 76, -2)//r ear
                );
                imagePoints = new MatOfPoint2f ();
                rvec = new Mat ();
                tvec = new Mat ();
                rotM = new Mat (3, 3, CvType.CV_64FC1);

                faceLandmarkDetector = new FaceLandmarkDetector (DlibFaceLandmarkDetector.Utils.getFilePath ("shape_predictor_68_face_landmarks.dat"));

                webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
                webCamTextureToMatHelper.Init ();
            }

            /// <summary>
            /// Raises the web cam texture to mat helper inited event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInited ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInited");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();

                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;

                gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

                float width = webCamTextureMat.width ();
                float height = webCamTextureMat.height ();

                float imageSizeScale = 1.0f;
                float widthScale = (float)Screen.width / width;
                float heightScale = (float)Screen.height / height;
                if (widthScale > heightScale) {
                    Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
                    imageSizeScale = (float)Screen.height / (float)Screen.width;
                } else {
                    Camera.main.orthographicSize = height / 2;
                }

                //set cameraparam
                int max_d = (int)Mathf.Max (width, height);
                double fx = max_d;
                double fy = max_d;
                double cx = width / 2.0f;
                double cy = height / 2.0f;
                camMatrix = new Mat (3, 3, CvType.CV_64FC1);
                camMatrix.put (0, 0, fx);
                camMatrix.put (0, 1, 0);
                camMatrix.put (0, 2, cx);
                camMatrix.put (1, 0, 0);
                camMatrix.put (1, 1, fy);
                camMatrix.put (1, 2, cy);
                camMatrix.put (2, 0, 0);
                camMatrix.put (2, 1, 0);
                camMatrix.put (2, 2, 1.0f);
                Debug.Log ("camMatrix " + camMatrix.dump ());

                distCoeffs = new MatOfDouble (0, 0, 0, 0);
                Debug.Log ("distCoeffs " + distCoeffs.dump ());

                //calibration camera
                Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
                double apertureWidth = 0;
                double apertureHeight = 0;
                double[] fovx = new double[1];
                double[] fovy = new double[1];
                double[] focalLength = new double[1];
                Point principalPoint = new Point (0, 0);
                double[] aspectratio = new double[1];

                Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);

                Debug.Log ("imageSize " + imageSize.ToString ());
                Debug.Log ("apertureWidth " + apertureWidth);
                Debug.Log ("apertureHeight " + apertureHeight);
                Debug.Log ("fovx " + fovx [0]);
                Debug.Log ("fovy " + fovy [0]);
                Debug.Log ("focalLength " + focalLength [0]);
                Debug.Log ("principalPoint " + principalPoint.ToString ());
                Debug.Log ("aspectratio " + aspectratio [0]);

                //To convert the difference of the FOV value of the OpenCV and Unity.
                double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
                double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));

                Debug.Log ("fovXScale " + fovXScale);
                Debug.Log ("fovYScale " + fovYScale);

                //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
                if (widthScale > heightScale) {
                    ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
                } else {
                    ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
                }

                invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
                Debug.Log ("invertYM " + invertYM.ToString ());

                invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, 1, -1));
                Debug.Log ("invertZM " + invertZM.ToString ());

                axes.SetActive (false);
                head.SetActive (false);
                rightEye.SetActive (false);
                leftEye.SetActive (false);
                mouth.SetActive (false);

                mouthParticleSystem = mouth.GetComponentsInChildren<ParticleSystem> (true);
            }

            /// <summary>
            /// Raises the web cam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                camMatrix.Dispose ();
                distCoeffs.Dispose ();
            }

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.isPlaying () && webCamTextureToMatHelper.didUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbaMat);

                    //detect face rects
                    List<UnityEngine.Rect> detectResult = faceLandmarkDetector.Detect ();

                    if (detectResult.Count > 0) {

                        //detect landmark points
                        List<Vector2> points = faceLandmarkDetector.DetectLandmark (detectResult [0]);

                        if (points.Count > 0) {
                            if (shouldDrawFacePoints)
                                OpenCVForUnityUtils.DrawFaceLandmark (rgbaMat, points, new Scalar (0, 255, 0, 255), 2);

                            imagePoints.fromArray (
                                new Point ((points [38].x + points [41].x) / 2, (points [38].y + points [41].y) / 2),//l eye
                                new Point ((points [43].x + points [46].x) / 2, (points [43].y + points [46].y) / 2),//r eye
                                new Point (points [33].x, points [33].y),//nose
                                new Point (points [48].x, points [48].y),//l mouth
                                new Point (points [54].x, points [54].y),//r mouth
                                new Point (points [0].x, points [0].y),//l ear
                                new Point (points [16].x, points [16].y)//r ear
                            );

                            Calib3d.solvePnP (objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec);

                            if (tvec.get (2, 0) [0] > 0) {

                                if (Mathf.Abs ((float)(points [43].y - points [46].y)) > Mathf.Abs ((float)(points [42].x - points [45].x)) / 6.0) {
                                    if (shouldDrawEffects)
                                        rightEye.SetActive (true);
                                }

                                if (Mathf.Abs ((float)(points [38].y - points [41].y)) > Mathf.Abs ((float)(points [39].x - points [36].x)) / 6.0) {
                                    if (shouldDrawEffects)
                                        leftEye.SetActive (true);
                                }
                                if (shouldDrawHead)
                                    head.SetActive (true);
                                if (shouldDrawAxes)
                                    axes.SetActive (true);

                                float noseDistance = Mathf.Abs ((float)(points [27].y - points [33].y));
                                float mouseDistance = Mathf.Abs ((float)(points [62].y - points [66].y));
                                if (mouseDistance > noseDistance / 5.0) {
                                    if (shouldDrawEffects) {
                                        mouth.SetActive (true);
                                        foreach (ParticleSystem ps in mouthParticleSystem) {
                                            ps.enableEmission = true;
                                            ps.startSize = 40 * (mouseDistance / noseDistance);
                                        }
                                    }
                                } else {
                                    if (shouldDrawEffects) {
                                        foreach (ParticleSystem ps in mouthParticleSystem) {
                                            ps.enableEmission = false;
                                        }
                                    }
                                }

                                Calib3d.Rodrigues (rvec, rotM);

                                transformationM.SetRow (0, new Vector4 ((float)rotM.get (0, 0) [0], (float)rotM.get (0, 1) [0], (float)rotM.get (0, 2) [0], (float)tvec.get (0, 0) [0]));
                                transformationM.SetRow (1, new Vector4 ((float)rotM.get (1, 0) [0], (float)rotM.get (1, 1) [0], (float)rotM.get (1, 2) [0], (float)tvec.get (1, 0) [0]));
                                transformationM.SetRow (2, new Vector4 ((float)rotM.get (2, 0) [0], (float)rotM.get (2, 1) [0], (float)rotM.get (2, 2) [0], (float)tvec.get (2, 0) [0]));
                                transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));

                                if (shouldMoveARCamera) {
                                    if (ARGameObject != null) {
                                        ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;
                                        ARUtils.SetTransformFromMatrix (ARCamera.transform, ref ARM);
                                        ARGameObject.SetActive (true);
                                    }
                                } else {
                                    ARM = ARCamera.transform.localToWorldMatrix * invertYM * transformationM * invertZM;

                                    if (ARGameObject != null) {
                                        ARUtils.SetTransformFromMatrix (ARGameObject.transform, ref ARM);
                                        ARGameObject.SetActive (true);
                                    }
                                }
                            }
                        }
                    }

                    Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);

                    OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());
                }
            }

            /// <summary>
            /// Raises the disable event.
            /// </summary>
            void OnDisable ()
            {
                webCamTextureToMatHelper.Dispose ();

                faceLandmarkDetector.Dispose ();
            }

            /// <summary>
            /// Raises the back button event.
            /// </summary>
            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("DlibFaceLandmarkDetectorSample");
                #else
                Application.LoadLevel ("DlibFaceLandmarkDetectorSample");
                #endif
            }

            /// <summary>
            /// Raises the play button event.
            /// </summary>
            public void OnPlayButton ()
            {
                webCamTextureToMatHelper.Play ();
            }

            /// <summary>
            /// Raises the pause button event.
            /// </summary>
            public void OnPauseButton ()
            {
                webCamTextureToMatHelper.Pause ();
            }

            /// <summary>
            /// Raises the stop button event.
            /// </summary>
            public void OnStopButton ()
            {
                webCamTextureToMatHelper.Stop ();
            }

            /// <summary>
            /// Raises the change camera button event.
            /// </summary>
            public void OnChangeCameraButton ()
            {
                webCamTextureToMatHelper.Init (null, webCamTextureToMatHelper.requestWidth, webCamTextureToMatHelper.requestHeight, !webCamTextureToMatHelper.requestIsFrontFacing);
            }

            public void OnDrawFacePointsButton ()
            {
                if (shouldDrawFacePoints) {
                    shouldDrawFacePoints = false;
                } else {
                    shouldDrawFacePoints = true;
                }
            }

            public void OnDrawAxesButton ()
            {
                if (shouldDrawAxes) {
                    shouldDrawAxes = false;
                    axes.SetActive (false);
                } else {
                    shouldDrawAxes = true;
                }
            }

            public void OnDrawHeadButton ()
            {
                if (shouldDrawHead) {
                    shouldDrawHead = false;
                    head.SetActive (false);
                } else {
                    shouldDrawHead = true;
                }
            }

            public void OnDrawEffectsButton ()
            {
                if (shouldDrawEffects) {
                    shouldDrawEffects = false;
                    rightEye.SetActive (false);
                    leftEye.SetActive (false);
                    mouth.SetActive (false);
                } else {
                    shouldDrawEffects = true;
                }
            }
        }
    }
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4.  
    5. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    6. using UnityEngine.SceneManagement;
    7. #endif
    8. using OpenCVForUnity;
    9. using DlibFaceLandmarkDetector;
    10.  
    11. namespace DlibFaceLandmarkDetectorSample
    12. {
    13.     /// <summary>
    14.     /// Face tracker AR from VideoCapture Sample.
    15.     /// This sample was referring to http://www.morethantechnical.com/2012/10/17/head-pose-estimation-with-opencv-opengl-revisited-w-code/
    16.     /// and use effect asset from http://ktk-kumamoto.hatenablog.com/entry/2014/09/14/092400
    17.     /// </summary>
    18.     public class VideoCaptureARSample : MonoBehaviour
    19.     {
    20.         /// <summary>
    21.         /// The should draw face points.
    22.         /// </summary>
    23.         public bool shouldDrawFacePoints;
    24.  
    25.         /// <summary>
    26.         /// The should draw axes.
    27.         /// </summary>
    28.         public bool shouldDrawAxes;
    29.  
    30.         /// <summary>
    31.         /// The should draw head.
    32.         /// </summary>
    33.         public bool shouldDrawHead;
    34.  
    35.         /// <summary>
    36.         /// The should draw effects.
    37.         /// </summary>
    38.         public bool shouldDrawEffects;
    39.        
    40.         /// <summary>
    41.         /// The axes.
    42.         /// </summary>
    43.         public GameObject axes;
    44.        
    45.         /// <summary>
    46.         /// The head.
    47.         /// </summary>
    48.         public GameObject head;
    49.        
    50.         /// <summary>
    51.         /// The right eye.
    52.         /// </summary>
    53.         public GameObject rightEye;
    54.        
    55.         /// <summary>
    56.         /// The left eye.
    57.         /// </summary>
    58.         public GameObject leftEye;
    59.        
    60.         /// <summary>
    61.         /// The mouth.
    62.         /// </summary>
    63.         public GameObject mouth;
    64.  
    65.         /// <summary>
    66.         /// The mouth particle system.
    67.         /// </summary>
    68.         ParticleSystem[] mouthParticleSystem;
    69.        
    70.         /// <summary>
    71.         /// The colors.
    72.         /// </summary>
    73.         Color32[] colors;
    74.        
    75.         /// <summary>
    76.         /// The AR camera.
    77.         /// </summary>
    78.         public Camera ARCamera;
    79.        
    80.         /// <summary>
    81.         /// The cam matrix.
    82.         /// </summary>
    83.         Mat camMatrix;
    84.        
    85.         /// <summary>
    86.         /// The dist coeffs.
    87.         /// </summary>
    88.         MatOfDouble distCoeffs;
    89.        
    90.         /// <summary>
    91.         /// The invert Y.
    92.         /// </summary>
    93.         Matrix4x4 invertYM;
    94.        
    95.         /// <summary>
    96.         /// The transformation m.
    97.         /// </summary>
    98.         Matrix4x4 transformationM = new Matrix4x4 ();
    99.        
    100.         /// <summary>
    101.         /// The invert Z.
    102.         /// </summary>
    103.         Matrix4x4 invertZM;
    104.        
    105.         /// <summary>
    106.         /// The ar m.
    107.         /// </summary>
    108.         Matrix4x4 ARM;
    109.  
    110.         /// <summary>
    111.         /// The ar game object.
    112.         /// </summary>
    113.         public GameObject ARGameObject;
    114.  
    115.         /// <summary>
    116.         /// The should move AR camera.
    117.         /// </summary>
    118.         public bool shouldMoveARCamera;
    119.        
    120.         /// <summary>
    121.         /// The 3d face object points.
    122.         /// </summary>
    123.         MatOfPoint3f objectPoints;
    124.        
    125.         /// <summary>
    126.         /// The image points.
    127.         /// </summary>
    128.         MatOfPoint2f imagePoints;
    129.        
    130.         /// <summary>
    131.         /// The rvec.
    132.         /// </summary>
    133.         Mat rvec;
    134.        
    135.         /// <summary>
    136.         /// The tvec.
    137.         /// </summary>
    138.         Mat tvec;
    139.        
    140.         /// <summary>
    141.         /// The rot m.
    142.         /// </summary>
    143.         Mat rotM;
    144.  
    145.         /// <summary>
    146.         /// The width of the frame.
    147.         /// </summary>
    148.         private double frameWidth = 320;
    149.        
    150.         /// <summary>
    151.         /// The height of the frame.
    152.         /// </summary>
    153.         private double frameHeight = 240;
    154.        
    155.         /// <summary>
    156.         /// The capture.
    157.         /// </summary>
    158.         VideoCapture capture;
    159.        
    160.         /// <summary>
    161.         /// The rgb mat.
    162.         /// </summary>
    163.         Mat rgbMat;
    164.        
    165.         /// <summary>
    166.         /// The texture.
    167.         /// </summary>
    168.         Texture2D texture;
    169.        
    170.         /// <summary>
    171.         /// The face landmark detector.
    172.         /// </summary>
    173.         FaceLandmarkDetector faceLandmarkDetector;
    174.        
    175.         // Use this for initialization
    176.         void Start ()
    177.         {
    178.             //set 3d face object points.
    179.             objectPoints = new MatOfPoint3f (
    180.                 new Point3 (-31, 72, 86),//l eye
    181.                 new Point3 (31, 72, 86),//r eye
    182.                 new Point3 (0, 40, 114),//nose
    183.                 new Point3 (-20, 15, 90),//l mouth
    184.                 new Point3 (20, 15, 90),//r mouth
    185.                 new Point3 (-69, 76, -2),//l ear
    186.                 new Point3 (69, 76, -2)//r ear
    187.             );
    188.             imagePoints = new MatOfPoint2f ();
    189.             rvec = new Mat ();
    190.             tvec = new Mat ();
    191.             rotM = new Mat (3, 3, CvType.CV_64FC1);
    192.  
    193.             faceLandmarkDetector = new FaceLandmarkDetector (DlibFaceLandmarkDetector.Utils.getFilePath ("shape_predictor_68_face_landmarks.dat"));
    194.  
    195.             rgbMat = new Mat ();
    196.            
    197.             capture = new VideoCapture ();
    198.             capture.open (OpenCVForUnity.Utils.getFilePath ("dance.avi"));
    199.            
    200.             if (capture.isOpened ()) {
    201.                 Debug.Log ("capture.isOpened() true");
    202.             } else {
    203.                 Debug.Log ("capture.isOpened() false");
    204.             }
    205.            
    206.            
    207.             Debug.Log ("CAP_PROP_FORMAT: " + capture.get (Videoio.CAP_PROP_FORMAT));
    208.             Debug.Log ("CV_CAP_PROP_PREVIEW_FORMAT: " + capture.get (Videoio.CV_CAP_PROP_PREVIEW_FORMAT));
    209.             Debug.Log ("CAP_PROP_POS_MSEC: " + capture.get (Videoio.CAP_PROP_POS_MSEC));
    210.             Debug.Log ("CAP_PROP_POS_FRAMES: " + capture.get (Videoio.CAP_PROP_POS_FRAMES));
    211.             Debug.Log ("CAP_PROP_POS_AVI_RATIO: " + capture.get (Videoio.CAP_PROP_POS_AVI_RATIO));
    212.             Debug.Log ("CAP_PROP_FRAME_COUNT: " + capture.get (Videoio.CAP_PROP_FRAME_COUNT));
    213.             Debug.Log ("CAP_PROP_FPS: " + capture.get (Videoio.CAP_PROP_FPS));
    214.             Debug.Log ("CAP_PROP_FRAME_WIDTH: " + capture.get (Videoio.CAP_PROP_FRAME_WIDTH));
    215.             Debug.Log ("CAP_PROP_FRAME_HEIGHT: " + capture.get (Videoio.CAP_PROP_FRAME_HEIGHT));
    216.            
    217.             colors = new Color32[(int)(frameWidth) * (int)(frameHeight)];
    218.             texture = new Texture2D ((int)(frameWidth), (int)(frameHeight), TextureFormat.RGBA32, false);
    219.  
    220.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    221.  
    222.             gameObject.transform.localScale = new Vector3 ((float)frameWidth, (float)frameHeight, 1);
    223.            
    224.             Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    225.  
    226.  
    227.             float width = (float)frameWidth;
    228.             float height = (float)frameHeight;
    229.            
    230.             float imageSizeScale = 1.0f;
    231.             float widthScale = (float)Screen.width / width;
    232.             float heightScale = (float)Screen.height / height;
    233.             if (widthScale > heightScale) {
    234.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    235.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    236.             } else {
    237.                 Camera.main.orthographicSize = height / 2;
    238.             }
    239.            
    240.            
    241.             //set cameraparam
    242.             int max_d = (int)Mathf.Max (width, height);
    243.             double fx = max_d;
    244.             double fy = max_d;
    245.             double cx = width / 2.0f;
    246.             double cy = height / 2.0f;
    247.             camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    248.             camMatrix.put (0, 0, fx);
    249.             camMatrix.put (0, 1, 0);
    250.             camMatrix.put (0, 2, cx);
    251.             camMatrix.put (1, 0, 0);
    252.             camMatrix.put (1, 1, fy);
    253.             camMatrix.put (1, 2, cy);
    254.             camMatrix.put (2, 0, 0);
    255.             camMatrix.put (2, 1, 0);
    256.             camMatrix.put (2, 2, 1.0f);
    257.             Debug.Log ("camMatrix " + camMatrix.dump ());
    258.            
    259.            
    260.             distCoeffs = new MatOfDouble (0, 0, 0, 0);
    261.             Debug.Log ("distCoeffs " + distCoeffs.dump ());
    262.            
    263.            
    264.             //calibration camera
    265.             Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
    266.             double apertureWidth = 0;
    267.             double apertureHeight = 0;
    268.             double[] fovx = new double[1];
    269.             double[] fovy = new double[1];
    270.             double[] focalLength = new double[1];
    271.             Point principalPoint = new Point (0, 0);
    272.             double[] aspectratio = new double[1];
    273.            
    274.             Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    275.            
    276.             Debug.Log ("imageSize " + imageSize.ToString ());
    277.             Debug.Log ("apertureWidth " + apertureWidth);
    278.             Debug.Log ("apertureHeight " + apertureHeight);
    279.             Debug.Log ("fovx " + fovx [0]);
    280.             Debug.Log ("fovy " + fovy [0]);
    281.             Debug.Log ("focalLength " + focalLength [0]);
    282.             Debug.Log ("principalPoint " + principalPoint.ToString ());
    283.             Debug.Log ("aspectratio " + aspectratio [0]);
    284.            
    285.            
    286.             //To convert the difference of the FOV value of the OpenCV and Unity.
    287.             double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
    288.             double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));
    289.            
    290.             Debug.Log ("fovXScale " + fovXScale);
    291.             Debug.Log ("fovYScale " + fovYScale);
    292.            
    293.            
    294.             //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    295.             if (widthScale > heightScale) {
    296.                 ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    297.             } else {
    298.                 ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    299.             }
    300.            
    301.            
    302.            
    303.             invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
    304.             Debug.Log ("invertYM " + invertYM.ToString ());
    305.            
    306.             invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, 1, -1));
    307.             Debug.Log ("invertZM " + invertZM.ToString ());
    308.            
    309.            
    310.             axes.SetActive (false);
    311.             head.SetActive (false);
    312.             rightEye.SetActive (false);
    313.             leftEye.SetActive (false);
    314.             mouth.SetActive (false);
    315.  
    316.  
    317.             mouthParticleSystem = mouth.GetComponentsInChildren<ParticleSystem> (true);
    318.        
    319.         }
    320.  
    321.         // Update is called once per frame
    322.         void Update ()
    323.         {
    324.  
    325.             //Loop play
    326.             if (capture.get (Videoio.CAP_PROP_POS_FRAMES) >= capture.get (Videoio.CAP_PROP_FRAME_COUNT))
    327.                 capture.set (Videoio.CAP_PROP_POS_FRAMES, 0);
    328.  
    329.             if (capture.grab ()) {
    330.                
    331.                 capture.retrieve (rgbMat, 0);
    332.                
    333.                 Imgproc.cvtColor (rgbMat, rgbMat, Imgproc.COLOR_BGR2RGB);
    334.                 //Debug.Log ("Mat toString " + rgbMat.ToString ());
    335.  
    336.  
    337.                 OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbMat);
    338.  
    339.                 //detect face rects
    340.                 List<UnityEngine.Rect> detectResult = faceLandmarkDetector.Detect ();
    341.  
    342.                 if (detectResult.Count > 0) {
    343.  
    344.                     //detect landmark points
    345.                     List<Vector2> points = faceLandmarkDetector.DetectLandmark (detectResult [0]);
    346.  
    347.                     if (points.Count > 0) {
    348.                         if (shouldDrawFacePoints)
    349.                             OpenCVForUnityUtils.DrawFaceLandmark (rgbMat, points, new Scalar (0, 255, 0), 2);
    350.  
    351.                         imagePoints.fromArray (
    352.                             new Point ((points [38].x + points [41].x) / 2, (points [38].y + points [41].y) / 2),//l eye
    353.                             new Point ((points [43].x + points [46].x) / 2, (points [43].y + points [46].y) / 2),//r eye
    354.                             new Point (points [33].x, points [33].y),//nose
    355.                             new Point (points [48].x, points [48].y),//l mouth
    356.                             new Point (points [54].x, points [54].y) //r mouth
    357.                                                         ,
    358.                             new Point (points [0].x, points [0].y),//l ear
    359.                             new Point (points [16].x, points [16].y)//r ear
    360.                         );
    361.                                                                        
    362.                                                                        
    363.                         Calib3d.solvePnP (objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec);
    364.  
    365.                        
    366.                         if (tvec.get (2, 0) [0] > 0) {
    367.  
    368.                             if (Mathf.Abs ((float)(points [43].y - points [46].y)) > Mathf.Abs ((float)(points [42].x - points [45].x)) / 6.0) {
    369.                                 if (shouldDrawEffects)
    370.                                     rightEye.SetActive (true);
    371.                             }
    372.  
    373.                             if (Mathf.Abs ((float)(points [38].y - points [41].y)) > Mathf.Abs ((float)(points [39].x - points [36].x)) / 6.0) {
    374.                                 if (shouldDrawEffects)
    375.                                     leftEye.SetActive (true);
    376.                             }
    377.                             if (shouldDrawHead)
    378.                                 head.SetActive (true);
    379.                             if (shouldDrawAxes)
    380.                                 axes.SetActive (true);
    381.                                                    
    382.                                                    
    383.  
    384.                             float noseDistance = Mathf.Abs ((float)(points [27].y - points [33].y));
    385.                             float mouthDistance = Mathf.Abs ((float)(points [62].y - points [66].y));
    386.                             if (mouthDistance > noseDistance / 5.0) {
    387.                                 if (shouldDrawEffects) {
    388.                                     mouth.SetActive (true);
    389.                                     foreach (ParticleSystem ps in mouthParticleSystem) {
    390.                                         ps.enableEmission = true;
    391.                                         ps.startSize = 40 * (mouthDistance / noseDistance);
    392.                                     }
    393.                                 }
    394.                             } else {
    395.                                 if (shouldDrawEffects) {
    396.                                     foreach (ParticleSystem ps in mouthParticleSystem) {
    397.                                         ps.enableEmission = false;
    398.                                     }
    399.                                 }
    400.                             }
    401.                                                    
    402.                                                    
    403.                             Calib3d.Rodrigues (rvec, rotM);
    404.                                                    
    405.                             transformationM.SetRow (0, new Vector4 ((float)rotM.get (0, 0) [0], (float)rotM.get (0, 1) [0], (float)rotM.get (0, 2) [0], (float)tvec.get (0, 0) [0]));
    406.                             transformationM.SetRow (1, new Vector4 ((float)rotM.get (1, 0) [0], (float)rotM.get (1, 1) [0], (float)rotM.get (1, 2) [0], (float)tvec.get (1, 0) [0]));
    407.                             transformationM.SetRow (2, new Vector4 ((float)rotM.get (2, 0) [0], (float)rotM.get (2, 1) [0], (float)rotM.get (2, 2) [0], (float)tvec.get (2, 0) [0]));
    408.                             transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));
    409.                                                    
    410.                             if (shouldMoveARCamera) {
    411.  
    412.                                 if (ARGameObject != null) {
    413.                                     ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;
    414.                                     ARUtils.SetTransformFromMatrix (ARCamera.transform, ref ARM);
    415.                                     ARGameObject.SetActive (true);
    416.                                 }
    417.                             } else {
    418.                                 ARM = ARCamera.transform.localToWorldMatrix * invertYM * transformationM * invertZM;
    419.  
    420.                                 if (ARGameObject != null) {
    421.                                     ARUtils.SetTransformFromMatrix (ARGameObject.transform, ref ARM);
    422.                                     ARGameObject.SetActive (true);
    423.                                 }
    424.                             }
    425.                         }
    426.                     }
    427.                 } else {
    428.                     rightEye.SetActive (false);
    429.                     leftEye.SetActive (false);
    430.                     head.SetActive (false);
    431.                     mouth.SetActive (false);
    432.                     axes.SetActive (false);
    433.                 }
    434.                                        
    435.                 Imgproc.putText (rgbMat, "W:" + rgbMat.width () + " H:" + rgbMat.height () + " SO:" + Screen.orientation, new Point (5, rgbMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255), 1, Imgproc.LINE_AA, false);
    436.                                        
    437.                 OpenCVForUnity.Utils.matToTexture2D (rgbMat, texture, colors);
    438.                                        
    439.             }
    440.                    
    441.         }
    442.                
    443.         /// <summary>
    444.         /// Raises the disable event.
    445.         /// </summary>
    446.         void OnDisable ()
    447.         {
    448.             camMatrix.Dispose ();
    449.             distCoeffs.Dispose ();
    450.  
    451.             faceLandmarkDetector.Dispose ();
    452.         }
    453.        
    454.         /// <summary>
    455.         /// Raises the back button event.
    456.         /// </summary>
    457.         public void OnBackButton ()
    458.         {
    459.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    460.             SceneManager.LoadScene ("DlibFaceLandmarkDetectorSample");
    461.             #else
    462.             Application.LoadLevel ("DlibFaceLandmarkDetectorSample");
    463.             #endif
    464.         }
    465.                
    466.         public void OnDrawFacePointsButton ()
    467.         {
    468.             if (shouldDrawFacePoints) {
    469.                 shouldDrawFacePoints = false;
    470.             } else {
    471.                 shouldDrawFacePoints = true;
    472.             }
    473.         }
    474.                
    475.         public void OnDrawAxesButton ()
    476.         {
    477.             if (shouldDrawAxes) {
    478.                 shouldDrawAxes = false;
    479.                 axes.SetActive (false);
    480.             } else {
    481.                 shouldDrawAxes = true;
    482.             }
    483.         }
    484.                
    485.         public void OnDrawHeadButton ()
    486.         {
    487.             if (shouldDrawHead) {
    488.                 shouldDrawHead = false;
    489.                 head.SetActive (false);
    490.             } else {
    491.                 shouldDrawHead = true;
    492.             }
    493.         }
    494.  
    495.         public void OnDrawEffectsButton ()
    496.         {
    497.             if (shouldDrawEffects) {
    498.                 shouldDrawEffects = false;
    499.                 rightEye.SetActive (false);
    500.                 leftEye.SetActive (false);
    501.                 mouth.SetActive (false);
    502.             } else {
    503.                 shouldDrawEffects = true;
    504.             }
    505.         }
    506.  
    507.     }
    508. }
     
  42. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    I have not tried it yet, but applying a low-pass filter to the acquired face landmark points may reduce the noise.
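    For example, a minimal exponential low-pass filter could be applied to each frame's landmark points before they are used. This is only a sketch of the idea, not part of the plugin's API; the class name and the alpha value are illustrative. It is written against System.Numerics.Vector2 so it is self-contained, but the same code works with UnityEngine.Vector2.

```csharp
using System.Numerics;

// Sketch: exponential low-pass filter for landmark points.
// alpha in (0, 1]: smaller values smooth more, but add more lag.
public class LandmarkSmoother
{
    readonly float alpha;
    Vector2[] filtered;

    public LandmarkSmoother (float alpha)
    {
        this.alpha = alpha;
    }

    // Blend each new point with its filtered history:
    // filtered = filtered + alpha * (raw - filtered)
    public Vector2[] Smooth (Vector2[] points)
    {
        if (filtered == null || filtered.Length != points.Length)
            filtered = (Vector2[])points.Clone (); // first frame: no history yet

        for (int i = 0; i < points.Length; i++)
            filtered [i] += alpha * (points [i] - filtered [i]);

        return filtered;
    }
}
```

    In the AR samples this would sit between `faceLandmarkDetector.DetectLandmark (...)` and `imagePoints.fromArray (...)`, so that solvePnP sees the smoothed points.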
     
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Unfortunately, a "missing script" error occurs in version 1.0.2.
    You can fix this bug by attaching "WebCamTextureToMatHelper.cs" to the "Quad" game object.
    dlibfacelandmark_attach_webcamhelper.png
    I have submitted a fixed version to the Asset Store.
     
  44. Molys

    Molys

    Joined:
    Nov 22, 2016
    Posts:
    1
    Hi,
    Debug output for DlibFaceLandmarkDetectorWithOpenCVSample:
    Error CS0246: The type or namespace name "FaceLandmarkDetector" could not be found ("FaceLandmarkDetector faceLandmarkDetector"; are you missing a using directive or an assembly reference?) DlibFaceLandmarkDetector.CSharp

    Thanks!
     
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Does "Assets\DlibFaceLandmarkDetector\Scripts\FaceLandmarkDetector.cs" exist?
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Dlib FaceLandmark Detector v1.0.3 is now available.

    Version changes
    1.0.3
    [WebGL] Added WebGL (beta) support. (Unity 5.3 or later)
    [Common] Fixed missing script error. (WebCamTextureToMatHelper.cs)
    [Common] Added shape_predictor_68_face_landmarks_for_mobile.dat.

     
  47. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Dlib FaceLandmark Detector v1.0.4 is now available.

    Version changes
    1.0.4
    [Common] Updated shape_predictor_68_face_landmarks_for_mobile.dat.
     
  48. rebit

    rebit

    Joined:
    Nov 13, 2014
    Posts:
    133
  49. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Facial Action Unit Recognition, Gaze tracking and Facial Feature Extraction are interesting.

    There seems to be such an example using "Dlib FaceLandmark Detector".
    https://twitter.com/wakasoftware/status/804773538112544768
     
  50. yhiguchi621

    yhiguchi621

    Joined:
    Nov 19, 2016
    Posts:
    1
    I have the same problem now. Please tell me if yours is resolved.