Kinect with MS-SDK

Discussion in 'Assets and Asset Store' started by roumenf, Dec 17, 2013.

  1. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    If you mean the 'Squat' gesture - yes, it recognizes squatting (as action), not staying in squat-pose.
     
  2. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    As far as I remember, this asset does not track the head rotation, because only the face-tracking provides this info. I.e. you will need the FaceTrackingManager from the KinectExtras-asset or use the K2-asset (it has all in one). There are API-functions to get the current head position and rotation. In the K2-asset you can get the head orientation the same way as for all other joints, too.
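    For illustration, here is a minimal sketch of reading the head position and orientation via the KinectManager. The method names (Instance, GetPrimaryUserID, IsJointTracked, GetJointPosition, GetJointOrientation) follow the K2-asset's conventions as I remember them; treat them as assumptions and verify against your version of KinectManager.cs:

```csharp
using UnityEngine;

// Sketch: read the head joint's position and rotation each frame.
// Method names follow the K2-asset's conventions - verify them
// against your copy of KinectManager.cs before relying on them.
public class HeadTrackingExample : MonoBehaviour
{
    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager == null || !manager.IsInitialized())
            return;

        long userId = manager.GetPrimaryUserID();
        if (userId == 0)
            return;  // no user currently tracked

        int headIndex = (int)KinectInterop.JointType.Head;
        if (manager.IsJointTracked(userId, headIndex))
        {
            Vector3 headPos = manager.GetJointPosition(userId, headIndex);
            Quaternion headRot = manager.GetJointOrientation(userId, headIndex, true);
            Debug.Log("Head position: " + headPos + ", rotation: " + headRot.eulerAngles);
        }
    }
}
```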
     
  3. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, first of all, congratulations on the bold attempt to write a gesture-detection routine! There still aren't many who dare to do it :) Here are some thoughts about the code:
    1. PoseCompleteDuration is usually used to check whether the user stays in a pose (raised hand, T-pose, etc.) long enough, i.e. that it was not just an accidental hand raise. So, this constant is not related to your case. You invoked CheckPoseComplete() correctly.
    2. The if() in case0 should check the same as in case1, i.e. 'jointsTracked[rightKneeIndex] && jointsTracked[rightHipIndex] && Mathf.Abs(jointsPos[rightKneeIndex].y - jointsPos[rightHipIndex].y) < 0.2f', as to me. Why do you sum up the Y-positions and compare them to 1.8f?!
    3. Why are these gestures separated as left and right? I mean, I cannot imagine how anyone could run with one leg only. If you look at case1 more carefully, you will see that just staying with one knee raised around hip level triggers the gesture.

    As to me, you should add one more state and check the same for the left leg, then change the state back to 1 to check the right leg again, and so on, until the user stops running. Also, this should be a continuous gesture, i.e. only its progress should change (for instance 0.7f in case1 and 0.8f in case2) while the user runs. In this case, the invocation of CheckPoseComplete() is no longer needed, because the gesture never completes. The only important thing is whether it still continues or not. I mean, instead of CheckPoseComplete(), something like this:
    Code (CSharp):
    // report progress 70%, go to state 2
    gestureData.timestamp = timestamp;
    gestureData.progress = 0.7f;
    gestureData.state = 2;
     
    superaldo666 likes this.
  4. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    No, doesn't seem to work... Should I try without the rigidbody?

    Any suggestions on detecting staying in squat?

    By K2, do you mean the Kinect version for Xbox One? Unfortunately I have no idea where I could get an adapter to use it on a PC, so that option is out... That said, I got the idea of using the shoulder rotation like in your example instead.

    I did make one myself (I used lowering the leg instead of raising it, because I thought it made more sense; I based it on SwipeDown), but I used a float to track which leg was being used, and later in the listener I only move the character on the next step if the next leg used is the other one, or if there was a 3-second wait.
    That said, there is a small delay between the detection of steps. Is that normal, or is there a problem with my gesture?
     
  5. superaldo666

    superaldo666

    Joined:
    Aug 30, 2013
    Posts:
    17
    Thx Roumenf,

    I only just read your reply, because I had a family accident. I tried your suggestion but it does not work for me. Can you look at my code and help me? I just want the run to work naturally. Thank you in advance.

    Code (CSharp):
            case Gestures.Run:
                switch(gestureData.state)
                {
                case 0:  // gesture detection - phase 1
                    if(jointsTracked[rightKneeIndex] && jointsTracked[rightHipIndex] &&
                        Mathf.Abs(jointsPos[rightKneeIndex].y - jointsPos[rightHipIndex].y) < 0.2f)
                    {
                        SetGestureJoint(ref gestureData, timestamp, rightKneeIndex, jointsPos[rightKneeIndex]);
                        gestureData.progress = 0.3f;
                    }
                    else if(jointsTracked[leftKneeIndex] && jointsTracked[leftHipIndex] &&
                        (jointsPos[leftKneeIndex].y + jointsPos[leftHipIndex].y) > 0.2f)
                    {
                        SetGestureJoint(ref gestureData, timestamp, rightKneeIndex, jointsPos[leftKneeIndex]);
                        gestureData.progress = 0.3f;
                    }
                    break;

                case 1:  // gesture complete
                    if((timestamp - gestureData.timestamp) < 1.5f)
                    {
                        bool isInPose = jointsTracked[rightKneeIndex] && jointsTracked[rightHipIndex] &&
                            (jointsPos[rightKneeIndex].y - jointsPos[rightHipIndex].y) > 0.2f &&
                            jointsTracked[leftKneeIndex] && jointsTracked[leftHipIndex] &&
                            Mathf.Abs(jointsPos[leftKneeIndex].y - jointsPos[leftHipIndex].y) < 0.2f;

                        if(isInPose)
                        {
                            gestureData.timestamp = timestamp;
                            gestureData.progress = 0.7f;
                            gestureData.state = 2;
                        }
                        else
                        {
                            // cancel the gesture
                            SetGestureCancelled(ref gestureData);
                        }
                        break;
                    }
                    break;

                case 2:  // gesture complete
                    if((timestamp - gestureData.timestamp) < 1.5f)
                    {
                        bool isInPose = jointsTracked[rightKneeIndex] && jointsTracked[rightHipIndex] &&
                            (jointsPos[rightKneeIndex].y + jointsPos[rightHipIndex].y) > 0.2f &&
                            jointsTracked[leftKneeIndex] && jointsTracked[leftHipIndex] &&
                            (jointsPos[leftKneeIndex].y + jointsPos[leftHipIndex].y) > 0.2f;

                        if(isInPose)
                        {
                            Vector3 jointPos = jointsPos[gestureData.joint];
                            CheckPoseComplete(ref gestureData, timestamp, jointPos, isInPose, 0f);
                        }
                        else
                        {
                            // cancel the gesture
                            SetGestureCancelled(ref gestureData);
                        }
                        break;
                    }
                    break;
                }
                break;
     
    Last edited: Oct 2, 2015
  6. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    Me again, sorry I'm bothering you so much, my finals paper is supposed to be about this but I feel like you are doing a lot of the work...
    Well, besides my problems with the squatting position (the avatar floating and needing to change the gesture detection so it detects if the player stays squatted), I'm having another, unrelated problem...

    I put the camera inside the avatar, so that when I move it the camera goes with it (I was trying to use a script for this earlier, but this is much better), but... the avatar goes forward, sideways and all, and when I rotate it the camera rotates and the commands to move forward send it forward... but graphics-wise it does not rotate. Any idea what I need to do to fix this?
     
  7. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi superaldo, sorry for the delay. Here is my suggestion. I preferred to compare one knee with the other, instead of with the hips. Please ask, if you don't understand anything.

    Code (CSharp):
            case Gestures.Run:
                switch(gestureData.state)
                {
                case 0:  // gesture detection - phase 1
                    // check if the left knee is up
                    if(jointsTracked[leftKneeIndex] && jointsTracked[rightKneeIndex] &&
                        (jointsPos[leftKneeIndex].y - jointsPos[rightKneeIndex].y) > 0.1f)
                    {
                        SetGestureJoint(ref gestureData, timestamp, leftKneeIndex, jointsPos[leftKneeIndex]);
                        gestureData.progress = 0.3f;
                    }
                    break;

                case 1:  // gesture continues
                    if((timestamp - gestureData.timestamp) < 1.0f)
                    {
                        // check if the right knee is up
                        bool isInPose = jointsTracked[rightKneeIndex] && jointsTracked[leftKneeIndex] &&
                            (jointsPos[rightKneeIndex].y - jointsPos[leftKneeIndex].y) > 0.1f;

                        if(isInPose)
                        {
                            // go to state 2
                            gestureData.timestamp = timestamp;
                            gestureData.progress = 0.7f;
                            gestureData.state = 2;
                        }
                    }
                    else
                    {
                        // cancel the gesture
                        SetGestureCancelled(ref gestureData);
                    }
                    break;

                case 2:  // gesture continues
                    if((timestamp - gestureData.timestamp) < 1.0f)
                    {
                        // check if the left knee is up again
                        bool isInPose = jointsTracked[leftKneeIndex] && jointsTracked[rightKneeIndex] &&
                            (jointsPos[leftKneeIndex].y - jointsPos[rightKneeIndex].y) > 0.1f;

                        if(isInPose)
                        {
                            // go back to state 1
                            gestureData.timestamp = timestamp;
                            gestureData.progress = 0.8f;
                            gestureData.state = 1;
                        }
                    }
                    else
                    {
                        // cancel the gesture
                        SetGestureCancelled(ref gestureData);
                    }
                    break;
                }
                break;
     
    superaldo666 likes this.
  8. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I don't really understand how you've put the camera inside the avatar. If I were you, I would parent it to an avatar's joint. For instance to the neck or head. Then it will move and rotate together with its parent joint.

    Look, "Kinect with MS-SDK" is a free, open-source asset and I have no obligation to provide any support to its users. Despite this, I do it when I have some free minutes, hence can't answer more than 1-2 questions per week. The full source is there. Just try to research it a bit, in order to resolve the issues you have.
     
  9. superaldo666

    superaldo666

    Joined:
    Aug 30, 2013
    Posts:
    17
    Thanks Roumenf, now I understand better
     
    roumenf likes this.
  10. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I forgot to mention, this Run-gesture is continuous, i.e. it continues while you're running. In the gesture listener, don't process it in GestureComplete(), but in GestureInProgress() when (progress > 0.5f). The progress is greater than 0.5 only when the gesture is in states 1 (progress = 0.7f) or 2 (progress = 0.8f).
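    To make the listener side concrete, here is a minimal sketch of a gesture listener that handles the continuous Run-gesture in GestureInProgress(). The interface and signatures follow the K1-asset's KinectGestures.GestureListenerInterface as I remember them, and Gestures.Run assumes the custom gesture was added to the Gestures-enum as in the code above; verify both against your copy of KinectScripts/KinectGestures.cs:

```csharp
using UnityEngine;

// Listener sketch for the continuous Run-gesture. Signatures follow
// the K1-asset's GestureListenerInterface - verify them against your
// copy of KinectScripts/KinectGestures.cs.
public class RunGestureListener : MonoBehaviour, KinectGestures.GestureListenerInterface
{
    public bool IsRunning { get; private set; }

    public void UserDetected(uint userId, int userIndex)
    {
        // start tracking the Run-gesture for the detected user
        KinectManager.Instance.DetectGesture(userId, KinectGestures.Gestures.Run);
    }

    public void UserLost(uint userId, int userIndex)
    {
        IsRunning = false;
    }

    public void GestureInProgress(uint userId, int userIndex, KinectGestures.Gestures gesture,
        float progress, KinectWrapper.NuiSkeletonPositionIndex joint, Vector3 screenPos)
    {
        // the Run-gesture alternates between states 1 and 2 while the user
        // runs, reporting progress 0.7f and 0.8f; it never completes
        if (gesture == KinectGestures.Gestures.Run)
        {
            IsRunning = (progress > 0.5f);
        }
    }

    public bool GestureCompleted(uint userId, int userIndex, KinectGestures.Gestures gesture,
        KinectWrapper.NuiSkeletonPositionIndex joint, Vector3 screenPos)
    {
        return true;  // not invoked for a continuous gesture
    }

    public bool GestureCancelled(uint userId, int userIndex, KinectGestures.Gestures gesture,
        KinectWrapper.NuiSkeletonPositionIndex joint)
    {
        IsRunning = false;  // the user stopped running, or tracking was lost
        return true;
    }
}
```

    Other scripts (for instance the avatar movement) can then simply poll the IsRunning-flag each frame.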
     
  11. Anagashi

    Anagashi

    Joined:
    Dec 3, 2012
    Posts:
    9
    HI there,

    Thank you for allowing your plugin to be free as well, it helps a lot.

    I'm just wondering: is it possible to display what the Kinect is capturing, like a normal video (not show the user only)?

    Just need some help on this part, please reply soon.

    Please and thank you.
     
  12. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Enable the 'Compute color map' and 'Display color map'-settings of the KinectManager-component, and optionally disable 'Display user map'.
     
  13. Whitt1985

    Whitt1985

    Joined:
    Aug 2, 2015
    Posts:
    10
    Hi mate, any more updates on the 'NuiInitialize Failed'-error at KinectManager.Awake()? I read on your page that it's not a Unity error, but rather a Microsoft issue. I've tried both Unity 5 and 4.6, using Windows 10 and a USB 2.0 socket. Have you heard of any other workarounds at all? Thanks for bringing the asset to the store, it's very easy to set up. I just wish the Microsoft drivers wouldn't crash now!
     
  14. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I post all updates on this issue here: http://rfilkov.com/2013/12/16/kinect-with-ms-sdk/#ki As you can see, the last workaround suggestion was to reinstall the Kinect SDK and then restart the machine. Doesn't this work in your case?
     
  15. Whitt1985

    Whitt1985

    Joined:
    Aug 2, 2015
    Posts:
    10
    I'll give it a try tonight and report back. The problem has also happened while using the FAAST and VRPN frameworks (it runs for a minute or two, then stops and freezes). It's part of the reason I found your asset, as I thought it was that program causing the problem. It is clear now that it's more than that; at least this way I can see the error though! I'm using an Xbox 360 Kinect with a separate power adaptor, if that makes any difference.

    Thanks James
     
  16. Anagashi

    Anagashi

    Joined:
    Dec 3, 2012
    Posts:
    9
    Again roumenf, thanks for the tips. Now I wonder, is there any way to match the Kinect camera with the normal camera?

    Edit: Problem solved, sorry for interrupting.
     
    Last edited: Oct 22, 2015
  17. Whitt1985

    Whitt1985

    Joined:
    Aug 2, 2015
    Posts:
    10

    Hi mate, I gave this a try last night (reinstall and restart) and still no joy. I even checked that it wasn't my programming and booted up one of your examples; it still crashes after 2 minutes or so. I'm going to keep tinkering and see if things like the antivirus are causing problems. I've also got a Windows 7 laptop to try as well. If anyone has had similar problems with Windows 10 and the Kinect for Xbox 360 and found any solutions, it would be most appreciated!
     
  18. Anagashi

    Anagashi

    Joined:
    Dec 3, 2012
    Posts:
    9
    I just have a small request to the community: is there any explanation or tutorial on how the gestures work?

    I read and looked at the demo scene and still don't really understand.

    I hope the community can help a newbie here with little experience with Kinect.

    Please and thank you.

    I figured out how it works now. Sorry for interrupting.
     
    Last edited: Oct 23, 2015
    roumenf likes this.
  19. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    I figured out what my problems were. The not-turning was in the AvatarController (I had to change how the bones are redrawn), and the other was because the collider I was using was on the complete model and not on the joints, so it wasn't moving when the avatar was... and since colliders are invisible in-game, I didn't notice.

    I know, and I appreciate it. Sorry if I sounded like I expected you to do it, it is just that as I said I'm basing my finals paper on this project but neither I nor my advising professor knew much about unity or kinect itself, we are learning as we work, but you had the experience to make this pack so I hoped you could help. Sorry if it looked like I was pushing all the work on top of you.

    That said, I'm having a bit of trouble making both the KinectManager and the FacetrackingManager work at the same time. In test mode the avatar doesn't seem to follow my moves most of the time (unless I start with facetracking deactivated and turn it back on during the test), while in a built game the facetracking doesn't seem to want to work. In the future I'll need both, plus the InteractionManager, working at once (since the InteractionManager detects the "gripping hand" gesture). Is it a hardware problem on my end, a Kinect limitation, or just a problem I can deal with through the proper code alterations?
     
  20. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I need to look at the project, in order to understand what went wrong. If you like (and if you may), zip your project and send it to me via WeTransfer, so I could take a look.
     
  21. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    It is sending right now. Thank you for your time.
     
  22. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
  23. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    Ah, I see! I spent a while working with just the KinectManager, I forgot to do this when I added the Facetracking...

    But now the facetracking doesn't seem to work, or at least the script to track the head direction doesn't... The only changes I made were the ones you told me about...
     
  24. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    You probably forgot to enable 'Compute color map'-setting of the KinectManager-component. The color camera stream is needed for the face-tracking to work.
     
  25. YoraeRasante

    YoraeRasante

    Joined:
    Nov 22, 2014
    Posts:
    11
    Unfortunately, no... I already enabled it...
    I'll try restarting Unity and see if it helps...

    [EDIT] It worked now. Thanks for your time.
     
    Last edited: Oct 27, 2015
    roumenf likes this.
  26. kingandroid

    kingandroid

    Joined:
    Nov 13, 2014
    Posts:
    9
    Hello, I have a trivial question: if I'm displaying the UserMap (the one with red, blue, green and magenta colors), is it possible to set the PrimaryUser color to be the only unique one (e.g. the primary user is red, the others are blue, and if the primary user changes, the colors also update)? I've searched through the code and it seems there is a correlation between the shader and bodyIndex, but I have no idea how to change it. I managed to do this for the skeleton, though.

    If anyone can help me, would be appreciated
    Thanks
     
  27. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I believe this is a Kinect v2-related question. And it is not so trivial, as it looks like. This is a feature that will come with the upcoming update of the K2-asset. The primary user will be always yellow and tracking users with their player indices will be improved. If you are in a hurry, please contact me by e-mail, mention your invoice # and ask me to send you an intermediate version of the asset.
     
  28. kingandroid

    kingandroid

    Joined:
    Nov 13, 2014
    Posts:
    9
    Hey roumenf, thanks for the reply. I have sent you the email and the invoice number
     
  29. iajbenho

    iajbenho

    Joined:
    Dec 10, 2015
    Posts:
    4
    Hi, I'm using your plugin and I'm having a problem detecting the closest user. I'm using the background removal and limit the tracked users to one, and it works fine, but when another user comes in closer to the Kinect, the background removal doesn't come off the first user. It stays on him, even if the second user is closer to the Kinect. How can I fix that?
     
  31. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    This is the expected behavior - to stick to the already found users. If you don't like this, don't limit the number of users in KM, but set the PlayerIndex-setting of BackgroundRemovalManager-component to 0 and modify the code of BackgroundRemovalManager-script a bit. Instead of 'userID = kinectManager.GetUserIdByIndex(playerIndex);', determine and use the userID of the closest user, on each Update.

    And, please don't send me messages on all possible channels. One is more than enough.
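    To sketch the suggested modification: inside BackgroundRemovalManager's Update(), the fixed-index lookup could be replaced with a closest-user search like the one below. GetAllUserIds() and GetUserPosition() are assumed KinectManager helpers; adapt the names to your version of the asset.

```csharp
// Sketch: instead of 'userID = kinectManager.GetUserIdByIndex(playerIndex);',
// pick the user closest to the sensor on each Update. GetAllUserIds() and
// GetUserPosition() are assumptions - check your KinectManager's API.
uint closestUserId = 0;
float closestZ = float.MaxValue;

foreach (uint uid in kinectManager.GetAllUserIds())
{
    // the Z component of the user position is the distance to the sensor
    Vector3 userPos = kinectManager.GetUserPosition(uid);
    if (userPos.z > 0f && userPos.z < closestZ)
    {
        closestZ = userPos.z;
        closestUserId = uid;
    }
}

if (closestUserId != 0)
{
    userID = closestUserId;  // feed the closest user to the background removal
}
```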
     
  32. Hasanddd

    Hasanddd

    Joined:
    Jan 9, 2016
    Posts:
    1
    Hi, firstly thank you for sharing. My question is:
    How can I make the character's feet play a run/walk animation infinitely, independent of the real body motion?
     
  33. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    See this tip here: http://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t11 It is for the K2-asset, but the functionality in the K1-asset is similar. What may be missing is the LateUpdateAvatars-setting, which means that UpdateAvatar()-function should be called in LateUpdate(), instead of in Update().
     
  34. griffin2000

    griffin2000

    Joined:
    Mar 14, 2013
    Posts:
    2
    Did anyone ever find a work around to this? I get the same issue on two different machines.
     
    roumenf likes this.
  35. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The best workaround so far is to use the K1-asset with Unity 4.x. The Unity support staff (in my premium support period) reported that the issue is in the Kinect driver. Microsoft doesn't provide support for this sensor any more. The only thing I could think of is to debug, locate and fix the issue myself, at binary level in the respective dll, but I'm not angry enough yet :)
     
  36. ashokbugude

    ashokbugude

    Joined:
    Feb 10, 2016
    Posts:
    20
    I am using Unity 5, Windows 7, and the Kinect v1 SDK, and every time I get the error below after a minute. Please tell me how to solve it:

    NuiInitialize Failed - Device is not genuine.
    UnityEngine.Debug:LogError(Object)
    KinectManager:Awake() (at Assets/KinectScripts/KinectManager.cs:1036)

    System.Exception: NuiInitialize Failed
    at KinectManager.Awake () [0x00023] in C:\Users\admin\unityProjects\ABC\Assets\KinectScripts\KinectManager.cs:895
    UnityEngine.Debug:LogError(Object)
    KinectManager:Awake() (at Assets/KinectScripts/KinectManager.cs:1037)
     
  37. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I already answered this question by e-mail, but here is the info for the others that might be interested: No way to solve it. It is up to Microsoft. Look here for more info: http://rfilkov.com/2013/12/16/kinect-with-ms-sdk/#ki

    What I would suggest is:
    1. Try to uninstall everything that is Kinect-related (Kinect SDK, OpenNI, NiTE, Zigfu, etc.), restart, then reinstall the Kinect SDK only.
    2. If this doesn't help, downgrade your project to Unity 4.6.x.
     
  38. Psykomusic

    Psykomusic

    Joined:
    Mar 2, 2016
    Posts:
    3
    Hi everybody,
    I tried to use Kinect with MS-SDK (this one: https://www.assetstore.unity3d.com/en/#!/content/7747), but when I tried to execute the first scene I had an error:
    NuiInitialize Failed - Device is not connected.
    UnityEngine.Debug:LogError(Object)
    KinectManager:Awake() (at Assets/KinectScripts/KinectManager.cs:1014)
    Of course my Kinect is connected, and I can see that with the Kinect Studio from Windows.
    I use Kinect v2, so maybe this is the problem. Is Kinect v2 compatible with this asset, or do I have to buy the asset for Kinect v2 (I just want to try it; $20 seems expensive for just one try)?
    If my Kinect is compatible, why do I have this error?
    Thanks all :)
     
  39. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Hi, the asset you tried is for Kinect v1 only. That's why you get this error. The other one (that costs $20) works with Kinect v2 and v1 and has more examples. The scripts and components are similar. You can also get it free, if you are a student. But if you buy it and you are not satisfied, just tell me and I'll send you back your $20 via Paypal. So, don't worry ;)
     
  40. baabee

    baabee

    Joined:
    Feb 28, 2016
    Posts:
    2
    Hi all...
    I am using the MS-SDK asset and the Unity engine with a Kinect v1. I have a main camera, 2 buttons and a cursor in my scene. The script named KinectManager and the gesture listeners are attached to the main camera. The good thing is that my hand cursor works perfectly with hand gestures, but I don't know how to make a cursor click or button click with cursor gestures, to move on to the next scene.
    Can anyone help me with this?
    My email id is: siz_baabee@live.com, if anyone can send me a sample project with just 2 scenes that switch between them on button click, or just tell me what to do...
     
  41. liquify

    liquify

    Joined:
    Dec 9, 2014
    Posts:
    187
    @roumenf : Firstly, thanks a lot for this great free plugin! I just wanna ask, is it possible to use Kinect with an external camera?

    Because the Kinect camera resolution is low, I purchased a Logitech HD webcam, to use it with the Kinect. Do you know how to replace the Kinect camera with an external webcam?
     
  42. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    The easiest way would be to enable 'Control mouse cursor'-setting of the KinectManager-component. Regarding the multi-scenes setup, please read 'Howto-use-KinectManager-across-multiple-scenes.pdf'-document in the package.
     
  43. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I suppose, just use the image from the camera's WebcamTexture instead of the Kinect camera image :) Or maybe I didn't understand what you mean...
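    As a minimal sketch of that idea, the external webcam's feed can be put on a background quad (or any renderer) via Unity's WebCamTexture. The device index is an assumption; pick your Logitech camera by name from WebCamTexture.devices if you have more than one:

```csharp
using UnityEngine;

// Sketch: display an external webcam's feed instead of the Kinect
// color image, e.g. on a background quad's material.
public class WebcamBackground : MonoBehaviour
{
    private WebCamTexture webcamTex;

    void Start()
    {
        // use the first available camera; replace the index with a
        // lookup by name if more than one device is connected
        if (WebCamTexture.devices.Length > 0)
        {
            webcamTex = new WebCamTexture(WebCamTexture.devices[0].name, 1280, 720);
            GetComponent<Renderer>().material.mainTexture = webcamTex;
            webcamTex.Play();  // start streaming the camera into the texture
        }
    }
}
```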
     
  44. liquify

    liquify

    Joined:
    Dec 9, 2014
    Posts:
    187
    Thanks, it works
     
    roumenf likes this.
  45. tomihr2

    tomihr2

    Joined:
    Oct 25, 2010
    Posts:
    29
    Hello,

    Is it possible to mount the Kinect to the ceiling, and then, using the depth data, interact with the colliders of 3D objects sitting on a 3D plane?
    I would like only a specific depth range to interact with the objects.

    Thanks
     
  46. lucasfmarin

    lucasfmarin

    Joined:
    Sep 10, 2015
    Posts:
    2
    Hello roumenf, I do not want to use the avatar, only the camera view that sits in the corner. How do I change its position, so that it is in the center of the screen?
     
  47. ashokbugude

    ashokbugude

    Joined:
    Feb 10, 2016
    Posts:
    20
    Hi

    Can I know if an application built using this package would run/work on Linux with Unity? I.e., can I do hand tracking on Linux?
    If so, what packages/software do I need to install? For example, what needs to be installed on Linux as a replacement for the 'Kinect for Windows SDK'?
    If not, is there any alternative to it?

    Thanks
     
  48. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Why not. You can convert the depth data to space points. There was a mapping function for that, as far as I remember. Then set some of these points as centers of predefined colliders. I wouldn't make them all colliders though, because: 1. there would be at least 320x240=76800 colliders (or 640x480=307200); 2. the depth within a region doesn't change so drastically, so you can take only the nearest points (with the lowest depth values) within each region.
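    The region-sampling idea above could be sketched like this. MapDepthPointToSpacePoint() is a placeholder for whatever depth-to-world mapping function your asset/SDK version exposes; all the names here are assumptions, not the asset's actual API:

```csharp
using UnityEngine;

// Sketch: sample the depth map in regions and move one pre-created
// collider to the nearest (lowest-depth) point of each region, instead
// of creating a collider per depth pixel.
public class DepthColliderGrid : MonoBehaviour
{
    public int regionSize = 32;          // depth pixels per region
    public SphereCollider[] colliders;   // one pre-created collider per region

    public void UpdateColliders(ushort[] depthMap, int width, int height)
    {
        int index = 0;
        for (int ry = 0; ry < height; ry += regionSize)
        {
            for (int rx = 0; rx < width; rx += regionSize)
            {
                // find the nearest (lowest-depth) valid point in this region
                int bestX = rx, bestY = ry;
                ushort bestDepth = ushort.MaxValue;

                for (int y = ry; y < Mathf.Min(ry + regionSize, height); y++)
                {
                    for (int x = rx; x < Mathf.Min(rx + regionSize, width); x++)
                    {
                        ushort d = depthMap[y * width + x];
                        if (d > 0 && d < bestDepth)
                        {
                            bestDepth = d;
                            bestX = x;
                            bestY = y;
                        }
                    }
                }

                // move the region's collider to the mapped world position
                if (bestDepth != ushort.MaxValue && index < colliders.Length)
                {
                    colliders[index].transform.position =
                        MapDepthPointToSpacePoint(bestX, bestY, bestDepth);
                }
                index++;
            }
        }
    }

    // placeholder - use the depth-to-space mapping provided by the asset/SDK
    Vector3 MapDepthPointToSpacePoint(int x, int y, ushort depth)
    {
        return Vector3.zero;
    }
}
```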
     
  49. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    Sorry, I couldn't understand the problem.
     
  50. roumenf

    roumenf

    Joined:
    Dec 12, 2012
    Posts:
    635
    I think you already found it - OpenNI2/NiTE2. For hand tracking only there are even better options, as to me. Look at the LeapMotion and RealSense-sensors. Both have Unity samples and multi-platform support.