
Kinect plugin

Discussion in 'Made With Unity' started by bliprob, Nov 21, 2010.

  1. ajie

    ajie

    Joined:
    Jan 7, 2011
    Posts:
    5
    Ok, I've got the plugin working. Another thing comes to mind: the plugin doesn't seem to respond to wrist, ankle, or head movement.
    My question is: is this an OpenNI feature that isn't supported yet, or does OpenNI already support it but the Unity plugin doesn't?
     
  2. colr

    colr

    Joined:
    Feb 19, 2010
    Posts:
    2
    Hi vtornik23,

    I'm trying to apply your modifications to the Kinect-Unity wrapper, but I can't get the character moving properly.

    It would be great if you could share the project.

    thank you
     
  3. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    ajie, to the best of my knowledge the position of those joints is available, but not the rotation. This is a limitation imposed by the NITE middleware, which is currently the only skeletal tracking option available within OpenNI. In theory others could write their own middleware to do this, but it's not easy.
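    Even with positions only, a rough limb orientation can be derived from the vector between two adjacent joints. A minimal sketch (the joint fields and how they are fed from the tracker are assumptions, not part of the wrapper):

    ```csharp
    using UnityEngine;

    public class BoneOrienter : MonoBehaviour
    {
        // Hypothetical world-space joint positions supplied by the tracker.
        public Vector3 elbowPos;
        public Vector3 wristPos;

        // Transform of the forearm bone to orient.
        public Transform forearm;

        void Update()
        {
            Vector3 dir = wristPos - elbowPos;
            if (dir.sqrMagnitude > 0.0001f)
            {
                // Point the bone's forward axis along the elbow-to-wrist vector.
                // Note: roll (twist) around the bone axis cannot be recovered
                // from two positions alone.
                forearm.rotation = Quaternion.LookRotation(dir);
            }
        }
    }
    ```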
     
  4. vtornik23

    vtornik23

    Joined:
    Jul 1, 2010
    Posts:
    27
    Here it is! :)

    [edited]
    I cannot upload the zip archive with the project... :confused:

    "Upload of file failed."
     
    Last edited: Jan 11, 2011
  5. colr

    colr

    Joined:
    Feb 19, 2010
    Posts:
    2
  6. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Apparently there was a bug with some of the recently released stuff on Windows. If anyone downloaded it some days ago and was suffering from 'usb device speed' errors, try getting the OpenNI stuff again, as it's been fixed now.

    I haven't seen the Unity wrapper working properly on OS X yet, so in the meantime, if anybody out there wants to go down the more primitive route of using joint positions sent from another program via OSC to Unity, note that there is now an OS X version of OSCeleton.

    https://github.com/Sensebloom/OSCeleton
     
  7. ajie

    ajie

    Joined:
    Jan 7, 2011
    Posts:
    5
    I would like to know if there's any method to handle uncaptured bone segments. I mean, when we get close to the sensor and our legs become undetectable, how do we capture such events in the wrapper?
     
  8. wimeck

    wimeck

    Joined:
    May 26, 2008
    Posts:
    50
    Wow, great work on making this plugin, and also on the clear instructions!
    Bernard, I don't understand your comment on the mirroring:

    Another improvement is to change the line <Mirror on="false"/> to <Mirror on="true"/>, so the avatar on screen actually moves right when you move right (instead of the other way around, which is counter-intuitive and not like in the Kinect games).

    I cannot find <Mirror on="false"/> in the code. I have to look in Nite.cs, right?
    Thanks!
     
  9. wimeck

    wimeck

    Joined:
    May 26, 2008
    Posts:
    50
    Ah, I found it; the setting is in the OpenNI.xml file.
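    For anyone else looking, the element sits in the OpenNI.xml configuration file; the surrounding structure is roughly like this (the nesting is from memory and may differ between OpenNI versions):

    ```xml
    <OpenNI>
      <ProductionNodes>
        <!-- "true" makes the avatar move the same direction you do,
             like in the Kinect games. -->
        <Mirror on="true"/>
        <!-- depth, image and user nodes follow -->
      </ProductionNodes>
    </OpenNI>
    ```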
     
  10. the_gnoblin

    the_gnoblin

    Joined:
    Jan 10, 2009
    Posts:
    722
    Hi,

    I have the same problem.

    I tried installing the current unstable release of OpenNI and then repeating the installation process described by bernardfrancois. Unfortunately this didn't help. I didn't find a link to an unstable release of NITE; where can I find it on the PrimeSense site? What else can I do except post a question in http://groups.google.com/group/openni-dev/topics ?

    thanks,
    Slav
     
  11. wimeck

    wimeck

    Joined:
    May 26, 2008
    Posts:
    50
    Hi Gnoblin,

    I had the same problem, but only needed to reinstall the 'unstable' OpenNI. I don't think there is an unstable version of NITE.
     
  12. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
  13. zweihander111

    zweihander111

    Joined:
    Nov 7, 2010
    Posts:
    20
    Hey guys, just got into this Kinect stuff and I really appreciate your comments and effort in this forum. Thanks a lot!
    Awesome work!
     
  14. zweihander111

    zweihander111

    Joined:
    Nov 7, 2010
    Posts:
    20
  15. the_gnoblin

    the_gnoblin

    Joined:
    Jan 10, 2009
    Posts:
    722
    Ok, I got Kinect working on another computer! ;)

    The sample project for Unity runs at 10 fps on my PC. Is this expected?
     
  16. the_gnoblin

    the_gnoblin

    Joined:
    Jan 10, 2009
    Posts:
    722
    Is it possible to somehow adjust the Kinect's rotation (tilt)? (This happens during the calibration process on the Xbox 360.)
     
  17. lyserdelic

    lyserdelic

    Joined:
    Jan 5, 2011
    Posts:
    6
    Hi,

    Can anyone help me with the OpenNI Unity wrapper and explain how I can extract the point (x, y, z) information from the usersDepthMap (in Nite.cs)?

    When I print usersHistogramMap[usersDepthMap[i]] I get values between 0.0 and 1.0. What is this? Is it the Z coordinate? How can I map it? And how can I get the other two coordinates?

    I would like to do something like a point cloud.

    Thanks in advance
     
  18. timtek

    timtek

    Joined:
    Jan 28, 2011
    Posts:
    3
  19. amir

    amir

    Joined:
    Oct 5, 2010
    Posts:
    75
    timtek: my wrapper uses magic. :)

    No, seriously: the issue with the older OpenNI wrapper is that the rendering of the depth camera image is slow.

    And my wrapper lets you use joint orientation if you want to.


    It's still a TODO for me, but I was just working on it for about 4 hours and still can't figure out how to make my Texture2D from a MapData<RGB24Pixel> or IntPtr (as returned by GetImageMapPtr()).

    I'll try again tomorrow.
     
  20. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    Hi Amir -
    Can you clarify what is different about your Unity wrapper from the one released on the OpenNI site?

    I see you have a different DLL, "OpenNI.net.dll", rather than "UnityInterface.dll", but I also see there is an OpenNI.net.dll included in the OpenNI installation. Is this the same DLL?

    The Unity wrapper released on OpenNI was written by Shlomo Zippel, copyright PrimeSense, with a GNU license. It also includes all the source code. In your release, the same Nite.cs script seems to be used initially, but with that information removed and with changes made in the Nite2 and Nite3 scripts.

    The recent questions about it running faster than the OpenNI version seem to be answered by saying that you removed the visualization of the depth image and are not using the joint rotations. Is it correct to say that the speed increase is due to removing functions that were called in the OpenNI Nite.cs script?

    Also, I am not sure whether we are calling the .cs scripts the "wrapper" or the DLL plugin the "wrapper". I have been assuming the DLL is what would be called the wrapper and the .cs script simply uses it in the application. Is this the correct description?

    The reason I am asking is that I have been working with the Unity interface DLL provided on the OpenNI site. I have been getting great results, except that, because I can look at the source code .h files, I can see that only some of the functions provided by the OpenNI and NITE SDKs are implemented in UnityInterface.dll. So I have started thinking about working with that source code and implementing other functionality, and wondering whether your DLL implements more, but of course there is no source to look at.

    thanks - ap



     
  21. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    @lyserdelic -
    In the Nite.cs script, the texture is created with an array of colors using SetPixels(usersMapColors).

    The line in the script that sets the color of each pixel looks like this:
    Code (csharp):
    Color c = new Color(usersHistogramMap[usersDepthMap[i]], usersHistogramMap[usersDepthMap[i]], usersHistogramMap[usersDepthMap[i]], 0.9f);
    So usersHistogramMap[usersDepthMap[i]] is used for R, G, and B to get a greyscale image.
    Then the script multiplies the value by a color in order to get the different colored user images.

    So you don't need to add the color; use the grayscale value as a heightmap. The resulting texture essentially gives you the X and Y coordinates, and the grayscale value of the pixel is the Z.
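    Following this explanation, a point cloud can be built by treating each pixel index as X and Y and the normalized depth value as Z. A minimal sketch (the array names follow Nite.cs, but the resolution and scale factors are assumptions):

    ```csharp
    using UnityEngine;

    public class DepthPointCloud : MonoBehaviour
    {
        // These come from the wrapper, as in Nite.cs (hypothetical wiring).
        public short[] usersDepthMap;      // raw depth value per pixel
        public float[] usersHistogramMap;  // normalized 0..1 per depth value
        public int width = 320;
        public int height = 240;
        public float depthScale = 5f;      // assumed world-space depth range

        public Vector3[] BuildPoints()
        {
            var points = new Vector3[width * height];
            for (int i = 0; i < points.Length; i++)
            {
                int x = i % width;
                int y = i / width;
                // Normalized depth (0..1) from the histogram, used as Z.
                float z = usersHistogramMap[usersDepthMap[i]];
                // Flip Y so the cloud isn't upside down in world space.
                points[i] = new Vector3(x, height - y, z * depthScale);
            }
            return points;
        }
    }
    ```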

     
    Last edited: Feb 4, 2011
  22. amir

    amir

    Joined:
    Oct 5, 2010
    Posts:
    75
    Priceap: I use a modified OpenNI.net.dll, different from the one you get with OpenNI. The source code for the .NET wrapper is in OpenNI. The changes made to the .NET wrapper are explained in this thread:

    http://groups.google.com/group/openni-dev/browse_thread/thread/5efd037ba79c91fe?pli=1

    I included auto-generated API docs on my git.

    The older UnityInterface.dll only works on Windows and is deprecated by the OpenNI.NET wrapper, because the .NET wrapper contains all the NI functionality. I based my code largely on Shlomo's code and the UserTracker.NET example.

    The speed increase is because of commenting out UpdateUserMap. I'm working on rendering the camera, depth image, and user image to a texture in a way that won't totally lag.

     
  23. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    Thanks Amir - that was very helpful. I will take a look at the OpenNI SDK source for the OpenNI.net.dll code.

    The auto-generated docs you mentioned - are these in "htmldocs"? All that is there is for xnv and xn, and the descriptions are blank.

    Will you be releasing your modifications of the OpenNI.net.dll?

    thanks -
     
  24. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    timtek:
    The Nite.cs script in the OpenNI wrapper calls UpdateUserMap() every frame in the Update function. You don't need this to happen once the user(s) are calibrated. Instead, create a boolean that is set to true while users are being calibrated and turn it off once they are calibrated. This way the user can still see their pose if you want.
    Do this and the OpenNI wrapper will run at twice the frame rate (I get 33 fps instead of 16 on my laptop).
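    The optimization described above can be sketched roughly like this inside the wrapper's update loop (only UpdateUserMap is a real name from Nite.cs; the flag and callback names are assumptions):

    ```csharp
    // Sketch of gating the expensive user-map rebuild on calibration state.
    bool calibrating = true;   // true while a user is being calibrated

    void Update()
    {
        // ... poll OpenNI and update skeleton data as usual ...

        if (calibrating)
        {
            // Expensive: rebuilds the user depth texture every frame,
            // but lets the user see their pose during calibration.
            UpdateUserMap();
        }
    }

    // Called from the wrapper's calibration callback (hypothetical name):
    void OnCalibrationEnd(bool success)
    {
        if (success)
            calibrating = false;  // stop paying for the texture rebuild
    }
    ```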

     
  25. timtek

    timtek

    Joined:
    Jan 28, 2011
    Posts:
    3
    Thanks for the explanation, guys. It helped me get my head round the script a bit more, and I was able to get my character moving around much more smoothly.
     
  26. lyserdelic

    lyserdelic

    Joined:
    Jan 5, 2011
    Posts:
    6
    Hi priceap,

    Thank you very much for your detailed explanation.

    I will try to make my point cloud and will post it afterwards.

    ;)
     
  27. amir

    amir

    Joined:
    Oct 5, 2010
    Posts:
    75
    I wasn't able to get depth and camera data into a Texture2D in Unity from OpenNI without some serious labor in plugin land.

    I don't think a for loop over usersDepthMap actually works to load up a Color[] array for SetPixels. I output random pixels from the depth and camera maps and they're always 0.

    I've been working on a couple of other methods. I've seen examples that get a pointer to the pixel buffer using GCHandle and then pass that to a plugin which fills the texture. I'm trying to get this working now.
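    The GCHandle approach described here looks roughly like this on the managed side (the native plugin name and entry point are hypothetical; only the pinning pattern itself is standard .NET interop):

    ```csharp
    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class NativeTextureFill : MonoBehaviour
    {
        // Hypothetical native entry point that writes pixels into the buffer.
        [DllImport("MyKinectPlugin")]
        private static extern void FillImageBuffer(IntPtr buffer, int length);

        public void Fill(Color32[] pixels, Texture2D tex)
        {
            // Pin the managed array so the GC can't move it while
            // native code writes into it.
            GCHandle handle = GCHandle.Alloc(pixels, GCHandleType.Pinned);
            try
            {
                FillImageBuffer(handle.AddrOfPinnedObject(), pixels.Length);
            }
            finally
            {
                handle.Free();  // always unpin, even if the call throws
            }
            tex.SetPixels32(pixels);  // SetPixels32 is cheaper than SetPixels
            tex.Apply();
        }
    }
    ```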
     
  28. diabloroxx

    diabloroxx

    Joined:
    Jan 20, 2010
    Posts:
    71
    I am still wondering whether we can OFFICIALLY launch Unity games with Kinect support, along with the great deal of libraries and drivers that go with it.
     
  29. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    Can you share the project file? I am new to Unity and I am trying to build something like this.
     
  30. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    @vtornik23 can you share the source code?
     
  31. diabloroxx

    diabloroxx

    Joined:
    Jan 20, 2010
    Posts:
    71
  32. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    Animation is not smooth with that wrapper, and the rotation is not good.
    Anything that can improve the animation would be great. I am looking to get something like the Sinbad sample.
     
  33. amir

    amir

    Joined:
    Oct 5, 2010
    Posts:
    75
    Just get rid of the lerp.

    I can't get the depth/camera/IR 640x480 images from the OpenNI maps into textures at a high frame rate using SetPixels and Apply. I've also tried the route of a plugin using glBindTexture and glTexSubImage2D to draw based on the Texture2D's NativeTextureID.

    My tests indicate that I'm able to do for loops and process the data into buffers very quickly, but SetPixels and Apply are the slow parts.

    Any advice for getting the data into 2-D textures in Unity?
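    For context, the "lerp" presumably refers to the interpolation used to smooth joint rotations in the wrapper; a typical pattern and its removal look like this (the method and variable names are hypothetical):

    ```csharp
    using UnityEngine;

    public class JointApplier : MonoBehaviour
    {
        void UpdateJoint(Transform bone, Quaternion tracked)
        {
            // Smoothed version (what a wrapper typically does); the Slerp
            // adds visible latency between your movement and the avatar's:
            // bone.rotation = Quaternion.Slerp(bone.rotation, tracked, 0.5f);

            // "Get rid of the lerp": apply the tracked rotation directly,
            // trading smoothness for responsiveness.
            bone.rotation = tracked;
        }
    }
    ```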
     
  34. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    What do you mean by lerp?
     
  35. Daniro

    Daniro

    Joined:
    Jan 12, 2011
    Posts:
    3
    Hi all,

    Here is a video of a project that combines the Kinect, Vuzix iWear and
    Unity to create an immersive Virtual Reality experience :
    http://www.youtube.com/watch?v=L6C01tyrhf0.
    The user takes on the role of Superman, which solves the movement
    problem, since he can fly, and besides that gives the user cool
    superpowers.

    It was a project of three MSc. AI students at the University of
    Amsterdam. We used the OpenNI framework and the NITE wrapper for Unity
    to recognise the gestures that allow for the interaction with the
    digital environment. Our focus was on the recognition of gestures and
    the interaction with the world, so we just downloaded all the models
    from internet (http://artist-3d.com/, specifically).

    The Kinect tracks the body movement, the Vuzix iWear 920 tracks the
    head movement and provides a stereoscopic view (through iZ3D drivers,
    which nest themselves in several video programs, so be careful).

    We will publish the code when we get a 'go' from our advisors (if
    anyone is interested). We enjoyed making it and hope you enjoy
    watching the video.

    Kind regards,

    Daniël Karavolos
     
  36. diabloroxx

    diabloroxx

    Joined:
    Jan 20, 2010
    Posts:
    71
    Wow, that is really cool. I am looking to make my Master's project using Unity + Kinect as well, but I am tied down by the graphics, as it is an individual project and I am not that good at artwork. I will have to grab some off the internet and design the level of the project. Your video offered lots of ideas which can go into my work.
     
  37. Artknyazev

    Artknyazev

    Joined:
    Feb 1, 2011
    Posts:
    2
    Great!! We'll be waiting for the code :) I hope it will be soon!
     
  38. Daniro

    Daniro

    Joined:
    Jan 12, 2011
    Posts:
    3
    I am glad to hear that this inspired you. We have decided to upload our full project, including assets; maybe that will give you even more ideas. I can recommend the site we used for assets - you can really find a lot there. I hope your project results in a cool hack. Be sure to post it here when you're done! :D
     
  39. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    By when do you expect to get the go ahead for the source code?
     
  40. diabloroxx

    diabloroxx

    Joined:
    Jan 20, 2010
    Posts:
    71
    Yeah, please suggest a few sites. Although Google has been my guide until now, I would always appreciate all the help I can get. I need to hack the project quite a bit to make sufficient changes to the way the Unity wrapper works so that it appears different (with lots of help, of course). I am coming up with a game design document along with a storyboard for the project.
     
  41. Todilo

    Todilo

    Joined:
    Feb 1, 2011
    Posts:
    88
    Does anyone have any tips if I want to develop some kind of gesture recognition system using the Unity wrapper, like waving a letter with your hand?
     
  42. RIkki

    RIkki

    Joined:
    Feb 8, 2011
    Posts:
    8
    How do you keep a person grounded with Kinect?
     
  43. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    The Unity wrapper has already implemented the "pause pose" recognition in the NITE SDK. Once you have the Kinect calibrated to a user, aim both arms forward and then cross them so your left hand is on the right and your right hand is on the left. You will see the debug log print out a progress counter and then announce that the pause pose is complete.

    A simple pose gesture like this could be recognized by watching the local position of the hands relative to the torso, or watching rotation values of the arms and elbows, etc., perhaps with a range threshold and a timer to check whether the pose is held for a certain duration.

    To recognize gestures in motion, I have been recording the rotation values of the joints into an array for a number of individual poses, and then watch if the skeleton moves through those poses in the same sequence to trigger a gesture.

    There are certainly more sophisticated ways to do it, but the use of a pose or motion "library" used as a lookup is a basis for many solutions.
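    The pose-threshold-plus-timer idea above can be sketched like this (the joint inputs, threshold, and duration are hypothetical values, not from the wrapper):

    ```csharp
    using UnityEngine;

    public class HandsUpPose : MonoBehaviour
    {
        // Hypothetical joint positions supplied by the tracker each frame.
        public Vector3 leftHand, rightHand, torso;

        public float heightThreshold = 0.4f; // meters above torso (assumed)
        public float holdDuration = 1.0f;    // seconds the pose must be held

        float heldTime;

        void Update()
        {
            bool inPose = leftHand.y > torso.y + heightThreshold
                       && rightHand.y > torso.y + heightThreshold;

            // Require the pose to be held so brief passes don't trigger it.
            heldTime = inPose ? heldTime + Time.deltaTime : 0f;

            if (heldTime >= holdDuration)
            {
                Debug.Log("Hands-up pose recognized");
                heldTime = 0f; // reset so it fires once per hold
            }
        }
    }
    ```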

     
  44. priceap

    priceap

    Joined:
    Apr 18, 2009
    Posts:
    274
    Keeping the feet of a character locked to the ground when the hip position is "unknown" is often solved with inverse kinematics, but that is probably not the solution here.

    One starting approach could be to always assume that the lower of the two feet is on the ground (since a person cannot lift both feet off the ground at the same time). Then always adjust the height of the skeleton's hip root with an offset to keep that foot on the ground plane. It would probably require some smoothing/filtering and thresholds for deciding which foot is lower when both are at nearly the same height (both on the ground).

    Edit: Of course, also something to account for jumping - like a velocity threshold, etc.
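    The lowest-foot approach above can be sketched as follows (the joint inputs and the smoothing factor are assumptions; a jump check would still need to be added):

    ```csharp
    using UnityEngine;

    public class GroundClamp : MonoBehaviour
    {
        // Hypothetical tracked joint positions for this frame.
        public Vector3 leftFoot, rightFoot;
        public Transform hipRoot;        // skeleton root to offset
        public float groundY = 0f;       // world height of the floor
        public float smoothing = 10f;    // assumed filter strength

        void LateUpdate()
        {
            // Assume the lower foot is the one on the ground.
            float lowestFootY = Mathf.Min(leftFoot.y, rightFoot.y);

            // Vertical offset that would place that foot on the ground plane.
            float offset = groundY - lowestFootY;

            // Smooth the correction to avoid popping when the lower
            // foot switches sides.
            Vector3 target = hipRoot.position + new Vector3(0f, offset, 0f);
            hipRoot.position = Vector3.Lerp(hipRoot.position, target,
                                            smoothing * Time.deltaTime);
        }
    }
    ```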
     
    Last edited: Feb 13, 2011
  45. zweihander111

    zweihander111

    Joined:
    Nov 7, 2010
    Posts:
    20
    Hey diabloroxx, I can help you with the art stuff if you're interested. I know nothing of programming but would like to learn how to implement Kinect. Just contact me!
     
  46. diabloroxx

    diabloroxx

    Joined:
    Jan 20, 2010
    Posts:
    71
    Hey, thanks a lot for the help - I will appreciate that. For now, I have just gotten my computer working with Kinect using the Unity wrapper example. I will be coming up with the design in a few days. I apologize to the author for hijacking his thread.
     
  47. zweihander111

    zweihander111

    Joined:
    Nov 7, 2010
    Posts:
    20
    Excellent, just let me know.
    Any chance you could post the direct links where you downloaded NITE and the sensor plugins? I can't run the demo and want to reinstall them to see if the problem is fixed...
    Thanks in advance!
     
  48. dart

    dart

    Joined:
    Jan 9, 2010
    Posts:
    211
  49. ModesttreeMedia

    ModesttreeMedia

    Joined:
    Nov 16, 2010
    Posts:
    10
    Hey Priceap,

    where are you getting the rotation value data from to record?

    Thanks
     
  50. Artknyazev

    Artknyazev

    Joined:
    Feb 1, 2011
    Posts:
    2
    Hi, Daniro!
    Do you have any positive news about your project?