
Full Body Immersion

Discussion in 'AR/VR (XR) Discussion' started by Partel-Lang, Jan 6, 2015.

  1. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    I'm working on some OVR demos to be added to Final IK and it would be nice to get some feedback on the way..

    Here is a very basic demo (Win, DK2), mapping a standing animated 3D character to the positional tracking of DK2, using Full Body Biped IK.

    Don't forget to look down. ;)

    I'm planning to add movement, interactions and see if it makes any sense to aim weapons like that later...

    Cheers,
    Pärtel
     
  2. Thomas-Pasieka

    Thomas-Pasieka

    Joined:
    Sep 19, 2005
    Posts:
    2,174
    This is rather cool, mate! Works pretty well for me. Actually rather immersive as the knees bend when I bend mine! Good job so far. Let's see more :)
     
  3. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Thanks for the feedback, Thomas! :)

    I've been working on some interactions.

    It's quite convenient, I'd say; I wish I could operate stuff by looking at it in real life. :)

    Note that when you have picked up the phone, you can still keep interacting with other stuff with the left hand.

    Cheers,
    Pärtel
     
  4. Binary42

    Binary42

    Joined:
    Aug 15, 2013
    Posts:
    207
    Holy Moly, take my money! (Ok, you do sales, do you? ;))

    Is it possible to do IK from any body point or just the joints?
    I'm asking because this would open up the whole universe of tracking device options like the Sixense STEM or the MYO.
    Even two MYOs + Rift would then be enough to do full body IK.

    Edit: ye okay ... who cares about feet. ;)
     
  5. sipon

    sipon

    Joined:
    Feb 8, 2009
    Posts:
    143
    wow ! great demos, can't wait to see more !
     
  6. zipper

    zipper

    Joined:
    Jun 12, 2011
    Posts:
    89
    Hi Partel,

    I am very interested in testing your code. This is exactly what i purchased FinalIK for!! I am mainly interested in mapping a standing animated 3D character to the positional tracking of DK2.

    Is that solution something you could share here?

    Best,
    zipper
     
  7. Pascal-Serrarens

    Pascal-Serrarens

    Joined:
    Jan 14, 2013
    Posts:
    40
    Really wonderful! And I must say that your IK is better than what I use in my asset InstantVR (of course :)). I would stay away from using animations on the avatars that are not derived from the real body movements. It may decrease immersion.
    Be careful, this will drag you into something very exciting and time-consuming!

    To Binary42: As far as I know, MYOs won't do the job. They can detect gestures, but do not have positional or rotational information about the hands/arms. Currently, the best option is the Razer Hydra, and yes: I'm really looking forward to STEM!
     
  8. Binary42

    Binary42

    Joined:
    Aug 15, 2013
    Posts:
    207
    Pascal, yes, it has positional and rotational data for the lower arm, that's why I'm interested in whether the IK would work on bones.
    Edit: As Pascal pointed out via PM, there is indeed no world position from the Myo. Thanks!
    Basically, if you don't need gestures, you can strap a mobile phone to your arm as well. ;)
     
    Last edited: Jan 25, 2015
  9. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi, sorry for the delay,

    It's basically possible to add any number of controllers to any body part. Perhaps not "out of the box", but essentially it's possible. I'm also hoping STEM, Perception Neuron and other controllers will not take much longer. Can't wait to prove my point. :)

    Hi, Zipper,
    I had to change some things in FinalIK to make that possible, mostly the Interaction System. It will be included in FinalIK 0.5 that I was planning to upload as soon as I'm done with this VR stuff. But if you send me your invoice nr in a private conversation, I will send you what I have right away.

    Thanks! About the idle animation, I agree the current one has too much motion, but without any animation the dude would be just static, which would also decrease immersion. Actually, a very interesting thing I noticed: when you have the headset on long enough and you keep looking at your virtual self, you'll start to unknowingly mimic the animation, which makes it feel quite cool. It might be just me, but it's something I noticed... interesting... :)

    It is time-consuming alright, but as you said, also very interesting. :)
    At the moment I'm working on aiming weapons. There are a couple of solutions: one is basically having the gun "parented" to the headset and aiming with your head, which is quite accurate and comfortable. Another way is to aim the gun with the mouse; then you will actually have to move your head if you wish to look through the scope. It's not that quick and comfortable, but neither is aiming real rifles. It feels almost like the real thing though. :)
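    Not from the demo itself, just a minimal sketch of the first option ("gun parented to the headset") with a hypothetical component and field names; it simply keeps a gun transform aligned with the head camera:

    Code (CSharp):
    using UnityEngine;

    // Hypothetical sketch: keep the gun aligned with the head camera so it
    // always points where the player is looking.
    public class HeadAimedGun : MonoBehaviour
    {
        public Transform headCamera; // e.g. the centre eye anchor of the VR rig
        public Vector3 localOffset = new Vector3(0.15f, -0.2f, 0.4f);

        void LateUpdate()
        {
            // Follow the head with a fixed offset instead of true parenting,
            // so the gun can still be swapped or animated independently.
            transform.rotation = headCamera.rotation;
            transform.position = headCamera.TransformPoint(localOffset);
        }
    }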

    I'm worried about health risks though: you'll have to keep one eye closed for aiming while the other is exposed to the Rift, and that can't be good for long periods of time. It feels worse afterwards than normal Oculus use does. As the rifle is always very close to the camera, it's also very important to have the headset properly calibrated.

    But I'll post some aiming demos soon. :)

    Cheers,
    Pärtel
     
  10. Binary42

    Binary42

    Joined:
    Aug 15, 2013
    Posts:
    207
    That will make things A LOT easier :D
     
  11. Pascal-Serrarens

    Pascal-Serrarens

    Joined:
    Jan 14, 2013
    Posts:
    40
    May be an interesting way to learn how to dance :)
    But seriously: the physiological effects of immersion are one of the most interesting things about VR development. The most interesting thing I have now is virtual weight. I have an experimental implementation which calculates the physics effect of heavy objects on your body movements. If you handle a heavy sledgehammer, for example, you won't be able to lift it above your head easily. So you see the hammer acting like it is heavy, you know it is heavy and somehow... you start to feel that it is heavy, your arms are getting tired from lifting it. Sort of a reverse Occam's Razor. Magical.

    With the gun, I have tried your options too, but nothing beats actual hand tracking. Having a Razer Hydra following the hand (and gun), pulling the trigger. Unfortunately not everyone has one, so I have a fall-back mechanism which uses mouse movements (or right stick on Xbox controller if they choose to use that) for aiming.

    Never did close one eye like you said, maybe because I have no experience with real rifles and how to use them?

    Pascal.
     
  12. timmerish

    timmerish

    Joined:
    Feb 6, 2014
    Posts:
    5
    Very cool! Would be great if it had improved support for sitting down, since that's what most people will be doing in VR. I did see that simply sitting causes the avatar to move into a seated position, so it's already part way there. A few things that would make it even better:

    1. Being able to recenter to a seated rather than standing position.

    2. When seated, having the hips be glued to the seat rather than swaying.

    3. A general reduction or elimination of swaying when seated. When I sat down and remained motionless, the avatar was moving quite a lot.

    4. An option to have the hands on a table or on a controller instead of by the side?

    I'm sure people could do most of this themselves using Final IK (which I've purchased already), but it would be useful to have it as a built in demo like you've done with various other aspects of Final IK.

    When do you expect to release this?
     
  13. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Indeed, VR sure opens new doors for psychologists and neuroscientists; I bet there will be a bunch of papers written soon. :) There are a number of phenomena regarding weight perception and estimation, such as the Charpentier illusion; I wonder how they apply to VR..

    Hand controllers of course are great, but as you said, not many people have them yet, so I'll be concentrating on just the head at this time.

    I have one more cool idea for aiming guns.. If you take off the headset, turn it sideways, cover one of the lenses, and look at the other from a short distance, it's exactly like looking through the scope of a sniper rifle. Even the lens distortion adds to the effect. Makes me want to attach a trigger to the headset. :D

    In a similar fashion, why not use the headset as a steering wheel for a driving game if you don't have one? ;) It's shaped almost like an F1 steering wheel anyway.

    Hi, I can make a sitting down demo, no problem, it's just attaching the body/thigh effectors to the seat.
    Also the hands can be pinned to anything you need as you can see from the interaction demo.
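    Not Pärtel's demo code, just a minimal sketch of what attaching the body effector to a seat could look like, assuming Final IK's FullBodyBipedIK and the RootMotion.FinalIK namespace; the seat target and component name are hypothetical:

    Code (CSharp):
    using UnityEngine;
    using RootMotion.FinalIK; // assumes Final IK is imported

    // Hypothetical sketch: pin the body (hip) effector to a seat target
    // so the character sits while the head still follows the HMD.
    public class SitOnSeat : MonoBehaviour
    {
        public FullBodyBipedIK ik;
        public Transform seatTarget; // an empty transform placed on the chair
        [Range(0f, 1f)] public float weight = 1f;

        void Update()
        {
            // The values persist until the solver runs in LateUpdate.
            ik.solver.bodyEffector.position = seatTarget.position;
            ik.solver.bodyEffector.positionWeight = weight;
        }
    }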
    I don't know yet how long it will take; it's a lot of inventing. But I plan to do weapon aiming and movement before it's finished. Then, once it's clear the main package won't need any more changes to support VR, I'll pack it up and upload it to the Store. The VR demos will probably be a separate (free) package because I don't want to add the OVR assets and plugins, as they are updated by Oculus.

    Cheers,
    Pärtel
     
  14. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi all,

    I got a shooting demo almost ready, have a look. :)
    I went for the "simulator mode", where you aim the gun with your mouse and if you want to take an accurate shot, move your head and close the other eye so you can see down the sight as you would with a real rifle. I think it's more fun, challenging and realistic than just having the gun parented to the head.

    I actually had to configure the dioptre of the gun model for precision. S*** is getting real! :)

    If you are left-handed like myself, hit "H" to switch.

    You can press "R" any time to recenter. You better sit/stand straight while re-centering so you can move your head down a bit to take a shot.

    And of course you can shoot with LMB.

    You can also move around with WASD to see what you hit, but the legs are not moving along yet, I'll address that later.

    Cheers,
    Pärtel
     
  15. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Updated the build.

    Now you can walk around with the feet moving along and do ratchet rotations with Q/E. I also limited the up/down aim angle and added some motion to the gun when walking, so it doesn't feel magically locked to the universe. :)

    Cheers,
    Pärtel
     
    Thomas-Pasieka likes this.
  16. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Another update.

    Made some things work better and now you can hold RMB to aim with your head.
     
    Last edited: Jan 30, 2015
    kurylo3d likes this.
  17. kurylo3d

    kurylo3d

    Joined:
    Nov 7, 2009
    Posts:
    1,123
    do you plan on releasing this in any way for people to use with their Oculus VR projects?
     
  18. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi, yes it will be a free add-on to Final IK.
     
    knowledgehammer and kurylo3d like this.
  19. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Its done! :)

    You can download the OVR demos from here:

    Unity 4
    Unity 5

    They work with the new Final IK 0.5

    Cheers,
    Pärtel
     
  20. kurylo3d

    kurylo3d

    Joined:
    Nov 7, 2009
    Posts:
    1,123
    @partel ..do you have any problem with people using this for their commercial projects, whether freelance or other?
     
  21. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    7,462
    I keep meaning to look at this but never can seem to find the time.

    Sub'd for reminders.
     
  22. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    No problem, consider this a part of Final IK, so you can use it like any other Asset Store asset.

    Yeah, it's quite a bit of work to set up the Rift for just a quick look. I hope they reduce the number of wires on the consumer version :)
     
  23. kurylo3d

    kurylo3d

    Joined:
    Nov 7, 2009
    Posts:
    1,123
    awesome! thanks.
     
  24. MS80

    MS80

    Joined:
    Mar 7, 2014
    Posts:
    346
    This is just great!
    Tried your last build (shooter) with dk2, I was surprised instantly, this is the real S***!!!
    Final IK is on my wishlist now.

    ps: interaction demo is awesome, too!
     
    Last edited: Feb 28, 2015
    Partel-Lang likes this.
  25. MS80

    MS80

    Joined:
    Mar 7, 2014
    Posts:
    346
    Hi Partel,
    do you think your setup works with navmesh, too?

    I want to do this:
    - OVR player with your Full Body Biped IK (to have a view of his own virtual body while walking, sitting, acting)
    - medium range environment terrain with some ground objects
    - player movement is controlled with navmesh (automatic waypoint system), no gamepad or keyboard; the player can only look around and maybe interact with some objects like in your interaction demo, but movement and body rotation come from the navmesh
    - automatic acting at predefined positions (opening a door, use phone while walking, sitting down on chair, etc.)

    Is this possible with FinalIK? Maybe there is an example scene with navmesh and FinalIK?
     
    Last edited: Mar 17, 2015
  26. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi!

    I can't see why it should be a problem to use a navmesh agent. Instead of moving the character around with input, just parent the entire rig to the agent.
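    Not an official example, just a minimal sketch of the "parent the rig to the agent" idea for recent Unity versions, with hypothetical names; the NavMeshAgent drives the root and the IK character simply rides along:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.AI;

    // Hypothetical sketch: move the whole IK rig with a NavMeshAgent
    // instead of keyboard/gamepad input.
    public class NavMeshBodyMover : MonoBehaviour
    {
        public NavMeshAgent agent;      // the agent that walks the waypoints
        public Transform characterRoot; // root of the FBBIK character

        void Start()
        {
            // Parenting once is enough; the rig then follows every agent move.
            characterRoot.SetParent(agent.transform, false);
            characterRoot.localPosition = Vector3.zero;
            characterRoot.localRotation = Quaternion.identity;
        }

        // Example: send the agent somewhere and the body (and camera rig) follow.
        public void GoTo(Vector3 destination)
        {
            agent.SetDestination(destination);
        }
    }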

    Automatic acting is more of a philosophical problem with VR, since you can't force the player to sit down or move his head (at least not until they add some electric shock clips to the headset for punishment :)). Those actions would have to be reachable without the character having to move his head a lot. You can make him sit down, but it will be a nauseating transition.

    Cheers,
    Pärtel
     
  27. El Maxo

    El Maxo

    Joined:
    May 23, 2013
    Posts:
    177
    Damn this VR and not having the ability to manipulate people. When these mech suits with force feedback do exist, I will make my Bank Heist "Simulator"
     
    Partel-Lang likes this.
  28. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hey, just made a little video that demonstrates the head IK:


    Also noticed that the picking-up-the-phone interaction didn't work in Unity 5. I just had to disable root motion for the Animators controlling the phone animations; it's fixed now.

    Cheers,
    Pärtel
     
    SunnySunshine likes this.
  29. El Maxo

    El Maxo

    Joined:
    May 23, 2013
    Posts:
    177
    That is awesome, it would work really well with multiplayer
     
  30. MS80

    MS80

    Joined:
    Mar 7, 2014
    Posts:
    346
    Hey Partel, FinalIK is really amazing!

    I have set up my OVR character with FBBIK + navmesh agent + grounder + effector offset + interaction system; I really like how it all works together!
    I've already got some interaction objects working, but there is one thing I don't know how to achieve:
    I would like to make the character sit down (like in your interaction demo), but the problem is that the head position is driven by the OVRCameraRig (with head effector). So my character sits down and the head stays in place (with a stretched neck). Do you have any idea how this could be solved? The head should be influenced by the sit-down interaction, but the user should still have the freedom to look around, lean forward/back, etc.?? Something like a position offset for the OVR Camera Rig?!

    I know those interactions may feel nauseating, but if done right it will be ok (I played Alien: Isolation with the Oculus).
     
    Last edited: Apr 2, 2015
  31. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi and thanks!

    To lower the head, you should change the local position of the game object that has the OVRCameraRig component. Also set OVRPlayerController.useProfileHeight to false, or it will overwrite that localPosition.
    If you open the "Basic" demo scene and click on the game object named "Oculus Setup", you can adjust the "Camera Local Position" Y to change the head height (while not playing, unless you copy the code in OculusSetup.cs from Awake to Update).
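    A minimal sketch of that idea (not the actual OculusSetup.cs), with hypothetical names for everything except OVRCameraRig: blend the local Y of the rig's game object between a standing and a seated height.

    Code (CSharp):
    using UnityEngine;

    // Hypothetical sketch: lower/raise the camera rig root so the HMD-driven
    // head effector follows the sit-down interaction instead of staying in place.
    public class CameraRigHeight : MonoBehaviour
    {
        public Transform cameraRigRoot; // the game object that has the OVRCameraRig component
        public float standingY = 0f;
        public float seatedY = -0.45f;  // rough drop in metres, tune per character
        public float blendSpeed = 2f;

        private float targetY;

        void Start() { targetY = standingY; }

        public void Sit()   { targetY = seatedY; }
        public void Stand() { targetY = standingY; }

        void LateUpdate()
        {
            Vector3 p = cameraRigRoot.localPosition;
            p.y = Mathf.Lerp(p.y, targetY, Time.deltaTime * blendSpeed);
            cameraRigRoot.localPosition = p;
        }
    }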

    Cheers,
    Pärtel
     
  32. TheSniperFan

    TheSniperFan

    Joined:
    Jul 18, 2013
    Posts:
    712
    First of all, I really like where this is going. Having a somewhat realistic depiction of your own body will surely help in achieving presence.

    However, I think the above video still has a long way to go.
    Before I start, please note that the following criticism is in regard to what @El Maxo said. I'm not referring to how it looks for the player, but to how the model looks from the outside.

    The number one gripe I have with it is the body's knees. They're bent almost the entire time. Nobody stands like that for extended periods of time, especially not when wearing combat gear. When looking straight up or down, your knees aren't bent. The same applies to looking to the sides or leaning a bit; you're standing perfectly straight in those cases.
    The way the character stands at about 0:34 isn't even physically possible. Look at his heels. :eek:

    The second thing I noticed is that the feet are pretty much bolted to the ground most of the time. When you lean to the side so much that you start bending your knees, you will automatically reposition your feet. Try it out:
    Stand normally, with your feet slightly further apart than shoulder-width. Then lean to the left until you have to bend your left knee and stay in that position for a few seconds. You'll notice that this starts getting hard really fast.
    What the character should do when leaning very far to the left is shift his left foot away from his right one. Just a few centimeters are enough so you don't have to bend your knees like crazy anymore.

    The third thing is the arms. They look really, really stiff. Even if you don't move your arms voluntarily, they're still subject to gravity. If you lean to the right, your left arm will touch your left side and vice versa. If you lean even further, you will bend your elbow automatically.


    I hope you can get those issues worked out in the future. ;)
     
  33. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Hi, thanks for the feedback!

    The knees are probably bent all the time because I wasn't really standing when making this video, I was just holding the headset in my hand and waving it around a bit lower, just to show how the character is mapped from different angles and positions.

    About the feet being bolted: it is not really the IK's "jurisdiction"; it can and should be solved simply by making the character/animator controller take a side step when the headset goes too far horizontally. The question is how to set the threshold; with just the head position to work with, it is a lot of guesswork and tweaking.
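    A minimal sketch of that thresholding idea, with hypothetical names throughout (it assumes your animator has "StepLeft"/"StepRight" triggers); it only detects when the head has drifted too far sideways from the character root:

    Code (CSharp):
    using UnityEngine;

    // Hypothetical sketch: trigger a side step when the headset moves too far
    // horizontally away from the character's root.
    public class SideStepTrigger : MonoBehaviour
    {
        public Transform head;          // HMD-driven head target
        public Transform characterRoot; // the character's root transform
        public Animator animator;       // assumed to have "StepLeft"/"StepRight" triggers
        public float threshold = 0.25f; // metres of sideways drift before stepping

        void Update()
        {
            // Sideways offset of the head in the character's local space.
            Vector3 local = characterRoot.InverseTransformPoint(head.position);

            // In a real setup you would add a cooldown or check the animator state
            // so the trigger doesn't fire every frame.
            if (local.x > threshold) animator.SetTrigger("StepRight");
            else if (local.x < -threshold) animator.SetTrigger("StepLeft");
        }
    }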

    The arm stiffness can easily be tweaked by adjusting the "Maintain Relative Pos" weight for the hands in the FullBodyBipedIK inspector. Adding a bit of position offset to the hand effectors towards gravity should be no big problem either.
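    A minimal sketch of that gravity offset, assuming the hand effectors expose a positionOffset as in the effector offset demos; the component and field names are hypothetical:

    Code (CSharp):
    using UnityEngine;
    using RootMotion.FinalIK; // assumes Final IK is imported

    // Hypothetical sketch: pull the hands slightly downwards every frame
    // so the arms feel less stiff when leaning.
    public class HandGravityOffset : MonoBehaviour
    {
        public FullBodyBipedIK ik;
        public float gravityAmount = 0.05f; // metres of downward pull, tune to taste

        void LateUpdate()
        {
            // positionOffset is reset by the solver each frame, so re-apply it here;
            // make sure this runs before FullBodyBipedIK (Script Execution Order).
            ik.solver.leftHandEffector.positionOffset += Vector3.down * gravityAmount;
            ik.solver.rightHandEffector.positionOffset += Vector3.down * gravityAmount;
        }
    }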

    But in general, about how the models look from the outside: after having made those examples and experiments, I tend to think it is best not to show how they look from the outside at all. I mean, when the characters stand still it is OK, but when they start moving it looks funny, because people don't run with their heads completely fixed along the vertical axis. It looks like they're trying to carry something heavy on their heads. It is a bit distracting even from the first-person view, when you see your own shadow. That's why I think it would probably be best to use simple look-at IK that doesn't use positional tracking for the other players, at least for when they are moving.

    Cheers,
    Pärtel
     
  34. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    Hi Partel,

    First off, I love your asset; I've had it for a long time and am starting to play with it again in the VR context.

    I'm looking for some advice on whether there are places I should look in your asset to make it noise-tolerant for VR. I've made an asset (below in my signature) that turns any Android or iOS device into a controller (including the compass, accelerometer, and gyro). In the VR space I want to experiment with turning my old test hardware phones into a poor man's STEM system. I certainly can pass data to your IK asset no problem, but sensor data coming from a phone is not perfect for position tracking. Each of the sensors has its drawbacks (the gyro is good for orientation but a bit slow, and the accelerometer is fast but noisy, and it's also hard to filter out gravity). At least some of the basic stuff I've tried (filtering the raw data and making it better) still leads to problems. I'm certainly going to keep iterating on the sensor data smoothing, but I figured I might also ask whether there are any places where you think I should look into making the IK more tolerant of noise. Noise tends to come in the form of jitter (small, quick changes), but also sometimes in the form of "oops, I now know I'm off by a few inches" in hand position or other limbs.

    Accelerometers are really jittery but fairly accurate if you smooth the data. Gyros are quite smooth, but when you look at the data they sometimes aren't fully realtime (latency).

    Any tips?
     
    Last edited: Apr 8, 2015
  35. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Thanks Greggtwep16,

    IK just solves to whatever position you tell it to solve to, so if there's noise in the input, there will be noise in the output.
    I'm not really sure what you mean by "making the IK more tolerant of noise"?

    Cheers,
    Pärtel
     
  36. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    I wasn't sure which algorithm(s) you are using, so for starters, for the fast jittery noise, it might be good to tell the IK to solve to a lesser precision; that lack of precision might help to get rid of some of the noise as well.

    I certainly do realize the main issue is the sensor data (which is acceleration, so it's a double integral to get position, and noisy). I'm just trying to compensate the best I can, and wasn't sure if there were other points where I could smooth out the data.
     
  37. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    The only ideas I have for you are just the good old lerping:

    Code (CSharp):
    smoothPosition = Vector3.Lerp(smoothPosition, noisyPosition, Time.deltaTime * lerpSpeed);
    or smooth damping:
    Code (CSharp):
    smoothPosition = Vector3.SmoothDamp(smoothPosition, noisyPosition, ref v, dampTime);
    or both at the same time. I'm not an expert on noise filters though..
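    For context, a minimal sketch of how both calls could be combined in one component (not from the Final IK package, all names hypothetical): the Lerp knocks down high-frequency jitter, and SmoothDamp then eases toward the lerped value.

    Code (CSharp):
    using UnityEngine;

    // Hypothetical sketch: smooth a noisy tracked position before feeding it
    // to an IK effector target.
    public class NoisyPositionSmoother : MonoBehaviour
    {
        public Transform noisySource;  // e.g. the transform driven by the phone sensors
        public Transform smoothTarget; // the transform fed to the IK effector

        public float lerpSpeed = 10f;
        public float dampTime = 0.05f;

        private Vector3 lerped;
        private Vector3 velocity;

        void Start()
        {
            lerped = noisySource.position;
            smoothTarget.position = lerped;
        }

        void LateUpdate()
        {
            lerped = Vector3.Lerp(lerped, noisySource.position, Time.deltaTime * lerpSpeed);
            smoothTarget.position = Vector3.SmoothDamp(smoothTarget.position, lerped, ref velocity, dampTime);
        }
    }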

    Cheers,
    Pärtel
     
  38. SDM51

    SDM51

    Joined:
    Sep 13, 2013
    Posts:
    3
    I think I'm about sold on your asset, Pärtel; nice demos, thanks. In your last shooter demo, though, I don't care for the head aiming. When aiming like that in real life, your upper body/arms do most of the motion and tend to bring your head along for the ride, as it's all kinda "locked" together. So I believe aiming down sights would feel much more natural under mouse control. That aside, again, nice demo, very nice IK asset.
     
  39. Partel-Lang

    Partel-Lang

    Joined:
    Jan 2, 2013
    Posts:
    2,548
    Thanks! I agree on the head aiming, it's more accurate, but also more boring and doesn't feel right as you said.