MYO ARMBAND

Discussion in 'General Discussion' started by Tomnnn, Nov 29, 2015.

  1. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    EDIT FOR ANY NEW EYES
    There is a video in the last post.
    ===========================

    I searched the Unity forums and the search term "myo" produced zero results, though I found a few Unity demos of people using it on YouTube. Considering VR needs all the help it can get to have a shot at becoming a thing, I thought I'd post this in General before it gets moved to the VR forum. WE NEED MORE EYES!

    Now you can wave your arms around! If all else fails (gesture recognition based on EMG readings from your arm), it's still a gyroscope you can wave around like a mad man (or mad woman)!

    It's on sale right now. I'm throwing a chunk of the Sixense refund change at it. I'm the always-disappointed but always-ready-for-more early adopter of all things VR. Anyone else easily tempted by this stuff?

    What kind of applications would you develop with this technology that aren't better suited for a system like the STEM or something like Leap Motion? One of the applications I saw was slapping goombas around. It had no problem tracking a fairly quick full-arm swipe, and Unity had no problem using built-in physics to whack the goomba. It was set up to have you automatically walk forward while the goombas were on either side of you.

    It's set up just like one of the games I wanted to make: an infinite runner, but in first person with VR controls! That's probably the first thing I'm going to go for after I do the tutorials they have for using gestures and motion controls to move and manipulate a box on a plane in Unity.
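Detecting that kind of quick full-arm swipe from a gyroscope boils down to watching angular speed for a sustained burst. A minimal sketch of the idea, not the Myo SDK itself; the threshold, sample count, and function names are all made up for illustration:

```python
import math

SWIPE_SPEED = 4.0      # rad/s; hypothetical "fast swipe" threshold
SWIPE_SAMPLES = 3      # consecutive fast samples needed to count as a swipe

def detect_swipes(gyro_stream):
    """Return the start index of each quick arm swipe.

    gyro_stream: iterable of (x, y, z) angular-velocity samples in rad/s,
    e.g. polled from the armband's IMU once per frame.
    """
    swipes, run = [], 0
    for i, (x, y, z) in enumerate(gyro_stream):
        speed = math.sqrt(x * x + y * y + z * z)
        if speed > SWIPE_SPEED:
            run += 1
            if run == SWIPE_SAMPLES:   # debounce: require a sustained burst
                swipes.append(i - SWIPE_SAMPLES + 1)
        else:
            run = 0
    return swipes
```

In a runner, each detected swipe would map to one whack, so the debounce matters: a single fast swing shouldn't register as several hits.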
     
    Last edited: Dec 2, 2015
  2. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    Aaaaand it came in the mail! That was fast...

    ...and it works. Calibrate it, be deliberate, and it'll work every time. EMG technology is reading me and doing the things I want it to do. I am glad to be born in this era.
     
    theANMATOR2b likes this.
  3. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    Certainly let us know how it is, but it doesn't seem good enough for games. A single IMU (even if it's the InvenSense MPU-9xxx series) will still drift over time, so it won't be accurate enough to place a virtual arm in VR. The EMG gesture recognition is neat and something new; reading up on it, I believe it's sensing the muscle flex in your arm, which is why those six gestures are so deliberate. Unfortunately, that probably means it's essentially a boolean on/off and can't sense degrees of clenching the fist, touching individual fingers, etc.
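The "boolean on/off" concern can be made concrete: if you reduce an EMG channel to one amplitude threshold, you get a flexed/relaxed bit and nothing in between. A toy sketch (threshold and window values are invented, not Myo specs):

```python
import math

def emg_rms(window):
    """Root-mean-square amplitude of one EMG channel over a sample window."""
    return math.sqrt(sum(s * s for s in window) / len(window))

FLEX_THRESHOLD = 0.5   # hypothetical; would be tuned per user at calibration

def is_flexed(window):
    """Single-threshold pose detection: yields only on/off, not degrees
    of clench, which is exactly the limitation described above."""
    return emg_rms(window) > FLEX_THRESHOLD
```

Recovering *degrees* of clench would mean treating the RMS value itself as an analog signal rather than comparing it to one threshold, which is much noisier in practice.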

    At least from reading up on all the various attempts, the only two solutions I'm taking seriously at the moment are the Oculus Touch controllers and the Steam VR controllers. Anything IMU-only I can't take seriously yet; there has to be some addition or breakthrough at some point before that's viable. The other two use cameras in addition to the IMUs, which is why they're more accurate.
     
  4. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    Whether or not that's the case, there is a diagnostic panel that shows the EMG/flex readings. You can view the live, raw data and play with it. So either way, it should be able to recognize more states :D

    Sixense is a different story. It doesn't drift because the base station tracks where the other pieces are, instead of each unit trying to track where it is from cumulative motion like every other system does. Either way, I'm trying to get my money back from Sixense. I'll get the best solution available in the far future, but for now I just want something I can start with.
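The cumulative-motion drift is easy to see numerically: dead-reckoning a heading by integrating a gyro with even a tiny uncorrected bias accumulates error linearly with time, while an absolute tracker (base station, camera) has no such cumulative term. The numbers below are illustrative, not Myo specs:

```python
BIAS = 0.01   # rad/s of uncorrected gyro bias (assumed for illustration)
DT = 0.01     # 100 Hz sample rate

def integrated_heading(true_rates, bias=BIAS, dt=DT):
    """Dead-reckoned heading: each step adds a little bias error."""
    heading = 0.0
    for w in true_rates:
        heading += (w + bias) * dt
    return heading

# After 60 s of holding perfectly still (true rate 0), the estimate
# has drifted by bias * time = 0.01 * 60 = 0.6 rad, roughly 34 degrees.
drift = integrated_heading([0.0] * 6000)
```

That's why IMU-only systems need periodic re-sync gestures, and why camera-assisted or base-station systems don't.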

    There's a pretty cool video of a guy who got it to recognize a snap gesture - and he used that to unlock a door.

    There's going to be a lot of resynchronization on whatever I make, for sure. Even the "crate smash" demo on their site has two of the gestures reserved for resetting the position of the device and player haha.

    Maybe give Sixense another look in the future. It's the one that isn't going to suffer drift or occlusion. It's also the one that probably won't release until 2024.
     
  5. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    I tried out individual fingers on the diagnostic page. It'll be hard to isolate them since I haven't checked out the SDK yet, but the readings were distinct, so it won't be impossible!

    Or I might have one of the built in gestures set to switch it to 'raw' mode that will control something else :D
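If the per-channel readings for different fingers really are distinct, the simplest way to tell them apart is a nearest-centroid match against recorded templates. A sketch only; the channel count, amplitudes, and gesture names are all made up, and a real version would train the templates from calibration recordings via the SDK:

```python
def nearest_gesture(reading, templates):
    """Match a per-channel EMG amplitude reading to the closest template.

    reading: list of channel amplitudes for one moment in time.
    templates: dict mapping gesture name -> list of channel amplitudes.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda name: sq_dist(reading, templates[name]))

# Hypothetical 4-channel templates recorded during calibration:
TEMPLATES = {
    "index": [0.8, 0.2, 0.1, 0.1],
    "pinky": [0.1, 0.1, 0.3, 0.9],
}
```

Noisy live readings then snap to whichever calibrated pattern they sit closest to, which is roughly what "the readings were distinct" buys you.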

    --edit

    CONCLUSIONS

    I had to be deliberate because I made the calibration recordings deliberate. I tried a much shorter, relaxed gesture for calibration, and the result was faster and more accurate responses because my arm wasn't getting tired :p

    I rested my arm, centered the Unity demo, then waved it around like a madman for 20 seconds. After returning my arm to its original position, the object in the Unity sample was also in its original position. I may still need to sync often, but the drift actually isn't bad when you secure the thing on your arm.
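That kind of re-sync, like the reserved "reset" gestures in the crate-smash demo, amounts to capturing the current physical orientation as a reference and reporting everything relative to it. A yaw-only sketch with invented names:

```python
class Recenterable:
    """Minimal re-sync: bind the current physical yaw to virtual 'forward'."""

    def __init__(self):
        self.reference_yaw = 0.0

    def recenter(self, current_yaw):
        # Called when the reset gesture fires: remember where the arm is now.
        self.reference_yaw = current_yaw

    def virtual_yaw(self, current_yaw):
        # All later readings are reported relative to the stored reference.
        return current_yaw - self.reference_yaw
```

Any accumulated drift is zeroed out at the moment of the gesture, which is why demos burn a couple of gestures on it.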

    After doing the relaxed calibration, the crate smashing game went more smoothly as well. Making relaxed gestures had them respond almost instantly and keeping an eye on my wrist didn't make me accidentally do reset gestures mid swing with the mace, so it actually ended up working out very nicely. I'll update this with a final bump when I've actually made something. I'll have a video and everything :D
     
    Last edited: Dec 1, 2015
  6. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    As promised, the final bump.

    (embedded video)

    Might need to pause to catch those annotations, sorry :p The next post that isn't a reply here will be a new thread about some game being developed, probably in a more appropriate forum.