Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. sfssddff

    sfssddff

    Joined:
    Nov 15, 2013
    Posts:
    9
    Can we have an augmented reality subforum? There are like 50 AR SDKs.

    I'm trying to find one that actually does 3D object tracking.
     
  2. apexsoftworks

    apexsoftworks

    Joined:
    Oct 5, 2016
    Posts:
    29
    Maybe I can get some help with this. As I understand it, when ARKit is initialized, an origin position is established. I'm trying to figure out the following questions:

    1. How is the origin position established? In other words, how does it use the phone to establish an origin? Where in the code does this happen?
    2. Is the main camera moved at all?
    3. How can I manipulate what the origin position is? For example, if I wanted to invert the origin position that ARKit uses as a reference, how could I do that? Where in the code is this?

    Any help with these questions would be great.

    Thanks.
     
  3. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    I've been trying this out and so far it's been super easy to get up and running! I'm trying to integrate it into an existing AR app, but some of the functionality requires the AR tracking to be paused while keeping the camera running (I'm hot-swapping between marker-based and markerless AR because of client requirements). I can currently pause ARKit, but that stops the camera feed too. Is this possible with the current code?

    Thanks,
     
  4. bantherewind

    bantherewind

    Joined:
    Dec 3, 2015
    Posts:
    4
    Anyone know where I should report bugs? Looks like ARHitTestResult.anchorIdentifier is never populated. I can work around it, but I would prefer to just get the ID when a user taps an anchor. Then I can use the ID to look up the plane (in my local Dictionary of existing anchors) the user tapped and update my scene as the tracking of the tapped plane improves (appropriate behavior for what I'm building).
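    For reference, a minimal sketch of that kind of anchor registry (untested; it assumes the plugin's ARAnchorAdded/Updated/Removed events and the ARPlaneAnchor.identifier field):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class PlaneAnchorRegistry : MonoBehaviour
    {
        // Plane anchors keyed by their ARKit identifier.
        private readonly Dictionary<string, ARPlaneAnchor> anchors =
            new Dictionary<string, ARPlaneAnchor>();

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARAnchorAddedEvent += OnAnchorChanged;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent += OnAnchorChanged;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent += OnAnchorRemoved;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnAnchorChanged;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent -= OnAnchorChanged;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent -= OnAnchorRemoved;
        }

        // Keep the registry in sync as ARKit adds and refines planes.
        void OnAnchorChanged(ARPlaneAnchor anchor) { anchors[anchor.identifier] = anchor; }
        void OnAnchorRemoved(ARPlaneAnchor anchor) { anchors.Remove(anchor.identifier); }

        // Look up the plane the user tapped, given the hit result's anchor ID.
        public bool TryGetAnchor(string id, out ARPlaneAnchor anchor)
        {
            return anchors.TryGetValue(id, out anchor);
        }
    }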
     
  5. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Sorry about that - that is a typo in ARSessionNative.mm - ARKit does not do vertical plane detection, so our plugin cannot do it either. I will fix this.
     
  6. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Hi - please have a look at the ARCameraManager GameObject and UnityARCameraManager.cs to see how the plugin updates camera position and rotation based on ARKit camera updates.
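    Roughly, that script does something like this every frame (a simplified sketch assuming the plugin's GetCameraPose/GetCameraProjection calls and the UnityARMatrixOps helpers):

    Code (CSharp):
    using UnityEngine;

    public class SimpleARCameraDriver : MonoBehaviour
    {
        public Camera trackedCamera;
        private UnityARSessionNativeInterface session;

        void Start()
        {
            session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
        }

        void Update()
        {
            // Apply the latest ARKit camera pose to the Unity camera.
            Matrix4x4 pose = session.GetCameraPose();
            trackedCamera.transform.localPosition = UnityARMatrixOps.GetPosition(pose);
            trackedCamera.transform.localRotation = UnityARMatrixOps.GetRotation(pose);

            // Match Unity's projection to ARKit's so rendering lines up with the video.
            trackedCamera.projectionMatrix = session.GetCameraProjection();
        }
    }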
     
  7. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    From what I remember, this should work. Maybe you're doing HitTest and you're getting a result from the nearest feature point or estimated plane instead of a plane anchor? See hit test result types.
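    For example, asking the hit test specifically for existing plane anchors (a sketch assuming the plugin's HitTest API and its ARPoint/ARHitTestResult types):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class PlaneTapExample : MonoBehaviour
    {
        void Update()
        {
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            // The hit test takes normalized viewport coordinates.
            Vector3 viewportPos = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
            ARPoint point = new ARPoint { x = viewportPos.x, y = viewportPos.y };

            List<ARHitTestResult> results = UnityARSessionNativeInterface
                .GetARSessionNativeInterface()
                .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlane);

            foreach (ARHitTestResult result in results)
            {
                // anchorIdentifier is only populated for existing-plane hits,
                // not for feature-point or estimated-plane results.
                Debug.Log("Tapped plane anchor: " + result.anchorIdentifier);
                break;
            }
        }
    }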
     
  8. bantherewind

    bantherewind

    Joined:
    Dec 3, 2015
    Posts:
    4
    Ah, yep. I was testing for ARHitTestResultTypeHorizontalPlane when I should have been testing for ARHitTestResultTypeExistingPlane. Thanks!
     
  9. apexsoftworks

    apexsoftworks

    Joined:
    Oct 5, 2016
    Posts:
    29
    It looks like modifying the starting position is locked in the DLL. Oh well.

    If there's any workaround to be able to specify an origin, that would be great.
     
  10. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Modifying the starting position in your virtual scene? Let's say you wanted to start at (sx, sy, sz). What you can do, as I noted earlier in this thread, is create your world objects under a root GameObject; when you start, the camera is always at (0, 0, 0), so you can translate your root GameObject's transform by (-sx, -sy, -sz).
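    A minimal sketch of that trick (attach to the root GameObject that parents your world content; the field name is illustrative):

    Code (CSharp):
    using UnityEngine;

    public class WorldOriginShift : MonoBehaviour
    {
        // Where you want the camera to appear to start in your virtual scene.
        public Vector3 desiredStart = new Vector3(0f, 0f, 0f); // (sx, sy, sz)

        void Start()
        {
            // ARKit always starts the camera at (0,0,0), so shift the whole
            // scene the opposite way instead of moving the camera.
            transform.position = -desiredStart;
        }
    }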
     
  11. apexsoftworks

    apexsoftworks

    Joined:
    Oct 5, 2016
    Posts:
    29
    Actually, what I am trying to do is modify the starting position in the real world. So instead of starting with the phone facing forward, it starts facing backward.
     
  12. apexsoftworks

    apexsoftworks

    Joined:
    Oct 5, 2016
    Posts:
    29
    Maybe there's a workaround by using the method you mentioned. I'll have to do some experimentation.
     
  13. apexsoftworks

    apexsoftworks

    Joined:
    Oct 5, 2016
    Posts:
    29
    Basically, I need to trick ARKit into thinking the phone started facing forward when it actually started facing backward.
     
  14. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It's the same trick - you rotate the root GameObject around the y-axis by 180 degrees.
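    E.g. (a two-line sketch; "root" is whatever GameObject parents your world content):

    Code (CSharp):
    // Spin the whole virtual world half a turn around its start point,
    // so content that was "in front" of the phone now sits behind it.
    root.transform.Rotate(0f, 180f, 0f);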
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You are right - there does not appear to be direct support for this currently. As a workaround, you can keep the session on (or maybe start a lower-cost non-world-tracking session) and just not respond to world-tracking events like camera updates and plane anchor updates. I.e. in our case I would disable the ARCameraManager GameObject and the GeneratePlanes GameObject.
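    A sketch of that kind of toggle (the two GameObject references are assumptions - wire them up in the Inspector):

    Code (CSharp):
    using UnityEngine;

    public class ARResponsePauser : MonoBehaviour
    {
        public GameObject arCameraManager; // the ARCameraManager GameObject
        public GameObject generatePlanes;  // the GeneratePlanes GameObject

        // Stop responding to world-tracking events without tearing down
        // the session, by disabling the objects that consume them.
        public void SetWorldTrackingResponses(bool enabled)
        {
            arCameraManager.SetActive(enabled);
            generatePlanes.SetActive(enabled);
        }
    }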
     
  16. RubyKrongDev

    RubyKrongDev

    Joined:
    May 25, 2016
    Posts:
    16
    Hello, a question: how does the position in Unity3D correlate with GPS coordinates?
     
  17. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It doesn't. But one unit in Unity corresponds to 1 m in real-world coordinates, so if you are able to align the Unity scene to a GPS coordinate, you can be assured that world tracking will keep you in that alignment.
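    If it helps, here's a sketch of converting a GPS delta into a meter offset with a local equirectangular approximation (the class and method names are made up for illustration; fine at AR session scales):

    Code (CSharp):
    using UnityEngine;

    public static class GpsAlignment
    {
        const double EarthRadius = 6378137.0;           // WGS84, meters
        const double Deg2Rad = System.Math.PI / 180.0;

        // Approximate the meter offset of (lat, lon) from a reference coordinate.
        public static Vector3 OffsetMeters(double refLat, double refLon,
                                           double lat, double lon)
        {
            double dNorth = (lat - refLat) * Deg2Rad * EarthRadius;
            double dEast  = (lon - refLon) * Deg2Rad * EarthRadius
                            * System.Math.Cos(refLat * Deg2Rad);
            // Unity convention here: x = east, z = north; 1 unit == 1 m.
            return new Vector3((float)dEast, 0f, (float)dNorth);
        }
    }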
     
  18. RubyKrongDev

    RubyKrongDev

    Joined:
    May 25, 2016
    Posts:
    16
    Thank you!
     
  19. jkparamlabs

    jkparamlabs

    Joined:
    Jun 21, 2017
    Posts:
    3
    Hello,
    Can you tell us how to integrate ARKit with Cardboard for mixed reality solutions?
    I saw a post earlier of a skeleton model in VR, but there weren't any details.
     
  20. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    Unfortunately that doesn't quite work. Having any session running at all causes Vuforia to not run correctly. I can turn off Vuforia and start ARKit just fine, but it causes some jittering in the crossover, which isn't great. I'll keep trying to see if there's something else I'm missing.
     
  21. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Another possibility: disable arkit, and get the camera feed separately via WebCamTexture.
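    A minimal sketch of that fallback (assumes a full-screen RawImage to draw the feed into):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;

    public class FallbackCameraFeed : MonoBehaviour
    {
        public RawImage target; // full-screen UI RawImage (assumed)
        private WebCamTexture webcam;

        void OnEnable()
        {
            // Stream the device camera directly while ARKit is disabled.
            webcam = new WebCamTexture();
            target.texture = webcam;
            webcam.Play();
        }

        void OnDisable()
        {
            if (webcam != null) webcam.Stop();
        }
    }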
     
  22. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    Yeah, that's pretty much what I'm doing, and it works; it just causes a jump when I re-enable ARKit as its camera feed takes over. ARKit's camera takes a few frames before it's running, so it causes some visual stutter.
     
  23. ronson_wu

    ronson_wu

    Joined:
    Jun 22, 2017
    Posts:
    3
    Sorry for my poor English. When I run the UnityARKitScene on my iPhone 7 Plus, I can only see the RandomCube and all the other virtual objects, while the background is totally green. It seems the camera can't get the real scene video. How could this be? Thanks.
     
  24. Arvin6

    Arvin6

    Joined:
    Jun 2, 2017
    Posts:
    27
    Did you update your phone to iOS11 beta?
     
  25. Arvin6

    Arvin6

    Joined:
    Jun 2, 2017
    Posts:
    27
    Did anyone manage to load 3D models at runtime? If so, how did you attach UnityARHitTestExample to the game object at runtime? I'm trying to spawn multiple models at runtime and switch between them. Is this possible? I tried copying my script into "plugins/iOS/UnityARKit/" and adding the UnityARHitTestExample component (I modified this script to reference the HitCubeParent through GameObject.Find). When I run it on the phone, I get my 3D model before touching the screen, and even after that it isn't movable at all. Any help will be appreciated.
     
    Last edited: Jun 22, 2017
  26. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Did you change the project in any way? It looks like the YUV shader may not be working in your project for some reason. Have you altered UnityARVideo.cs in any way? Or maybe you have lost the reference to ClearMaterial?
    You could try getting a clean version of the project, make sure all the requirements are satisfied, and build and run to the phone again.
     
  27. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is not really an ARKit question, more of a Unity question, but I'll give you some pointers: create prefabs of the models you want to make appear. Then, on the HitTestExample GameObject, add references to the prefabs you want to generate. Then use Instantiate in the script, where we currently update the cube transform, to create a copy of one of the prefabs at that location.
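    A sketch of that idea (the names are illustrative, not from the plugin):

    Code (CSharp):
    using UnityEngine;

    public class PrefabSpawner : MonoBehaviour
    {
        public GameObject[] modelPrefabs; // assign your model prefabs in the Inspector
        public int selectedIndex = 0;     // which model to spawn next

        // Call this where the example currently updates the cube transform,
        // passing the position/rotation from the hit test result.
        public void SpawnAt(Vector3 position, Quaternion rotation)
        {
            Instantiate(modelPrefabs[selectedIndex], position, rotation);
        }
    }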
     
    Arvin6 likes this.
  28. Rickert0_o

    Rickert0_o

    Joined:
    Feb 6, 2014
    Posts:
    11
    I was wondering what everyone's current workflow is. Mine is pretty time-consuming at the moment.

    Current workflow:
    1. Edit project in Unity
    2. Build project in Unity
    3. Open build in Xcode
    4. Build project in Xcode
    5. Run the built application on the iOS device

    So many steps for one build. I might be able to automate this, but maybe there's a better way?

    Another thing that concerns me is debugging my build. I'm currently using Debug.Log() and the Log Viewer asset. I tried Unity Remote, but the device doesn't show up in Unity, and even if it did, I wonder whether it would work. What do you guys do or use for debugging?
     
  29. User340

    User340

    Joined:
    Feb 28, 2007
    Posts:
    3,001
    Build and Run does exactly this.
     
    Rickert0_o and jimmya like this.
  30. Jamda

    Jamda

    Joined:
    Apr 3, 2013
    Posts:
    6
    Thanks so much for this package! I just converted an old AR project over using the kit and it is so much better than before. This is the future - Apple is in it to win it.
     
    User340 and jimmya like this.
  31. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    As mentioned by Daniel, you can do "Build and Run" instead of your last four steps, and it will run directly on the phone without much intervention.

    For debugging, you could put breakpoints in Xcode: search for the name of your method in the generated IL2CPP code.
    E.g. in Bulk_Assembly-CSharp-firstpass_0.cpp, you will find:

    // System.Void UnityARCameraManager::SetCamera(UnityEngine.Camera)

    extern "C" void UnityARCameraManager_SetCamera_m4250956606 (UnityARCameraManager_t2138856896 * __this, Camera_t189460977 * ___newCamera0, const MethodInfo* method)
     
    Rickert0_o likes this.
  32. Rickert0_o

    Rickert0_o

    Joined:
    Feb 6, 2014
    Posts:
    11
    Wow! So obvious. o_O I haven't tried that yet. Thanks!

    Thanks! That makes sense. I'll give it a try!
     
  33. TheTaj

    TheTaj

    Joined:
    Jun 26, 2014
    Posts:
    24
    Hey, so I'm trying to do a simple raycast from the ARCamera to the planes, and I'm having no luck at all.
    The code works fine in the editor - I can drag a planePrefab in front of the camera and it registers that there's a plane in the raycast - but when I build it, it appears the raycast script is being ignored entirely.
    Code (CSharp):
    using UnityEngine;

    public class PlaneCursor : MonoBehaviour {

        public GameObject mainCam;
        public GameObject cursor;

        void Update () {
            // Cast a ray from the center of the screen.
            Ray ray = mainCam.GetComponent<Camera>().ScreenPointToRay(
                new Vector3(Screen.width / 2f, Screen.height / 2f, 0f));
            RaycastHit[] hits = Physics.RaycastAll(ray, 20000.0f);

            //this.gameObject.GetComponent<LineRenderer>().SetPosition(0, mainCam.transform.position - (mainCam.transform.forward * 100f));
            //this.gameObject.GetComponent<LineRenderer>().SetPosition(1, mainCam.transform.forward * 100f);

            // Move the cursor to the first plane the ray hits.
            for (int i = 0; i < hits.Length; i++) {
                RaycastHit hit = hits[i];
                if (hit.transform.gameObject.tag == "Plane") {
                    cursor.transform.position = hit.transform.position;
                    break;
                }
            }
        }
    }
    The cursor is a random GameObject; mainCam is the MainCamera with the usual scripts attached.
     
  34. ezone

    ezone

    Joined:
    Mar 28, 2008
    Posts:
    331
    I would throw a Debug.Log in there to see what's being hit (probably a dumb question, but have you set up the "Plane" tag on your GameObject?).


    Code (csharp):
            // Log every hit so you can see what the ray is actually touching.
            for (int i = 0; i < hits.Length; i++) {
                RaycastHit hit = hits[i];

                Debug.Log("Hit: " + hit.transform.name);

                if (hit.transform.gameObject.tag == "Plane") {
                    cursor.transform.position = hit.transform.position;
                    break;
                }
            }
     
  35. ronson_wu

    ronson_wu

    Joined:
    Jun 22, 2017
    Posts:
    3
    I am sorry - I had changed the Graphics API from Metal to OpenGLES. Once I changed it back to Metal, everything works.
     
  36. ronson_wu

    ronson_wu

    Joined:
    Jun 22, 2017
    Posts:
    3
    Yes, I did. I have it solved now - it was something related to the Graphics API. Thanks!
     
  37. TheTaj

    TheTaj

    Joined:
    Jun 26, 2014
    Posts:
    24
    Lol, I'm sure it's tagged. The problem with tossing a Debug.Log in there is that it's working 100% fine in the editor.
    I'll try tinkering with it and having the results fed to a text element in the scene for mobile testing, but I'm fairly certain it's going to give me no results.
     
  38. gate1

    gate1

    Joined:
    May 31, 2013
    Posts:
    22
    Hi! Thank you for a great plugin! I have a problem with bad plane tracking: only about 20% of the surface gets converted into a plane. Does the plugin support quality settings for tracking or something like that? I use Unity 5.6.2, Xcode 9.0 beta 2, Sierra 10.12.5. Does ARKit require the High Sierra beta? Thank you!
     
  39. Nitrousek

    Nitrousek

    Joined:
    Jan 31, 2016
    Posts:
    38
    Hello, everything's been fine for my project, however -
    @jimmya
    The scale's been giving me some trouble. How am I supposed to show whole buildings without scaling them down to a ridiculous 0.01, at which point the physics start messing up and post-processing effects don't work well?
     
  40. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    Am I right in thinking that the PointData is only from the current frame? I'm trying to analyse the point data, but it doesn't seem to be cumulative.

    Looking at the native session, it seems to be taking the point data from the current frame, but Apple's docs aren't clear on whether it's all the point data or just the frame's point data.
     
  41. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Yes, it only shows point data for the current frame - there is no historical data from ARKit. You could try to save it off yourself, but you might want to filter out duplicates etc. to reduce memory usage.
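    If you do want to try it, here's a sketch of one way to accumulate points while collapsing near-duplicates (it assumes the plugin's ARFrameUpdatedEvent and the UnityARCamera.pointCloudData field, and quantizes to a 1 cm grid):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class PointCloudAccumulator : MonoBehaviour
    {
        const float CellSize = 0.01f; // meters

        private readonly HashSet<long> seen = new HashSet<long>();
        public readonly List<Vector3> points = new List<Vector3>();

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrame;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrame;
        }

        void OnFrame(UnityARCamera camera)
        {
            if (camera.pointCloudData == null) return;
            foreach (Vector3 p in camera.pointCloudData)
            {
                // Keep only the first point that lands in each grid cell.
                if (seen.Add(CellKey(p))) points.Add(p);
            }
        }

        // Pack the quantized cell coordinates into one long (assumes the
        // scene stays within roughly ±500 m of the origin at 1 cm resolution).
        static long CellKey(Vector3 p)
        {
            long x = Mathf.RoundToInt(p.x / CellSize) + 50000;
            long y = Mathf.RoundToInt(p.y / CellSize) + 50000;
            long z = Mathf.RoundToInt(p.z / CellSize) + 50000;
            return (x * 100000L + y) * 100000L + z;
        }
    }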
     
  42. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    Yep, it was a dumb idea though. The sheer amount of point data amassed over just a few seconds is crazy - you run out of memory very quickly.

    I'm trying to see how close I can get to restoring a session, or really just placing things roughly where they were when you left them. I was thinking I might be able to compare point cloud data to get a rough position, but I think it's a non-starter.
     
  43. gate1

    gate1

    Joined:
    May 31, 2013
    Posts:
    22
    Guys, what's the benefit of using the point cloud? Right now I only use the planes.
     
  44. jon_reid

    jon_reid

    Joined:
    Nov 29, 2013
    Posts:
    13
    Plane anchors are what you'll use most of the time; the point cloud is just the raw frame data, and it's how ARKit determines what a plane is. Planes don't give much identifying information, since they adjust over time as you move your device around, so if you want to do something like determine where the user is relative to a previous session, you need something like the point cloud data.
     
  45. RubyKrongDev

    RubyKrongDev

    Joined:
    May 25, 2016
    Posts:
    16
    Good afternoon! When AR is initialized, a cube is placed and the x, y, z axes are set. Can I rotate them somehow?
     
    Last edited: Jun 23, 2017
  46. P4blo

    P4blo

    Joined:
    Dec 12, 2012
    Posts:
    18
    Hi Jimmy!
    I was executing this piece of code in order to get the Tracking Status and the Tracking Reason:

    Code (CSharp):
    public void ARFrameUpdated(UnityARCamera camera)
    {
        switch (camera.trackingState) {
        case ARTrackingState.ARTrackingStateNormal:
            Debug.Log ("Tracking state is Normal");
            break;
        case ARTrackingState.ARTrackingStateLimited:
            Debug.Log ("Tracking state is Limited: " + camera.trackingReason.ToString());

            switch (camera.trackingReason) {
            case ARTrackingStateReason.ARTrackingStateReasonNone:
                Debug.Log ("Tracking Reason is None");
                break;
            case ARTrackingStateReason.ARTrackingStateReasonInsufficientFeatures:
                Debug.Log ("Tracking Reason is Insufficient Features");
                break;
            case ARTrackingStateReason.ARTrackingStateReasonExcessiveMotion:
                Debug.Log ("Tracking Reason is Excessive Motion");
                break;
            }
            break;
        case ARTrackingState.ARTrackingStateNotAvailable:
            Debug.Log ("Tracking state is Not Available");
            break;
        }
    }
    And my question is about the line:
    Code (CSharp):
    Debug.Log ("Tracking state is Limited: " + camera.trackingReason.ToString());
    Sometimes I'm getting this output:
    Tracking state is Limited: 3

    I receive the reason 'ARTrackingStateReasonInsufficientFeatures' fine:
    Tracking state is Limited: ARTrackingStateReasonInsufficientFeatures

    But I think the reason 'ARTrackingStateReasonExcessiveMotion' has some bug, because I never receive this enum value, just the number '3'.

    Any idea?

    Thanks!!
     
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    ARTrackingStateReason mirrors the enum in the ARKit framework, so you might be getting a tracking reason value that is not yet documented in the ARKit framework.
     
    P4blo likes this.
  48. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I do not understand the question.
     
  49. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I would debug in Xcode as suggested earlier - put a breakpoint in the cpp code that corresponds to your C# method.
     
  50. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    The requirements are as noted at the top. The plugin does expose a tracking state that tells you what the tracking quality is. More than likely you will need better lighting where you are trying this.
     