
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. fkatsilidis

    fkatsilidis

    Joined:
    Dec 28, 2016
    Posts:
    2
Sorry if this is a noob question but I'm new to Unity - will the new ARKit plugin for Unity mean that I can now import a Unity AR scene within my native iOS app? I.e. call it from within another view controller, exit from the AR scene to a third view controller, etc.?
     
  2. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Are you talking about two virtual objects for the cube and plank? You would control them in Unity just as you would any other GameObject in a Unity scene.
    As for the conference videos, they were not using a marker, but the table was being detected as a plane by ARKit, and they were placing virtual objects like the cup and lamp on that plane.
     
  3. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Unity has native iOS support, so when you build your Unity scene to iOS via Xcode, it creates a native iOS app. You can create as many scenes as you want in Unity and switch between them.
     
  4. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  5. Joyceawt

    Joyceawt

    Joined:
    Feb 23, 2017
    Posts:
    11
Thank you so much for your reply! So if I were to implement something similar to the conference video, I'd have to configure something similar where the table is detected as a plane using ARKit? Is this correct? Are there any samples for this?

I had one more question, going off of the idea where the table is detected as a plane. If I were to want the user to place a cube onto that plane/table, is it possible to scale it to real size? Meaning, for example, if the user actually had the cube object in real life, could the virtual object be scaled to match the real cube's size?
     
  6. playalong

    playalong

    Joined:
    Mar 9, 2015
    Posts:
    33
Hi Jimmy, thanks for posting this - I just tried to build the sample project, but I just get a black screen after the Unity logo - please see the attached screen grab of Xcode - am I doing something wrong? Thanks. Screen Shot 2017-06-07 at 18.36.59.png
     
  7. playalong

    playalong

    Joined:
    Mar 9, 2015
    Posts:
    33
Needed to set camera privacy - all working now - awesome!
     
  8. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
See the TUTORIAL.txt in the project or this post to see how to set up the plugin to detect planes. Alternatively, running our sample should detect planes, and in our case we display a Unity GameObject (a transparent plane with blue border) for each plane that is detected.

    Also, in our sample app, when you touch the screen, it will do a hit test in ARKit and if it hits a plane, an unshaded cube will appear in that spot.

    For sizing, as mentioned in another post, whenever you create a virtual object in Unity, use 1 unit to correspond to 1 meter and it should be life size.
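For reference, the touch-to-place flow described above could be sketched roughly like this. This is a sketch based on the sample's described behavior, not a copy of UnityARHitTestExample; the API names (HitTest, ARPoint, UnityARMatrixOps) are assumed from the plugin and may differ by version. The placed cube is left at scale 1, so a 1-unit cube reads as 1 meter, per the sizing advice above:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity-ARKit-Plugin

public class PlaceCubeOnPlane : MonoBehaviour
{
    // The object to place; authored at 1 unit = 1 meter so it appears life size
    public Transform cubeToPlace;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Convert the screen touch to ARKit's normalized screen coordinates
            Vector2 touch = Input.GetTouch(0).position;
            ARPoint point = new ARPoint
            {
                x = touch.x / Screen.width,
                y = touch.y / Screen.height
            };

            // Ask ARKit what the touch ray intersects, preferring existing planes
            List<ARHitTestResult> results =
                UnityARSessionNativeInterface.GetARSessionNativeInterface()
                    .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

            if (results.Count > 0)
            {
                // Move the cube to the first hit position in world space
                cubeToPlace.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
                cubeToPlace.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
            }
        }
    }
}
```

The actual sample script in the plugin is the authoritative version; this only illustrates the shape of the flow.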
     
  9. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
There is an easter egg in this project. I was waiting for someone to find it, and it appears someone has!


If you want to try it out, build and run the UnityParticlePainter scene in the Build Settings. The scene has a button on top that cycles endlessly between three modes: Move in World, Pick Color to Paint With, and Paint with Color. You can make some cool creations :)
     
    castana1962, anm8tr and ina like this.
  10. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    735
    FYI, we have seen a small number of users get a linker error related to :

Code (CSharp):
: "Geo::GetiOSAppDocumentsDir()", referenced from:
If you see this, make sure the Build Setting in Xcode for Dead Code Stripping is set to Yes.

    Happy coding!
     
  11. KwahuNashoba

    KwahuNashoba

    Joined:
    Mar 30, 2015
    Posts:
    110
    Hello there,

At first run, I am really impressed by how well this thing handles SLAM. I put the cube on a table, then wandered through the entire office, even went outside, and when I returned the cube was waiting for me :) I just haven't tested how it handles moving up/down between floors.

There is one thing I would like some input on. Can I merge this with Vuforia? I plan to use this for SLAM and use Vuforia for markers. Here are some concerns about this; please answer if you can so I don't need to mine through the scripts:
• Video - I would probably have to stop one of the plugins from rendering the video feed. Not sure about Vuforia, but can the video stream be disabled here without any larger consequences?
• Focus - Someone mentioned here that focus is fixed at 1m, and Vuforia also has control of this parameter, so if anyone has found out in the meantime how focus distance impacts tracking accuracy, please share it with me.
    Regards,

    Danilo
     
  12. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I have not considered how Vuforia would work with this, but to answer your questions, yes you can disable video rendering in UnityARVideo, but hopefully the video being rendered by vuforia is the exact same as the one used by ARKit to detect features, otherwise your world tracking would be out of sync with video. You might also want to consider using https://developer.apple.com/documentation/vision instead. (Although I have not considered this either, and have no knowledge of how or even if they work together).
     
  13. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You could save a screenshot using https://docs.unity3d.com/ScriptReference/Application.CaptureScreenshot.html. You will then have to move it to gallery: see plugins listed at bottom of this topic.

[UPDATE] If you don't need script control you could use this
     
  14. yashkapani

    yashkapani

    Joined:
    Aug 27, 2015
    Posts:
    4
If I want to use the front-facing camera for tracking, is it possible to do that using ARKit?
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is a question for Apple, but the answer currently is that ARKit only works with rear facing camera.
     
  16. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    If you would like to support in your app both devices that have ARKit support and those that do not, create an ARKitWorldTackingSessionConfiguration just as you would normally, but before using it to RunWithConfig, query IsSupported on that struct. Then you could decide on a path through your game based on the result of that query.
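In code, that support check might look roughly like this. This is a sketch: the struct name matches the post, and the constructor arguments and RunWithConfig call are assumed from the plugin's sample scripts, so details may differ by plugin version:

```csharp
using UnityEngine.XR.iOS;

public class ARSupportGate
{
    public static void StartSessionIfSupported()
    {
        // Create the configuration just as you normally would
        ARKitWorldTackingSessionConfiguration config =
            new ARKitWorldTackingSessionConfiguration(
                UnityARAlignment.UnityARAlignmentGravity,
                UnityARPlaneDetection.Horizontal);

        // Query support before actually running the session
        if (config.IsSupported)
        {
            UnityARSessionNativeInterface.GetARSessionNativeInterface()
                .RunWithConfig(config);
        }
        else
        {
            // Device lacks ARKit support (older hardware or pre-iOS 11);
            // branch to a non-AR path through the game here
        }
    }
}
```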
     
    Last edited: Jun 8, 2017
    JoeStrout likes this.
  17. mebalzer

    mebalzer

    Joined:
    Mar 19, 2015
    Posts:
    16
Big thanks Unity! Here are a few images of tweaking your plug-in and adding cameras to support stereo for "Mixed Reality." The final goal is to optimize this for use with the uPLAy "Stealth" seen in the bottom image.


     
    Arvin6, KwahuNashoba and jimmya like this.
  18. robomex

    robomex

    Joined:
    Mar 21, 2016
    Posts:
    2
    Thanks for all the quick replies and help!

    I'm testing bringing in a character controlled by gravity into the UnityARKitScene example, and have changed the Plane Prefab setting on GeneratePlanes to collisionPlanePrefab in hopes that characters/items/models I add to the scene would rest on, for example, my table that ARKit is recognizing as a horizontal plane (instead of using the functionality of UnityARHitTestExample).

    However, I'm unable to get collisions between my character and ARKit planes. Anything I insert into the scene falls into oblivion. Do you have any suggestions or recommendations?
     
    norahx3x likes this.
  19. Fazri

    Fazri

    Joined:
    Feb 21, 2013
    Posts:
    2
Is there a limitation on the camera clipping distance? I am noticing objects further than 35 m away in real-world space clipping out, even though my camera clipping distance is set much higher. Is this a limitation or a variable I can adjust?
     
  20. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    really cool to see ColorPicker used in this :) https://github.com/judah4/HSV-Color-Picker-Unity
     
    jimmya likes this.
  21. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
The original prefab has both a collision component and a renderer, while the collision prefab only has collision. You also have to add a collider and rigidbody to any object you want to collide with those planes. See https://docs.unity3d.com/Manual/CollidersOverview.html
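As a rough sketch, the falling object needs both pieces. The components are standard Unity physics; the script itself is illustrative and could equally be done in the Inspector:

```csharp
using UnityEngine;

// Illustrative setup: give a character/object the components it needs
// to land on the collision plane prefabs instead of falling through.
public class MakePhysical : MonoBehaviour
{
    void Start()
    {
        // A collider defines the shape that can contact the plane's collider
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // A non-kinematic rigidbody makes gravity and collision response apply
        Rigidbody rb = GetComponent<Rigidbody>();
        if (rb == null)
        {
            rb = gameObject.AddComponent<Rigidbody>();
        }
        rb.useGravity = true;
        rb.isKinematic = false;
    }
}
```

If objects still fall through, it is worth confirming the plane prefab instances actually have their colliders enabled at runtime before the object reaches them.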
     
    norahx3x likes this.
  22. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  23. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
There shouldn't be, though I haven't tested it. We do use a custom projection matrix, so it could be that its parameters do not take into account the near and far clipping planes provided by the Unity camera. Let me check on this.

    [UPDATE: we've found that we had hardcoded the near and far clip planes to .01 and 30. We're going to provide a way to update those values based on your camera. stay tuned! ]

    [UPDATE: this was fixed by introducing a UnityARCameraNearFar helper script that will update the near and far clip planes from the camera it is attached to (on 06/08/2017)]
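A helper along those lines might look roughly like this. This is an illustrative sketch, not the shipped UnityARCameraNearFar script; in particular, the SetCameraClipPlanes call on the native interface is an assumption about how the values get forwarded to ARKit's projection:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Sketch: push the attached camera's near/far clip planes to the AR session
// whenever they change, instead of relying on hardcoded values.
[RequireComponent(typeof(Camera))]
public class CameraNearFarSketch : MonoBehaviour
{
    private Camera attachedCamera;
    private float currentNearZ;
    private float currentFarZ;

    void Start()
    {
        attachedCamera = GetComponent<Camera>();
        UpdateCameraClipPlanes();
    }

    void UpdateCameraClipPlanes()
    {
        currentNearZ = attachedCamera.nearClipPlane;
        currentFarZ = attachedCamera.farClipPlane;
        // Assumed call: forward the values so the custom projection matrix
        // is rebuilt with the camera's own clip planes
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .SetCameraClipPlanes(currentNearZ, currentFarZ);
    }

    void Update()
    {
        // Re-send only when something changes the camera settings
        if (currentNearZ != attachedCamera.nearClipPlane ||
            currentFarZ != attachedCamera.farClipPlane)
        {
            UpdateCameraClipPlanes();
        }
    }
}
```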
     
    Last edited: Aug 29, 2017
    4NDDYYYY and Fazri like this.
  24. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Cool! The developer community has really taken this plugin, opened up their imaginations, and created really awesome stuff, more than we expected. Thanks! Do you have any video of this by chance?
     
  25. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Since there are still some questions about the scenes included in the sample project, I wrote up a SCENES.txt file and submitted it to bitbucket. The following is the text of that file:

    Example Scenes in Project:

    UnityARKitScene.unity: This scene is a "minimal" scene that shows most of the ARKit functionality

    It has a GameObject (GO) called ARCameraManager which has the script UnityARCameraManager.cs, and has a reference to the Main Camera in the scene. On startup, this script initializes the ARKit and updates the camera position, rotation and projectionMatrix based on the information updated per frame by the ARKit.

    The Main Camera has a script on it called UnityARVideo.cs, which updates the live video using a reference to a YUVMaterial which contains a shader to carry out the rendering of the video.

    There is a GO called RandomCube that is the checkerboard cube in the scene that is placed at 1 unit in the z direction from the origin of the scene. Since our tracking begins with the Main Camera position and rotation set at the origin, when you startup the scene you should be able to see a checkered cube 1 meter straight in front of you.

There is a GO called GeneratePlanes which has the UnityARGeneratePlane script on it, which references a prefab to display the generated planes. The script hooks into the horizontal plane detection and update events that ARKit signals, so that every new plane detected gets a corresponding instance of the prefab placed in the world. It uses some utility scripts to keep track of the planes detected and to generate the instances. As you scan the scene, this GO will generate an instance of the prefab you have referenced from it each time ARKit detects a plane, and it will update the extents and orientation of the prefab instance based on the plane update events.

There is a GO called HitCube which has a script called UnityARHitTestExample, which references the parent transform. The script does an ARKit hit test whenever a touch is detected on the screen, and the resulting position of the hit test result is used to place the parent transform of the cube. When running the scene, touching the screen moves the HitCube to where your touch intersected a plane, or, if a plane was not hit, the nearest feature point.

    There is a GO called PointCloudParticleExample which has a script of the same name, which gets the point cloud data from ARKit, and displays a particle per point data in the cloud. This shows in the scene as little yellow dots.

    The Directional Light in the scene has a UnityARAmbient script on it, which uses ARKit light estimation value to change the intensity of the light. So if you go into a dark room, the objects in the scene will be lit with a less bright light than if you were in daylight.


    UnityParticlePainter.unity: This scene is a sample that allows you to paint particles into your scene to augment your surroundings.

The UnityARCameraManager script is set up the same way as in the minimal scene.
The main script used to implement the painting functionality is ParticlePainter.cs. The way it works: it has three modes of painting through which you can cycle as many times as you want using the button on the top right. The first mode is "OFF", which allows you to navigate through the scene using the phone so that you can examine your artistic masterpiece. The second mode is "PICK", which brings up a color picker from which you can pick the color of the paint you will use. The third mode is "PAINT", which allows you to move the phone around and cause the app to leave particles of the picked color behind in the world. This continuously generates particles as long as you move more than a certain threshold distance. After you have painted, or if you want to start a new section of paint or a new color, press the button again the required number of times to get to the mode you need.

    There is some external source for HSVPicker that was used for this example: https://github.com/judah4/HSV-Color-Picker-Unity
     
  26. KwahuNashoba

    KwahuNashoba

    Joined:
    Mar 30, 2015
    Posts:
    110
    Good point!

They don't seem to work out of the box. When I put all the components from both systems in, ARKit works but Vuforia doesn't. It looks like Vuforia and UnityARCameraManager are in conflict, since disabling that component made Vuforia work. They might be fighting over the render texture or so.

There must be a way to make them work together; I just need to dive deeper into the scripts, I guess.
     
  27. fkatsilidis

    fkatsilidis

    Joined:
    Dec 28, 2016
    Posts:
    2
    Jimmya, thanks for your reply

    but what I am trying to achieve is different. I have an existing iOS app that already has about 50 VCs, including googlemaps, firebase messaging, various multimedia. I want to add a ViewController produced by unity via the Vuforia plugin, ie a user taps a UI element and it brings up the Unity scene (ie the vuforia camera with all the 3d elements/animation i have added) within my existing iOS app. Something that I can segue into and out of as if it was a normal UIView. Is that something I can now achieve?

What I'm trying to say is that starting with the Unity-produced iOS app is unfortunately of no use to me, as I would have to add everything else on top of that one-View-Controller Xcode project Unity produced, which would be impossible. Can I do something like this with ARKit now?

    Many thanks
    Filippos
     
  28. KwahuNashoba

    KwahuNashoba

    Joined:
    Mar 30, 2015
    Posts:
    110
    I'm afraid that adding everything on top of unity generated project is the only way to achieve what you want to :/ I don't see why this plugin would be any different from any other iOS unmanaged plugin.
     
    fkatsilidis likes this.
  29. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
Is there a way to do in-editor testing, similar to Vuforia or the HoloLens pseudo-meshroom?
     
  30. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
Will you integrate Core ML? That would probably help with face tracking a bit, especially on file size compared to the dlib machine learning library commonly used for faces.
     
    zeb33 likes this.
  31. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
Is it possible to customize the yellow dots?
     
  32. darthwilson

    darthwilson

    Joined:
    Feb 11, 2013
    Posts:
    11
    Thanks for the post on this Jimmya and well done on all the hard work! On a side note, iOS11 beta lets us screen record so I'll be sure to share videos of what I'm "hoping" to create with ARKit and Unity.
     
    jimmya likes this.
  33. rwcjk

    rwcjk

    Joined:
    Jun 6, 2017
    Posts:
    2
Hi, thanks for being so helpful with this Jimmya. I'm the noob's noob at development, and trying to learn in my own time. I am struggling to build just the sample checkered cube scene to my iPad Pro 12.9 inch. I have double-checked that I have 5.6.1p1, Xcode 9 beta, and am running iOS 11 on the iPad. When I go to build the project from Unity, I'm getting a lot of deprecations, which leads me to think I have messed up something in my build settings. Can you see anything here that may be giving me problems?
    Build Settings:
    iOS
    Running in Xcode as Release
    Symlink Unity Libraries (unchecked)
    Development Build (unchecked)
    It feels like I'm asking how to do basic multiplications in an advanced differential equations class here, so I apologize for my noobness. Again thanks for your help on this. Very exciting stuff!
    upload_2017-6-8_10-11-51.png
     
  34. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is not something we've looked into at this time, but it might be possible based on some links you could find online. e.g: https://the-nerd.be/2015/08/20/a-better-way-to-integrate-unity3d-within-a-native-ios-application/
     
    fkatsilidis and KwahuNashoba like this.
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Unfortunately, not directly at this time - we do not have access to the ARKit data from the Unity Editor side. I have seen some developers use FPS controls on their camera in play mode in the editor to simulate being in an environment, etc.
     
  36. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Yes indeed. There are two ways to display the dots: PointCloudParticleExample and PointCloudExample. PointCloudParticleExample uses a particle system, and it generates one particle per point in the point cloud. You could go in and alter the ParticlePrefab it references so that the particles look different. PointCloudExample (currently disabled in the scene) uses a PointCloudPrefab (small yellow sphere) that is instantiated in the world for each point. You can alter the PointCloudPrefab to some other mesh.
     
    ina likes this.
  37. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Glad to help in any way I can. I don't see anything obviously wrong with your setup - you may need to add a camera usage description. What are the "deprecations" you are getting? Could you list them?
     
  38. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
I'll let folks know about this request, but again it would be nice if you could post it on feedback.unity3d.com. Thanks!
     
  39. rwcjk

    rwcjk

    Joined:
    Jun 6, 2017
    Posts:
    2
    upload_2017-6-8_10-57-39.png
    Here is a sample. There are a lot like this. Some of them stating that they were deprecated as far back as iOS 8.0. I feel like there is something simple I'm missing here.
     
  40. Toolism

    Toolism

    Joined:
    Apr 10, 2014
    Posts:
    12
    I've got everything working and it runs great.
    I've got a question though :
1. How would one go about using UI components on top of the AR camera texture? It seems that currently the camera displays just 3D objects mixed with the AR video, but does not take into account any UI objects also present in the scene.

I'm asking because I wanted to have a live update of the intensity value, so I created a Text in the bottom right corner of the screen, and in UnityARAmbient.cs in Update() I was updating it with the intensity value gotten from ARKit, but it does not show up.

Here is the code:

// assume that intensityText has been properly added as a public Text variable up top
float newai = m_Session.GetARAmbientIntensity();
intensityText.text = "intensity : " + newai;


I'm assuming it has to do with the fact that we're basically displaying one big render texture that simply ignores UI.
     
  41. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Pushed a fix to bitbucket: we were hard coding the camera clip planes to .01 and 30 instead of reading the values from the unity camera.

    Thanks to user Fazri for noticing this bug.
     
  42. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Ah I think I see it - you have set your target minimum iOS version to 11 - set it back to 8.0 and things should be better.
     
  43. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I'm guessing there is something wrong with your UI setup, but I can't tell what from your description. If you look in the UnityParticlePainter scene, you should see an example of both IMGUI (the button on top right) as well as uGUI (HSVPicker) that works with the scene.
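For reference, a minimal uGUI setup for an intensity readout like the one described might look roughly like this. This is a sketch: it assumes the Text sits under a Canvas (e.g. Screen Space - Overlay), which is the usual reason uGUI elements fail to appear, and it reuses the GetARAmbientIntensity call quoted earlier in the thread:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.iOS;

// Sketch: display ARKit's ambient light estimate on a uGUI Text.
// The Text must live under a Canvas, otherwise it will not render
// on top of the AR video.
public class AmbientIntensityLabel : MonoBehaviour
{
    public Text intensityText; // assign a Canvas Text in the Inspector

    private UnityARSessionNativeInterface m_Session;

    void Start()
    {
        m_Session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
    }

    void Update()
    {
        float intensity = m_Session.GetARAmbientIntensity();
        intensityText.text = "intensity : " + intensity;
    }
}
```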
     
  44. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    249
I can't get UnityARAmbient.cs to work (Ambient Light Estimation).
I added UnityARAmbient.cs to the demo scene on an empty GameObject and switched off the directional light, but the ambient light level is just as it is in the lighting settings. I can't see the lighting adapt to the environment on an iPhone 7.

    Otherwise really awesome stuff Unity & Apple :) Having experience with Vuforia, I am impressed.
     
  45. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
Hey guys, can you see if you can do real-world pathfinding, or somehow translate the phone's GPS to model space? Then we could use NavMesh to navigate the real world.
     
  46. darthwilson

    darthwilson

    Joined:
    Feb 11, 2013
    Posts:
    11
    My first test with ARKit. Loving it so far!

     
    tedgarage and jimmya like this.
  47. Fazri

    Fazri

    Joined:
    Feb 21, 2013
    Posts:
    2

    Awesome thanks for the quick reply!
     
  48. gfxguru

    gfxguru

    Joined:
    Jun 5, 2010
    Posts:
    107
Very happy to get free SLAM in Unity.
Jimmya, is there any planar marker tracking in ARKit? In many scenarios what we need is simple planar image target tracking.
Apple bought Metaio a few years back, and I am assuming ARKit was made from it, and Metaio had image tracking.
Another issue is the Android equivalent: customers always ask for an iOS/Android app, so with this only on the iOS side it's not complete. Hope Google will come forward with something better and free :)
     
    ina and Aiursrage2k like this.
  49. leewalf

    leewalf

    Joined:
    Jun 8, 2017
    Posts:
    2
    Build in Xcode, ARKit/ARKit.h file not found.
     
  50. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You need Xcode 9 beta
     
    shafqat_jamil likes this.