
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    We are looking into this.
     
    mimminito likes this.
  2. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey @rockstarsaad,
    I don't think you are misunderstanding anything. Though I wonder if you want to actually tie your gameObject position to the anchor that was detected. I'm not sure what you are trying to accomplish, but more information might allow us to help you more. I am working on the API to manually add/remove anchors now. And as I mentioned earlier, you can already turn off automatic detection through the API we currently provide.
    Cheers,
    Chris
     
  3. Teafela

    Teafela

    Joined:
    Jan 31, 2017
    Posts:
    3
    Is it possible to render the ARKit video feed to a Quad in the world?

    I have tried to assign the material that gets created by the sample code to a plane, but this does not seem to work (the quad becomes white). I've got stereo working for roomscale VR, so now I just need the video feed in the right place ;)
     
  4. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey @Teafela,
    You should be able to do this, but changes may be required. As of now we are doing a command buffer blit in the PreRender phase of the camera to render the video feed. I imagine that if you did almost the same thing as the UnityARVideo component, but instead of a command buffer blit, just update the material appropriately, you could get the video to render on a quad. Right now we also handle orientation changes in the shader so your mileage may vary depending on what you want from the YUVShader. When I have some time today I’ll see if I can get it to work and post my results.
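In sketch form, the idea could look something like this (untested; assumes you can reuse the YUV material the sample creates, and that blitting with it resolves the camera feed):

```csharp
using UnityEngine;

// Hedged sketch: instead of UnityARVideo's command-buffer blit to the
// screen, blit the same YUV material into a RenderTexture each frame and
// show that on a quad. "videoMaterial" is assumed to be the material the
// sample code creates (the one using the YUVShader).
public class ARVideoOnQuad : MonoBehaviour
{
    public Material videoMaterial; // the sample's YUV material (assumption)
    public Renderer quadRenderer;  // the quad to show the feed on

    RenderTexture feedTexture;

    void Start()
    {
        feedTexture = new RenderTexture(Screen.width, Screen.height, 0);
        quadRenderer.material.mainTexture = feedTexture;
    }

    void LateUpdate()
    {
        // The YUV material combines the camera's luma and chroma planes;
        // blitting with it writes the combined feed into our RenderTexture.
        Graphics.Blit(null, feedTexture, videoMaterial);
    }
}
```

Since the shader also handles orientation, the quad's UVs may need flipping depending on device orientation.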
    Cheers,
    Chris
     
    4NDDYYYY likes this.
  5. Teafela

    Teafela

    Joined:
    Jan 31, 2017
    Posts:
    3
    Hey @christophergoy

    Cool! I'm not too familiar with the command buffer bits yet. I'll also give it another try later today (this stuff really takes a while since you have to build all the way to the phone every time).

    Let me know how your experiments turn out!

    Best,
    Ron
     
  6. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You do have to build to the phone every time, but you can reduce your build times by doing a "Build and Run", and "Append" to existing Xcode project.
     
  7. Joyceawt

    Joyceawt

    Joined:
    Feb 23, 2017
    Posts:
    11
    I'm trying to test out the possibility of detecting whether a 3D object projected through ARKit is actually there in real life. For example, let's say that I've used ARKit to project a chair. Is it possible to somehow check if there is a corresponding match of a real-life chair through the phone's camera? Would using the point cloud data perhaps be an option?
     
  8. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It's theoretically possible, but probably in the realm of "really hard to compute from arbitrary point clouds". Maybe you could simplify the problem statement somehow: e.g. is all the point cloud data within this area within a threshold of the surface of this cube? Or possibly use some library that can extract meshes from point clouds to compare with?
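The simplified check could be sketched like this (a rough illustration, not plugin API; the names and the 80% ratio are arbitrary assumptions):

```csharp
using UnityEngine;

// Hedged sketch of the simplified check: given the frame's feature points
// (world-space positions), count how many fall within a threshold of a
// cube's bounds. This treats "near the surface" loosely as "inside the
// bounds expanded by the threshold".
public static class PointCloudProbe
{
    public static bool MostlyInside(Vector3[] points, Bounds cube,
                                    float threshold, float ratio = 0.8f)
    {
        if (points == null || points.Length == 0) return false;

        Bounds expanded = cube;
        expanded.Expand(threshold * 2f); // grow by 'threshold' on each side

        int inside = 0;
        foreach (var p in points)
            if (expanded.Contains(p)) inside++;

        return inside >= ratio * points.Length;
    }
}
```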
     
  9. xvart

    xvart

    Joined:
    Dec 12, 2010
    Posts:
    21
    Hey @jimmya

    I have been playing with the plugin and it is very helpful. I have already made a couple of mini experiences. However, I am seeing an issue with one of my GameObjects. It is a child of HitCubeParent and has the Unity AR Hit Test Example script with its reference set, but unlike the others (siblings of said GameObject) this one is not tracked and stays in a fixed location relative to the camera...

    I am not seeing anything weird or missing... any clues?
     
  10. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Is it maybe scaled real large and placed really far away? BTW, if you turn around, does the GO go behind you?
     
  11. xvart

    xvart

    Joined:
    Dec 12, 2010
    Posts:
    21
    No, the size is as expected (though it has a small scale of 0.0254, now that you mention it). And if I turn back, it comes with me. I am taking out objects/components to see if there is something weird (custom scripts) making it behave badly...
     
  12. xvart

    xvart

    Joined:
    Dec 12, 2010
    Posts:
    21
    @jimmya Now I feel stupid: apparently I fat-fingered it into oblivion, and it only seemed like it was coming with me because I was limited by the cable length. But the bottom line is: "Is it maybe scaled real large and placed really far away?" was the correct tip!
     
  13. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Time to invest in:
    Just kidding - maybe try out the WiFi deploy in Xcode 9?
     
    RJproz, KwahuNashoba and xvart like this.
  14. Joyceawt

    Joyceawt

    Joined:
    Feb 23, 2017
    Posts:
    11
    Hi @jimmya! Thanks for your response. Both are something that I'd be interested in. For the first method, would I just have to specify the lengths and widths? Would you have any idea of libraries that could do that?
     
  15. lingoded

    lingoded

    Joined:
    Aug 22, 2015
    Posts:
    8
    Good day @jimmya! Is there a way to extend the distance for detecting point cloud data? E.g., as with a camera's clipping planes. I saw your update on Bitbucket that removes the fixed matrix length. Does this affect the distance at which points can be detected? Or does it just affect the draw calls on already placed objects?
     
  16. Arvin6

    Arvin6

    Joined:
    Jun 2, 2017
    Posts:
    27
    Filed a bug report. The number is 921528. I can't attach the Xcode project there since the bug reporter crashes when I try to upload. Anyway, I haven't changed anything in the code from Bitbucket, and it still doesn't work on my iPhone 6s. Please let me know if you require the Xcode project. I have been thinking about working on MR with this plugin, but I couldn't run the demo yet. @jimmya It'd be cool if you could look into it.

    Xcode Log:
    Code (CSharp):
    2017-06-16 17:55:59.849249+0530 arkitscene[862:199257] [DYMTLInitPlatform] platform initialization successful

    2017-06-16 17:56:00.009994+0530 arkitscene[862:199085] -> registered mono modules 0x1010c79d0
    2017-06-16 17:56:00.669139+0530 arkitscene[862:199085] You've implemented -[<UIApplicationDelegate> application:didReceiveRemoteNotification:fetchCompletionHandler:], but you still need to add "remote-notification" to the list of your supported UIBackgroundModes in your Info.plist.
    -> applicationDidFinishLaunching()
    PlayerConnection initialized from /var/containers/Bundle/Application/63DF8454-0DBD-4BA8-B4E8-529AE58CBFAE/arkitscene.app/Data (debug = 0)
    PlayerConnection initialized network socket : 0.0.0.0 55000
    Multi-casting "[IP] 192.168.250.31 [Port] 55000 [Flags] 3 [Guid] 3877020045 [EditorId] 2006270493 [Version] 1048832 [Id] iPhonePlayer(iPhone):56000 [Debug] 1" to [225.0.0.222:54997]...
    Waiting for connection from host on [0.0.0.0:55000]...
    Timed out. Continuing without host connection.
    Started listening to [0.0.0.0:55000]
    Player data archive not found at `/var/containers/Bundle/Application/63DF8454-0DBD-4BA8-B4E8-529AE58CBFAE/arkitscene.app/Data/data.unity3d`, using local filesystem
    2017-06-16 17:56:05.825608+0530 arkitscene[862:199085] Metal GPU Frame Capture Enabled
    2017-06-16 17:56:05.827496+0530 arkitscene[862:199085] Metal API Validation Disabled
    2017-06-16 17:56:06.129992+0530 arkitscene[862:199085] libMobileGestalt MobileGestaltSupport.m:153: pid 862 (arkitscene) does not have sandbox access for frZQaeyWLUvLjeuEK43hmg and IS NOT appropriately entitled
    2017-06-16 17:56:06.130251+0530 arkitscene[862:199085] libMobileGestalt MobileGestalt.c:555: no access to InverseDeviceID (see <rdar://problem/11744455>)
    -> applicationDidBecomeActive()
    GfxDevice: creating device client; threaded=1
    Initializing Metal device caps: Apple A8 GPU
    Initialize engine version: 5.6.1p1 (74c1f4917542)
    =================================================================
    Main Thread Checker: UI API called on a background thread: -[UIApplication delegate]
    PID: 862, TID: 199292, Thread name: (none), Queue name: NSOperationQueue 0x1c003aa40 :: NSOperation 0x1c4440cc0 (QOS: DEFAULT), QoS: 21
    Backtrace:
    4   arkitscene                          0x0000000100111408 UnityCurrentOrientation + 60
    5   arkitscene                          0x00000001000ff08c __UnityCoreMotionStart_block_invoke + 108
    6   Foundation                          0x0000000186ea2de4 <redacted> + 16
    7   Foundation                          0x0000000186de32e0 <redacted> + 96
    8   Foundation                          0x0000000186dd3280 <redacted> + 620
    9   Foundation                          0x0000000186ea574c <redacted> + 228
    10  libdispatch.dylib                   0x00000001a87cc2bc <redacted> + 16
    11  libdispatch.dylib                   0x00000001a87d657c <redacted> + 532
    12  libdispatch.dylib                   0x00000001a87d6fa8 <redacted> + 332
    13  libdispatch.dylib                   0x00000001a87d7b28 <redacted> + 388
    14  libdispatch.dylib                   0x00000001a87e0368 <redacted> + 864
    15  libsystem_pthread.dylib             0x00000001a8a631e8 _pthread_wqthread + 924
    16  libsystem_pthread.dylib             0x00000001a8a62e40 start_wqthread + 4
    2017-06-16 17:56:11.212233+0530 arkitscene[862:199292] [reports] Main Thread Checker: UI API called on a background thread: -[UIApplication delegate]
    PID: 862, TID: 199292, Thread name: (none), Queue name: NSOperationQueue 0x1c003aa40 :: NSOperation 0x1c4440cc0 (QOS: DEFAULT), QoS: 21
    Backtrace:
    4   arkitscene                          0x0000000100111408 UnityCurrentOrientation + 60
    5   arkitscene                          0x00000001000ff08c __UnityCoreMotionStart_block_invoke + 108
    6   Foundation                          0x0000000186ea2de4 <redacted> + 16
    7   Foundation                          0x0000000186de32e0 <redacted> + 96
    8   Foundation                          0x0000000186dd3280 <redacted> + 620
    9   Foundation                          0x0000000186ea574c <redacted> + 228
    10  libdispatch.dylib                   0x00000001a87cc2bc <redacted> + 16
    11  libdispatch.dylib                   0x00000001a87d657c <redacted> + 532
    12  libdispatch.dylib                   0x00000001a87d6fa8 <redacted> + 332
    13  libdispatch.dylib                   0x00000001a87d7b28 <redacted> + 388
    14  libdispatch.dylib                   0x00000001a87e0368 <redacted> + 864
    15  libsystem_pthread.dylib             0x00000001a8a631e8 _pthread_wqthread + 924
    16  libsystem_pthread.dylib             0x00000001a8a62e40 start_wqthread + 4
    2017-06-16 17:56:12.088372+0530 arkitscene[862:199085] [] <<<< FigVirtualFramebufferServer >>>> FigVirtualFramebufferGetMaxCount: unsupported operation
    2017-06-16 17:56:12.088777+0530 arkitscene[862:199085] [] <<<< FigVirtualFramebufferServer >>>> FigVirtualFramebufferGetFramebufs: unsupported operation
    2017-06-16 17:56:12.088813+0530 arkitscene[862:199085] [] <<<< FigVirtualFramebufferInstallation >>>> FigInstallVirtualDisplay_block_invoke: FigVirtualFramebufferGetFramebufs failed (0 framebufs)
    UnloadTime: 5.286750 ms
    2017-06-16 17:56:13.139654+0530 arkitscene[862:199085] [Session] Unable to run the session, configuration is not supported on this device: <ARWorldTrackingSessionConfiguration: 0x1c0283340 planeDetection=Horizontal worldAlignment=Gravity lightEstimation=Enabled>
    2017-06-16 17:56:13.179567+0530 arkitscene[862:199246] [] network_config_register_boringssl_log_debug_updates Failed to register for BoringSSL log debug updates
    2017-06-16 17:56:13.214824+0530 arkitscene[862:199085] AR FAIL
    2017-06-16 17:56:13.291080+0530 arkitscene[862:199246] [] network_config_register_boringssl_log_debug_updates Failed to register for BoringSSL log debug updates
    2017-06-16 17:56:14.210465+0530 arkitscene[862:199290] [] network_config_register_boringssl_log_debug_updates Failed to register for BoringSSL log debug updates
     
  17. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I see one possible problem right away: it seems to think you do not have an iPhone 6s:
    Initializing Metal device caps: Apple A8 GPU
     
    Arvin6 likes this.
  18. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Point cloud detection (or feature points, as they are known in computer vision) is independent of the virtual (Unity) camera specs. It uses computer vision on a set of video frames (mostly noting how pixels move) to figure out where in the real world certain points exist. ARKit does this, and it will be hard for it to extend the range, since things that are far away have far fewer (or no) pixels to see move.
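If you want to inspect the feature points the plugin surfaces, a minimal sketch (assuming the plugin's ARFrameUpdatedEvent and the pointCloudData field on UnityARCamera, as used by the point cloud example scene; the namespace may differ by plugin version):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // plugin namespace (may differ by plugin version)

// Hedged sketch: log how many feature points ARKit reported this frame.
public class FeaturePointLogger : MonoBehaviour
{
    void Start()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrameUpdate;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrameUpdate;
    }

    void OnFrameUpdate(UnityARCamera cam)
    {
        Vector3[] points = cam.pointCloudData; // world-space feature points
        if (points != null)
            Debug.Log("Feature points this frame: " + points.Length);
    }
}
```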
     
    lingoded likes this.
  19. Arvin6

    Arvin6

    Joined:
    Jun 2, 2017
    Posts:
    27
    I just noticed that. But I do have an iPhone 6s. What would be the possible solution, and why would that happen? I'm confused. I'm quite new to Unity and iOS development. Also, my other project with Vuforia runs fine on this device, so I think it's specific to this Xcode project.
     
    Last edited: Jun 16, 2017
  20. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    Has anyone found a way to save and restore a session? The HoloLens and Tango have concepts like world anchors and spaces to allow objects to persist beyond a single application launch. As near as I can tell, ARKit doesn't have an equivalent concept.
     
  21. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    This is not available in ARKit at this time.
    Cheers,
    Chris
     
  22. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi @Arvin6,
    The log tells a different story about your device. If you go into the Settings app, tap on General, and then About, you will see a model number. You can type that into Google and see what model your phone is. I'm guessing that it is an iPhone 6, which is not supported by ARKit.
    Cheers,
    Chris
     
  23. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi @Joyceawt,
    Detecting specific objects in a video stream can be done with Apple's machine learning APIs. You would have to train your neural network on the objects that you'd like to detect in order to know that they are within the camera view. This is outside the scope of ARKit. You can find an overview of the APIs here: https://developer.apple.com/machine-learning/ The parts that might interest you are the Core ML and Vision APIs.
    Cheers,
    Chris
     
  24. petey

    petey

    Joined:
    May 20, 2009
    Posts:
    1,824
    Hey this is really awesome! Thanks Unity people for getting onto this so quick. :)
    One thing, I had trouble building with AR plus the Post Process Stack. It just gets stuck on one section of the build and hangs.
     
    jimmya likes this.
  25. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Is it getting stuck when you try to build? If so, could you look at the editor log and check if there's something funky going on there. If you can't figure it out, post the log here and we can take a look.
    Having said that, many of the Post Process Stack effects can be pretty expensive on mobile, so you might want to be judicious in your use of them.
     
  26. petey

    petey

    Joined:
    May 20, 2009
    Posts:
    1,824
    Yeah, I was getting carried away :D but the post processing seems to work alright on newer phones.
    I'll try again later and see what the actual error is for you.

    Thanks!
    Pete
     
  27. Joyceawt

    Joyceawt

    Joined:
    Feb 23, 2017
    Posts:
    11
    @christophergoy Would you happen to know if I can integrate the machine learning API with ARKit running in Unity, though? From the docs it seems like it needs to be done in Xcode? Also, does this mean point cloud data is not an option?
     
    Last edited: Jun 17, 2017
  28. FusedVR

    FusedVR

    Joined:
    May 27, 2017
    Posts:
    9
    Thanks for the awesome work @jimmya and Unity! To anyone interested, our team created a tutorial showing how to build a simplified version of the paint demo from scratch. Anyone know of other communities that are great for people learning about ARKit?
     
  29. petey

    petey

    Joined:
    May 20, 2009
    Posts:
    1,824
    Nice one FusedVr!
    Hey, since it takes a little bit of movement before ARKit finds a plane, I'm trying to think of a good thing to tell the user at that point.
    Anything I think of sounds a bit ridiculous :/
    Any ideas?
     
  30. FusedVR

    FusedVR

    Joined:
    May 27, 2017
    Posts:
    9
    Thank you @petey! My suggestion is to apply a translucent gray/white overlay on top of the camera view, then fade that overlay away once you've found enough feature points. Actually, we'll probably do that in an upcoming tutorial now that I think about it! What do you think @jimmya?
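The overlay idea could be sketched like this (a rough illustration; "pointCount" would be fed from wherever you read ARKit's point cloud, and the threshold of 20 points is an arbitrary assumption):

```csharp
using UnityEngine;

// Hedged sketch: fade out a full-screen translucent overlay once enough
// feature points have been seen, as a "keep scanning" hint for the user.
public class ScanningOverlay : MonoBehaviour
{
    public CanvasGroup overlay;     // translucent gray/white image over the camera view
    public int pointThreshold = 20; // arbitrary notion of "enough" feature points
    public float fadeSpeed = 2f;

    [HideInInspector] public int pointCount; // set externally each frame

    void Update()
    {
        // Show the overlay at 60% alpha until the threshold is met, then fade out.
        float target = pointCount >= pointThreshold ? 0f : 0.6f;
        overlay.alpha = Mathf.MoveTowards(overlay.alpha, target, fadeSpeed * Time.deltaTime);
    }
}
```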
     
  31. EricYen

    EricYen

    Joined:
    Jun 6, 2014
    Posts:
    7
  32. EricYen

    EricYen

    Joined:
    Jun 6, 2014
    Posts:
    7
    So... ARKit can not run on iPad Air 2 (2017)????

    I have the same problem, the blue screen...

    Xcode 9 beta OK!
    Unity 5.6.1 OK!
    iPad Air 2 iOS 11 OK!


    Oh.....no.....
     
  33. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
    Okay, I'm going to try to make the pathfinder today, 'cause I live close to a mall and always walk around trying to find stores, so now I'll have a solution!
     
  34. Grislymanor

    Grislymanor

    Joined:
    Aug 15, 2014
    Posts:
    23
    Check out this Labyrinth Demo using Unity ARKit:
     
    Gametyme, petey and jimmya like this.
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is great work, Hassan! Your video tutorial is quite clear in explaining all the details. Thank you for supporting this plugin!
     
  36. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Yeah, that would work, but I think even better would be to encourage the user to move the device around using some sort of object from your experience. E.g. follow that bird - it might lead you to a worm!
     
    JoeStrout likes this.
  37. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please check the actual model number in the settings on your device. What is the "(2017)" that you put at the end of your iPad Air 2? From Wikipedia, it appears the iPad Air 2 was discontinued in 2017, and it has an A8X GPU, which is not supported by ARKit. If your device does actually have an A9 or better, please check your console in Xcode to see if there is another reason ARKit was not able to start.
     
  38. KwahuNashoba

    KwahuNashoba

    Joined:
    Mar 30, 2015
    Posts:
    110
    Had any difficulties with those darker parts of the room? I noticed some when I was testing the kit.
     
  39. Nedja

    Nedja

    Joined:
    Apr 1, 2014
    Posts:
    3
    Hello. How can I get the camera intrinsics parameters? I see that member in the ARCamera structure in the API.
     
  40. jpvanmuijen

    jpvanmuijen

    Joined:
    Aug 23, 2012
    Posts:
    13
    Hi - thanks for creating this plugin, works like a charm!
    I'm taking the RandomCube example as a starting point and building a world around its initial position. However, I'd like the user to be able to reposition this world based on his/her current position in the real world, effectively allowing a re-calibration.
    So I'm trying to figure out what/how virtual content 'moves' when I walk around with my iPad. I assumed it was the CameraParent, but using a button to reset its position to 0,0,0 does nothing.
    Another way would probably be to restart the app via a button, but that seems a bit much.
    Thanks in advance!
     
  41. Grislymanor

    Grislymanor

    Joined:
    Aug 15, 2014
    Posts:
    23
    I believe we noticed some problems - but a workaround might be having some light at the end of the hallway (for something to track). Honestly, our filming location wasn't ideal - we were just excited to see it work.
     
  42. xgraves

    xgraves

    Joined:
    Jun 19, 2017
    Posts:
    1
    Hey, thanks for making this plugin and supporting it here in these forums, that's really awesome. I downloaded and read all the instructions, and it's working great.
     
  43. leewalf

    leewalf

    Joined:
    Jun 8, 2017
    Posts:
    2
    Running the demo on the iPad Air 2, the resolution is too low and it's hard to detect planes. But opening the system's default camera, the image is very clear.
    Is it because the device is too old and doesn't support ARKit?
     
  44. zeb33

    zeb33

    Joined:
    Nov 17, 2014
    Posts:
    95
    Am loving the new ARKit! I've attached a navmesh surface to the plane prefab; in theory, I thought my agent should then be able to navigate. On the iPad I see the agent on the plane prefab, but the console still gives "Failed to create agent because there is no valid NavMesh".
    Any ideas or help on runtime navmesh within ARKit would be much appreciated.
     
  45. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Hi Nedja,
    We don't actually send those parameters over from ARKit into our plugin currently. If you really need them, we can try to set that up for you. But this is really a parameter that is used to figure out the camera projection matrix, so that the virtual camera matches up with the real camera on the device.
     
  46. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Since whenever it starts, it sets the camera to 0,0,0. What we should do is create all the GameObjects under another parent GameObject, and when the camera needs to start at a different position, instead move the world (i.e. the root GameObject) by the inverse of the camera position.
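In code, that "move the world, not the camera" idea might look like this (a minimal sketch; worldRoot and the method name are placeholders, not plugin API):

```csharp
using UnityEngine;

// Hedged sketch: ARKit always starts the device camera at the origin.
// To make it appear as if the camera started somewhere else, offset the
// root of all virtual content by the inverse of that position instead.
public class WorldRecenter : MonoBehaviour
{
    public Transform worldRoot; // single parent of all virtual content

    // Call e.g. from a UI button with the position the camera "should"
    // have started at; the world shifts so everything lines up.
    public void ApplyCameraStart(Vector3 desiredCameraStart)
    {
        worldRoot.position = -desiredCameraStart;
    }
}
```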
     
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Yes, there is no support on processors older than the A9.
     
    Last edited: Jun 20, 2017
  48. TheBlackBox

    TheBlackBox

    Joined:
    Jan 13, 2014
    Posts:
    33
    Hi, I've just got into using the ARKit, and I'm curious about some possibilities/restrictions with the kit.

    Would it be possible to convert a plane into a collider? To make room for physics in AR games. If yes, how would this be accomplished?

    Thanks!
    Brandon
     
  49. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Yes, it's quite possible, and it is already being done in the examples. The "GeneratePlanes" GameObject allows you to specify a prefab. The prefab that is currently specified is the debugPlanesPrefab, which contains the mesh you see as well as a collider of the same size. If you look at the UnityARBallz example, we use this collider to hold up the balls you generate in the virtual world. These colliders will work with anything else in the scene as described here.

    In addition to this, the ARKit sdk also provides a HitTest API on the planes that it detects (among other things), and we expose the HitTest API in the plugin.
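A hit test against detected planes might look roughly like this, modeled on the plugin's UnityARHitTestExample (the exact namespace and enum names may differ by plugin version):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // plugin namespace (may differ by plugin version)

// Hedged sketch: on touch, hit-test against detected planes and move an
// object to the first hit.
public class TapToPlace : MonoBehaviour
{
    public Transform objectToPlace;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // ARKit expects normalized viewport coordinates.
        Vector3 vp = Camera.main.ScreenToViewportPoint(touch.position);
        ARPoint point = new ARPoint { x = vp.x, y = vp.y };

        List<ARHitTestResult> results = UnityARSessionNativeInterface
            .GetARSessionNativeInterface()
            .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

        if (results.Count > 0)
        {
            // Place the object at the first hit on a detected plane.
            objectToPlace.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
            objectToPlace.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
        }
    }
}
```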
     
  50. lusas

    lusas

    Joined:
    Mar 18, 2015
    Posts:
    6
    @jimmya I was wondering about occlusion and generating vertical planes.

    I noticed that UnityARSessionNativeInterface does not fully match ARSessionNative.mm and excludes UnityARPlaneDetectionVertical = (1 << 1). What is the reason for that, and how could I generate both vertical and horizontal planes at the same time?
     