Intro to Apple ARKit
Special thanks to Tod for chasing down an iPhone 6 for me, and apologies for telling him I needed an iPhone 6 when I actually needed a 6S.
The source code for the demos in this post is available on my GitHub.
Requirements for developing with Apple ARKit
- An iOS device with an A9 or A10 processor, running the iOS 11 Beta. This includes the iPhone 6S (emphasis on the S), iPhone 7, iPad Pro, or iPad (2017).
- A Mac running macOS 10.12.4 or later
- Xcode 9 Beta
https://developer.apple.com/documentation/arkit
To begin, I wanted to see how easy it was to get a bare minimum ARKit app up and running. Fortunately, the default template includes a bare-bones example with a pre-populated model that shows off the positional tracking ability.
Creating a new app is straightforward:
- In Xcode 9, choose File > New > Project.
- Under the iOS tab, select the Augmented Reality App template.
- Choose SceneKit as the content technology, then build and run on a supported device.
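For reference, the view controller the template generates looks roughly like the sketch below. This is a reconstruction rather than a verbatim copy, so treat details like the `art.scnassets/ship.scn` asset path as approximate; note also that in the iOS 11 beta the world tracking configuration class was named ARWorldTrackingSessionConfiguration before being renamed ARWorldTrackingConfiguration.

```swift
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        // The template ships with a sample 3D model to demonstrate tracking.
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking provides six-degrees-of-freedom positional tracking.
        // (Named ARWorldTrackingSessionConfiguration in the iOS 11 beta.)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```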
Demo 1 - Fixed Model
After deploying to the iPad, I could run the demo:
The tracking is surprisingly stable given that the device has far fewer sensors than the HoloLens. The model does jump around a little when there are obstructions or rapid movement.
Demo 2 - Hit Detection
ARKit generates an environment map based on information gathered through the camera. While this environment map does not seem to be directly available like the one the HoloLens supplies, you can perform a raycast hit test to determine the in-world position corresponding to a point on the screen. This demo is an updated version of the previous app, and demonstrates creating an object at a real-world position: the user touches the screen, the app projects that point into the world to determine where it intersects a surface, and a cube is created there.
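A minimal sketch of the hit test, assuming the template's ViewController with its `sceneView` outlet and a tap gesture recognizer added in viewDidLoad. The hit test call and worldTransform handling follow the ARKit API; the cube size and material defaults are my own choices.

```swift
import ARKit
import SceneKit

extension ViewController {
    // Wired to a UITapGestureRecognizer added in viewDidLoad.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let screenPoint = gesture.location(in: sceneView)

        // Raycast from the screen point into ARKit's world model, preferring
        // detected planes and falling back to raw feature points.
        let results = sceneView.hitTest(screenPoint,
                                        types: [.existingPlaneUsingExtent, .featurePoint])
        guard let hit = results.first else { return }

        // The last column of the world transform holds the intersection position.
        let position = hit.worldTransform.columns.3

        // Place a 5 cm cube at the hit position (ARKit units are metres).
        let box = SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0)
        let cube = SCNNode(geometry: box)
        cube.position = SCNVector3(position.x, position.y, position.z)
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```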
The intersection calculation isn't perfect, but once again it is surprisingly good for a single optical camera. As this is still only a beta, I expect it to improve.
Demo 3 - Plane Detection
As with the HoloLens, ARKit provides a mechanism for detecting flat surfaces and providing them to the program as planes. This feature will be useful for many apps, as it provides a flat, bounded environment to work with. Currently ARKit cannot detect vertical planes, so it is limited to surfaces like floors and desks; if wall detection is required, a raycast hit test can be used instead. As more of the environment is scanned, ARKit builds a more complete model and recalculates the planes, performing a callback when it does so. Because it cannot detect walls, it may merge areas even when they are separated by a physical obstacle such as a wall. The following demo displays the planes and updates them as ARKit refines its estimates.
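Under the hood this is driven by the ARSCNViewDelegate callbacks. The sketch below shows the approach, assuming the template's ViewController; the translucent rectangle is just one way to visualise a plane, and plane detection must first be enabled on the session configuration (configuration.planeDetection = .horizontal).

```swift
import ARKit
import SceneKit

// ARSCNViewDelegate callbacks (the template's ViewController already
// declares conformance and sets itself as the scene view's delegate).
extension ViewController {
    // Called when ARKit detects a new horizontal plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Visualise the detected plane as a translucent rectangle.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2 // SCNPlane is vertical by default; lay it flat
        node.addChildNode(planeNode)
    }

    // Called when ARKit recalculates an existing plane estimate.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor,
              let planeNode = node.childNodes.first,
              let plane = planeNode.geometry as? SCNPlane else { return }

        // Resize and recentre the rectangle as the plane grows.
        plane.width = CGFloat(planeAnchor.extent.x)
        plane.height = CGFloat(planeAnchor.extent.z)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
    }
}
```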
At the moment ARKit is not as advanced as the technology we see in the HoloLens, but so far I haven't seen anything take full advantage of that technology, so I believe the two are fairly evenly matched for the moment. ARKit experiences are far less immersive, as they run on a hand-held device, but the cheapest device it can run on is roughly six times cheaper than a HoloLens, and iPhones are common, making for a much lower barrier to entry. I think ARKit will really help Mixed Reality applications take off, especially if Apple release an equivalent of the Samsung Gear VR or Google Daydream so that the user can be immersed.
Unity Demo
Unity have created a plugin that provides a wrapper around the ARKit functions so that they can be used in the Unity engine. The Asset Store plugin provides some example scenes to demonstrate this capability.
https://unity3d.com/learn/tutorials/topics/mobile-touch/building-your-unity-game-ios-device-testing
The Unity build process is similar to the process used for building HoloLens apps. Building in the Unity editor creates an Xcode project, which you then open in Xcode and use to build the iOS app.