With the release of the 6D.ai Reality Platform beta SDK v0.19.1, developers can now customize the ARKit session to enable additional features and integrate them with multiplayer, relocalization, and meshing.
This guide and sample code explore two ways ARKit features integrate within the 6D.ai beta SDK API: Light Estimation with Unity, and Plane Detection with SceneKit.
Attention: this guide was written to help developers who are already familiar with the 6D.ai API, ARKit, and either Unity or SceneKit. We recommend new developers start by getting acquainted with our documentation and basic samples first.
The 6D.ai beta SDK for iOS uses ARKit's tracking and video frames as input to generate meshes and relocalized poses, which are exposed to AR applications through the API methods of SixDegreesSDK.h.
New API methods were added in a separate expert header, SixDegreesSDK_advanced.h. They allow the host application to configure the ARKit session with an ARWorldTrackingConfiguration object, and to access the ARKit session through its ARSession object.
The ARSession object gives access to the latest frame, its pixel buffer, and optionally AR anchors if enabled by the configuration.
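As a rough sketch of what such a custom configuration could look like: the ARKit types and properties below are standard, but the SDK call that hands the configuration to 6D.ai is a placeholder, so check SixDegreesSDK_advanced.h for the actual function name and signature.

```objc
#import <ARKit/ARKit.h>
#import "SixDegreesSDK_advanced.h" // beta SDK expert header

// Build a custom world-tracking configuration before starting the SDK.
ARWorldTrackingConfiguration *config = [ARWorldTrackingConfiguration new];
config.planeDetection = ARPlaneDetectionHorizontal | ARPlaneDetectionVertical;
config.lightEstimationEnabled = YES; // standard ARKit property

// Hand the configuration to the SDK. The function name below is a
// placeholder; see SixDegreesSDK_advanced.h for the real API:
// SixDegreesSDK_SetCustomARKitConfiguration(config);
```

Once the session runs with this configuration, the ARSession's current frame will carry light estimates and plane anchors for the host application to consume.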
In Unity projects, native iOS code and libraries must be placed in Assets/Plugins/iOS. There, you can put an Objective-C script that interacts with the SixDegreesSDK headers and the ARKit API.
In this project, LightEstimation.mm exposes two functions that are accessible from C#, one of them
GetAmbientColorTemp(). They are used by UnityARKitLightManager.cs, a component to attach to a GameObject containing a light; it uses the intensity and color temperature to update that light every frame.
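A minimal sketch of what such a plugin function might look like, reading the light estimate from the session's current frame. The `GetCurrentARSession()` accessor is an assumption standing in for however the app obtains the ARSession exposed through SixDegreesSDK_advanced.h; the ARKit calls themselves are standard.

```objc
#import <ARKit/ARKit.h>

// Unity plugin sketch: returns the ambient color temperature (in Kelvin)
// of the latest ARKit frame, for use from C# via [DllImport("__Internal")].
// GetCurrentARSession() is a hypothetical accessor for the SDK's ARSession.
extern "C" float GetAmbientColorTemp() {
    ARSession *session = GetCurrentARSession(); // assumption: session accessor
    ARLightEstimate *estimate = session.currentFrame.lightEstimate;
    if (!estimate) {
        return 6500.0f; // neutral daylight default when no estimate is available
    }
    return (float)estimate.ambientColorTemperature;
}
```

On the C# side, the component would poll this function each frame and write the result into the light's color temperature property.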
In the sample scene, there is only one directional light, which makes the rotating 6D.ai logos lighter or darker depending on the real-life lighting conditions.
This project requires Unity 2018.3 or above. Remember to add your own SixDegreesSDK.plist file when building this project.
Light scene: the virtual objects are illuminated with a brighter directional light.
Dark scene: the virtual objects are less illuminated and appear darker.
In this Objective-C/SceneKit project, a custom ARWorldTrackingConfiguration object is created to enable plane detection, which is otherwise disabled by default. Specifically, detection of horizontal and vertical planes is enabled.
At every frame, a PlaneController class uses the ARAnchor list of the session's current frame to find planes and update their geometry in SceneKit. Planes are described in local ARKit coordinates, which differ from the 6D coordinates if the application has relocalized. The object tree of the plane controller addresses that issue: the root node contains all planes, and its transform is the matrix obtained from
SixDegreesSDK_GetARKitTransform(). All child nodes of this root node can keep transforms expressed in local ARKit coordinates and still render at the right location in the world.
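This parenting trick can be sketched as follows. The SceneKit and simd calls are standard; the assumption here is that SixDegreesSDK_GetARKitTransform() fills a column-major float[16] (check SixDegreesSDK.h for its exact signature), and `scene` stands for the app's SCNScene.

```objc
#import <ARKit/ARKit.h>
#import <SceneKit/SceneKit.h>
#import "SixDegreesSDK.h"

// Parent all ARKit-space content under a root node that carries the
// ARKit-to-6D transform, so children keep their local ARKit coordinates.
// Assumption: the SDK call fills a column-major float[16].
float m[16];
SixDegreesSDK_GetARKitTransform(m);

simd_float4x4 arkitToWorld;
memcpy(&arkitToWorld, m, sizeof(arkitToWorld)); // simd matrices are column-major

SCNNode *planeRoot = [SCNNode node];
planeRoot.simdTransform = arkitToWorld;
[scene.rootNode addChildNode:planeRoot]; // `scene` is the app's SCNScene

// Each detected plane node (built from an ARPlaneAnchor) is then added as a
// child, with its transform left in ARKit coordinates:
// planeNode.simdTransform = planeAnchor.transform;
// [planeRoot addChildNode:planeNode];
```

Updating only the root node's transform after a relocalization keeps every plane aligned without touching the individual plane nodes.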
This sample scene also contains the 6D Mesh, providing an easy way to compare the two methods of physical geometry estimation.
Remember to add your own SixDegreesSDK.plist file when building this project.
Planes from ARKit are rendered in white, while the 6D mesh is rendered in color.