Master Image Tracking with ARKit 3 – Part 2

Intro

Check out the other tutorials that are part of this series:

In Part 1, we initialized our Unity project for Augmented Reality builds, installed AR Foundation and ARKit 3 XR packages, went over the fundamental scene setup for Augmented Reality games, and developed an image tracking application. To get us into hyperdrive, let’s add some logic that will cause our AR object to deactivate when our reference images are no longer being tracked.

The complete Unity project for this tutorial is available here. All images and models used in this tutorial have been created by me.

This tutorial was implemented in:

  • Xcode 10.1
  • Unity 2019.3.0a6 Personal

Tutorial Outline

  1. Add Image Tracking Manager

*Building to Android with ARCore is outside the scope of this tutorial; we encourage you to research how to do so if you would like to.

Add Image Tracking Manager

The logic we want to implement is similar to Vuforia’s image targets. Image Targets represent images that Vuforia Engine can detect and track. Unlike traditional fiducial markers, data matrix codes, and QR codes, Image Targets do not need special black and white regions or codes to be recognized. The Engine detects and tracks the features that are naturally found in the image itself by comparing these natural features against a known target resource database. Once the Image Target is detected, Vuforia Engine will track the image as long as it is at least partially in the camera’s field of view. When it is out of view of the camera, the AR interactions end.

Unlike Vuforia, we’ll need a script to handle the logic for when the image is no longer in view of the camera. In our Scripts folder, let’s create a new script and name it TrackedImageManager. Double-click your new C# script to open it. The first thing you’ll need to do is import the XR ARSubsystems and AR Foundation libraries. You’ll need to pull from these libraries to implement your AR functions.
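The using directives at the top of the script would look something like this (a sketch, assuming the AR Foundation and ARKit 3 XR packages installed in Part 1):

```csharp
using UnityEngine;
// AR Foundation: managers and trackable components.
using UnityEngine.XR.ARFoundation;
// XR ARSubsystems: low-level types such as TrackingState.
using UnityEngine.XR.ARSubsystems;
```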

Next, we’ll need to create the “trackable” manager variable for our image. A trackable is anything that can be detected and tracked in the real world. Planes, point clouds, reference points, environment probes, faces, images, and 3D objects are all examples of trackables, and each has its own trackable manager. In our case, we’ll be using the ARTrackedImageManager, which detects and tracks the 2D images within our reference library.

To do so, we’ll create a variable with the type ARTrackedImageManager and name it m_TrackedImageManager.
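A minimal sketch of the class with that variable (attach this script to the same GameObject that holds the ARTrackedImageManager component, typically the AR Session Origin):

```csharp
public class TrackedImageManager : MonoBehaviour
{
    // Reference to the ARTrackedImageManager component on this GameObject.
    ARTrackedImageManager m_TrackedImageManager;
}
```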

This table summarizes the trackable managers and their trackables.

| Trackable Manager | Trackable | Purpose |
| --- | --- | --- |
| ARPlaneManager | ARPlane | Detects flat surfaces. |
| ARPointCloudManager | ARPointCloud | Detects feature points. |
| ARReferencePointManager | ARReferencePoint | Manages reference points. You can manually add and remove them with ARReferencePointManager.AddReferencePoint and ARReferencePointManager.RemoveReferencePoint. |
| ARTrackedImageManager | ARTrackedImage | Detects and tracks 2D images. |
| AREnvironmentProbeManager | AREnvironmentProbe | Creates cubemaps representing the environment. |
| ARFaceManager | ARFace | Detects and tracks human faces. |
| ARTrackedObjectManager | ARTrackedObject | Detects 3D objects. |

To initialize our manager, we’ll use the Awake() method. Unity calls this method after all objects are initialized, so we can safely speak to other objects or query them using, for example, GameObject.FindWithTag. Inside it, we’ll set the tracked image manager variable to GetComponent<ARTrackedImageManager>().
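A sketch of that initialization:

```csharp
void Awake()
{
    // Awake runs after all objects are initialized, so it's safe
    // to grab the component here.
    m_TrackedImageManager = GetComponent<ARTrackedImageManager>();
}
```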

When the trackable manager is enabled, we want to subscribe a method, OnTrackedImagesChanged (which we will create shortly), to the trackedImagesChanged event. That event fires with information about the 2D images that have changed, i.e., been added, updated, or removed. When the trackable manager is disabled, we remove that binding. This simply keeps our code nice and clean.
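The subscribe/unsubscribe pair can be sketched as:

```csharp
void OnEnable()
{
    // Subscribe to change notifications while this component is active.
    m_TrackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
}

void OnDisable()
{
    // Remove the binding so we don't receive events while disabled.
    m_TrackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
}
```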

With that done, we can create the OnTrackedImagesChanged method we bound to the event handler. Its parameter holds the arguments from the change event, i.e., the added, updated, and removed images. Trackables can also be enumerated via their manager with the trackables member.

We’ll loop through this event argument for each tracked image that has been added to give the initial image a reasonable default scale.

The trackables property returns a TrackableCollection, which can be enumerated in a foreach statement as in the above example. You can also query for a particular trackable with the TryGetTrackable method. The foreach statement executes a statement or a block of statements for each element in an instance of a type that implements the System.Collections.IEnumerable or System.Collections.Generic.IEnumerable<T> interface.
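For illustration, a hedged example of looking up a single tracked image by its id (the imageId variable here is hypothetical):

```csharp
// Query the manager's collection for one trackable by TrackableId.
if (m_TrackedImageManager.trackables.TryGetTrackable(imageId, out ARTrackedImage image))
{
    Debug.Log($"Found tracked image: {image.referenceImage.name}");
}
```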

We’ll also loop through the event argument for each tracked image that has been updated and call a method, UpdateGameObject, that checks whether the reference image is still being tracked. This method takes the tracked image as a parameter.

The last thing we want to do in this method is to loop through the event argument for each tracked image that has been removed to destroy the tracked image’s AR object.
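Putting the three loops together, OnTrackedImagesChanged might look like this (the default scale value is an example, not a requirement):

```csharp
void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
{
    foreach (var trackedImage in eventArgs.added)
    {
        // Give the initial image a reasonable default scale (example value).
        trackedImage.transform.localScale = new Vector3(0.01f, 1f, 0.01f);
    }

    foreach (var trackedImage in eventArgs.updated)
    {
        // Activate or deactivate the AR object based on tracking state.
        UpdateGameObject(trackedImage);
    }

    foreach (var trackedImage in eventArgs.removed)
    {
        // Destroy the AR object for images the subsystem no longer reports.
        Destroy(trackedImage.gameObject);
    }
}
```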

When we looped through the event argument for each updated tracked image, we called a method, UpdateGameObject. As stated before, this method checks the tracking state of the tracked image. If the image is currently being tracked by our iOS device, we set the AR object to active. If our iOS device is not tracking the reference image (meaning the image has left the camera’s view), we deactivate the AR object.

To do this, we declare our new method with the return type void, which simply means the method does not return a value. Inside it, we need an if-else statement that compares the tracked image’s tracking state to “tracking.” If it matches, we set the tracked image’s game object to active; otherwise, we deactivate it. There are three tracking states: 1) Tracking, 2) Limited, and 3) None.
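A sketch of that method, assuming the TrackingState enum from UnityEngine.XR.ARSubsystems:

```csharp
void UpdateGameObject(ARTrackedImage trackedImage)
{
    if (trackedImage.trackingState == TrackingState.Tracking)
    {
        // Image is in view: show the AR object.
        trackedImage.gameObject.SetActive(true);
    }
    else
    {
        // State is Limited or None: hide the AR object.
        trackedImage.gameObject.SetActive(false);
    }
}
```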

When a new trackable is detected, its manager will instantiate a prefab configurable on the manager. The instantiated GameObject must have an ARTrackable component for that type of trackable. If the prefab is null, a GameObject with only the relevant ARTrackable will be created. If your prefab does not have the relevant ARTrackable, one will be added.

The complete Tracked Image Manager script is below:
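Here is a sketch of the full script, assembled from the pieces above (written against the AR Foundation API of the ARKit 3 era; the default scale value is an example):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TrackedImageManager : MonoBehaviour
{
    // Reference to the ARTrackedImageManager component on this GameObject.
    ARTrackedImageManager m_TrackedImageManager;

    void Awake()
    {
        m_TrackedImageManager = GetComponent<ARTrackedImageManager>();
    }

    void OnEnable()
    {
        m_TrackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable()
    {
        m_TrackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        foreach (var trackedImage in eventArgs.added)
        {
            // Give the initial image a reasonable default scale.
            trackedImage.transform.localScale = new Vector3(0.01f, 1f, 0.01f);
        }

        foreach (var trackedImage in eventArgs.updated)
        {
            UpdateGameObject(trackedImage);
        }

        foreach (var trackedImage in eventArgs.removed)
        {
            Destroy(trackedImage.gameObject);
        }
    }

    void UpdateGameObject(ARTrackedImage trackedImage)
    {
        if (trackedImage.trackingState == TrackingState.Tracking)
        {
            trackedImage.gameObject.SetActive(true);
        }
        else
        {
            trackedImage.gameObject.SetActive(false);
        }
    }
}
```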

Boom Bam! Thank you, Ma’am! It’s done. Now if you build and run your program on your iOS device, you will find that when the camera tracks the Zenva logo, the designated game object appears above it. If you move the logo within the physical space, your device’s camera tracks the logo as it moves. Just like Vuforia’s Image Target engine, ARKit does not need special black and white regions or codes to recognize the image: it detects and tracks the features naturally found in the image itself by comparing them against our reference image library, and it keeps tracking the image as long as it is at least partially in the camera’s field of view. When the image leaves the camera’s view, the AR object is deactivated.

Conclusion

In this tutorial we learned to:

  • Initialize Unity project for Augmented Reality builds
  • Install AR Foundation and ARKit XR packages
  • Do fundamental scene setup for Augmented Reality games in Unity
  • Use the Image Reference Library for image tracking
  • Deactivate AR object when image reference is no longer being tracked

And there you have it: a simple image tracking program that you can show friends to impress them beyond belief. We’ve taken one step further into the ARverse, but there’s more to pioneer and explore!

In the future, maybe I’ll show you how to do the same for tracking objects using the ARKit 3 XR and Unity’s AR Foundation Plugin.

“Just remember there is only one corner of the ARverse you can be certain of improving, and that’s your own code.” – Morgan H. McKie