Using Native ARKit Object Tracking in Unity

Hello,

Has anyone had success with implementing object tracking in Unity or adding native tracking capability to the VisionOS project built from Unity?

I am working on an application for Vision Pro, mainly in Unity using PolySpatial. The application requires me to track objects and make decisions based on each tracked object's location. I was able to create an object tracking application in native Swift, but have not yet been able to combine it with my Unity project. Each project on its own (the main Unity app using PolySpatial, and the native Swift app) builds and deploys successfully to Vision Pro.

I know that PolySpatial and AR Foundation do not support ARKit's object tracking feature on Vision Pro as of today; they only support image tracking inside Unity. For that reason I have been exploring different ways of creating a bridge for two-way interaction between the native tracking functionality and the rest of the Unity app.
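For reference, the Unity-to-native direction is a plain P/Invoke call, while the native-to-Unity direction needs an IL2CPP-safe static callback. Below is a minimal sketch of that callback direction; RegisterObjectPoseCallback is a hypothetical native function that the Swift side would need to implement and export under that symbol:

using System.Runtime.InteropServices;
using AOT;
using UnityEngine;

public static class NativeTrackingCallbacks
{
    // Signature shared with the native side for per-update pose callbacks.
    private delegate void ObjectPoseCallback(float x, float y, float z);

    // Kept in a static field so the delegate is never garbage collected
    // while native code still holds the function pointer.
    private static readonly ObjectPoseCallback s_callback = OnObjectPose;

    // Hypothetical native registration function; the Swift side would need
    // to implement and export a matching symbol.
    [DllImport("__Internal")]
    private static extern void RegisterObjectPoseCallback(ObjectPoseCallback callback);

    // IL2CPP requires reverse callbacks to be static methods marked with
    // [MonoPInvokeCallback] so they can be ahead-of-time compiled.
    [MonoPInvokeCallback(typeof(ObjectPoseCallback))]
    private static void OnObjectPose(float x, float y, float z)
    {
        Debug.Log($"Tracked object at ({x}, {y}, {z})");
    }

    public static void Register()
    {
#if UNITY_VISIONOS && !UNITY_EDITOR
        RegisterObjectPoseCallback(s_callback);
#endif
    }
}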

Below are the methods I have tried so far, without success:

  1. Package the tracking functionality as a Swift plugin and access it in Unity, then build for Vision Pro: I can create packages and access them for simple exposed variables and methods, but not for outputs and methods that depend on ARKit, which throw dependency errors while building the Swift package.
  2. Build the project from Unity for Vision Pro, expose a boolean to start/stop tracking that can be read by the native code, and then carry the tracking classes into the built project. With this approach I keep getting an error saying that _TrackingStateChanged cannot be found; this is the extern method that passes along the bool toggled by the Unity button press:
using System.Runtime.InteropServices;

public class UnityBridge
{
    // Maps to a native function that must be exported from the Swift side
    // under the exact symbol name "TrackingStateChanged".
    [DllImport("__Internal")]
    private static extern void TrackingStateChanged(bool isTracking);

    public static void NotifyTrackingState()
    {
        // Call the Swift method with the current tracking state
        TrackingStateChanged(TrackingStartManager.IsTrackingActive());
    }
}

This seems to be translated to C++ code in the IL2CPP output from Unity, and even though I made sure that all necessary packages were added to the target, I keep receiving this error from the UnityFramework plugin:

Undefined symbol: _TrackingStateChanged

  3. I have considered extending the current image tracking approach in AR Foundation to include object tracking, but that seems too complicated for my use case and time frame for now.
  4. The final resort would be to forgo the Unity implementation and do everything in native code. However, I really want to be able to use Unity's conveniences, and I have very limited experience with Swift development.

We are working towards adding object tracking support for visionOS 2.0 to a future version of PolySpatial.
We hit some technical roadblocks in development with the macOS 15 beta needed to run the Create ML jobs that train on the USDZ files required by visionOS 2.0 object tracking. Those blockers have been removed, so we have moved back to continuing work on this.


This is great news. visionOS 2.0 also added an Enterprise camera entitlement that allows raw camera access for developer builds through ARKit. Are there any plans to expose that camera access through PolySpatial/AR Foundation? If so, how would integrating the plist or entitlement file into the Unity build work?

Also, do you have any suggestions on how to integrate object tracking until PolySpatial is updated?

Thank you very much.

Glad to hear that. Can you please let me know if there is a tentative ETA for this feature? Or any workaround to get it working as a Swift plugin with Unity?


I spent quite a bit of time trying to put together a plugin that would allow object tracking to work in Unity, and I believe I got close, but eventually gave up since there were other pressing priorities. There are many, many authorizations and providers that need to be started before object tracking can work, and it was a pain to transfer those into Unity successfully.

I got better results by building the Unity project out to Xcode and then integrating object tracking natively there. This second option is definitely not ideal for rapid iteration, but if you have already invested a lot in Unity or have plugins that you cannot let go of, it is the best option. It is easier than making a plugin that works inside Unity natively; I still was not able to pull that off, though I got pretty close. You may start with the sample object tracking project provided by Apple and adapt those classes into the export from Unity. Of course, easier said than done.


Hello.
I am also struggling with the idea of using Object Tracking in PolySpatial.
I am interested in how to integrate object tracking natively after a Swift build. Could you provide more details?

I wanted to follow up on this and see if there have been any updates. I didn't see anything in the patch notes, but I was curious whether we can expect something soon.

We just released PolySpatial 2.1.2 today, which adds Object Tracking support. We included a sample for it, which you can read more about [here](Unbounded samples | PolySpatial visionOS | 2.1.2).


This is great news, Peter. We have been waiting for this feature for quite a while, and we appreciate the great work. I am looking forward to testing it and integrating it into our project.

One question I have is whether the object tracking entitlements, such as Object Tracking Parameter Adjustment, can be used with PolySpatial, or whether there is any plan to integrate them. We have found that the default tracking frequency does not work for our use case, and we saw in some other posts that the improved object tracking performance (30 fps instead of 5 fps when tracking 1 object instead of up to 10) was much smoother.

Also, are the .referenceobject files that we trained in Create ML for native tracking also usable in the PolySpatial application?

Again, thanks for all the work and support.

We currently don’t expose any of the enterprise entitlements. I’ll convey your interest in that particular entitlement.

Yes, you can use your own .referenceobject files trained in Create ML. You should be able to swap out the UnityCube.referenceobject file reference in our Object Tracking sample for any of your own .referenceobject files to quickly experiment with it.
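In case it helps, reacting to tracked objects from script follows the standard AR Foundation trackable-manager pattern. Here is a minimal sketch, assuming AR Foundation 6 and an ARTrackedObjectManager present in the sample scene:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackedObjectLogger : MonoBehaviour
{
    // Assign the ARTrackedObjectManager from the sample scene in the Inspector.
    [SerializeField] private ARTrackedObjectManager trackedObjectManager;

    private void OnEnable()
    {
        trackedObjectManager.trackablesChanged.AddListener(OnTrackablesChanged);
    }

    private void OnDisable()
    {
        trackedObjectManager.trackablesChanged.RemoveListener(OnTrackablesChanged);
    }

    private void OnTrackablesChanged(ARTrackablesChangedEventArgs<ARTrackedObject> args)
    {
        // Newly detected reference objects appear in args.added.
        foreach (var trackedObject in args.added)
            Debug.Log($"Began tracking {trackedObject.referenceObject.name} at {trackedObject.transform.position}");

        // Pose and tracking-state changes arrive through args.updated.
        foreach (var trackedObject in args.updated)
            Debug.Log($"{trackedObject.referenceObject.name} is {trackedObject.trackingState}");
    }
}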

Wonderful! Thank you so much for sharing the response here and for looking into integrating entitlements. It would be a great help if we could use the Main Camera Access and Object Tracking Parameter Adjustment entitlements inside Unity. We are now building everything else for our Vision Pro application in Unity with PolySpatial, and we keep a side prototype just for object tracking where we experiment with these entitlements; at some point we will need to merge everything in Unity. This object tracking release will be super helpful, but ultimately the tracking without the parameter changes is simply too slow, and since we only need to track one object at a time, the default configuration is not ideal.

Main Camera Access is needed for us to introduce a custom tracking algorithm and semantic identification of objects if we cannot use the parameter adjustment to improve tracking.

Hello Peter,

I tried the new PolySpatial Object Tracking sample in a new project and ran into some warnings that seem to be related to seeing no tracking behavior when using Play to Device with a Vision Pro headset. I would appreciate any recommendations for getting object tracking to work in Unity.

Below are the steps to replicate the issue, followed by the error trace.

Steps to Replicate:

  1. Start a new project in Unity 6000.0.26f1 using URP
  2. Import the following packages:

Apple visionOS XR Plugin 2.1.2
Apple ARKit XR Plugin 6.0.3
AR Foundation 6.1.0-pre.3 (after the previous version did not work either)
PolySpatial 2.1.2 (also the samples and Play to Device)
PolySpatial visionOS 2.1.2
PolySpatial XR
XR Plugin Management 4.5.0

  3. Adjusted all settings under XR Plug-in Management accordingly:
    selected Apple visionOS for the headset and PolySpatial for the editor
    Apple ARKit: Requirement set to Required, Face Tracking disabled
    Apple visionOS: RealityKit with PolySpatial
    all errors fixed in Project Validation

  4. Set up Play to Device for the localhost (headset), run Play to Device on the headset, and enter Play mode in Unity

  5. The instructions window shows up; however, no tracking prefab is spawned when the reference object is presented (this object was tested natively and worked)

Steps taken to try to fix the issue:

  • Deleted the Library folder and restarted Unity
  • Installed the latest versions of the related packages
  • Tested with different reference objects that worked natively

Below is the error trace for the warnings:

No active UnityEngine.XR.ARSubsystems.XRSessionSubsystem is available. This feature is either not supported on the current platform, or you may need to enable a provider in Project Settings > XR Plug-in Management.
UnityEngine.XR.ARFoundation.ARSession:OnEnable () (at ./Library/PackageCache/com.unity.xr.arfoundation/Runtime/ARFoundation/ARSession.cs:334)

No active UnityEngine.XR.XRInputSubsystem is available. Please ensure that a valid loader configuration exists in the XR project settings.
UnityEngine.XR.ARFoundation.ARInputManager:OnEnable () (at ./Library/PackageCache/com.unity.xr.arfoundation/Runtime/ARFoundation/ARInputManager.cs:25)

No active UnityEngine.XR.ARSubsystems.XRObjectTrackingSubsystem is available. This feature is either not supported on the current platform, or you may need to enable a provider in Project Settings > XR Plug-in Management.
UnityEngine.XR.ARFoundation.ARTrackableManager`5<UnityEngine.XR.ARSubsystems.XRObjectTrackingSubsystem, UnityEngine.XR.ARSubsystems.XRObjectTrackingSubsystemDescriptor, UnityEngine.XR.ARSubsystems.XRObjectTrackingSubsystem/Provider, UnityEngine.XR.ARSubsystems.XRTrackedObject, UnityEngine.XR.ARFoundation.ARTrackedObject>:OnEnable () (at ./Library/PackageCache/com.unity.xr.arfoundation/Runtime/ARFoundation/ARTrackableManager.cs:97)

No active UnityEngine.XR.ARSubsystems.XRCameraSubsystem is available. This feature is either not supported on the current platform, or you may need to enable a provider in Project Settings > XR Plug-in Management.
UnityEngine.XR.ARFoundation.SubsystemLifecycleManager`3<UnityEngine.XR.ARSubsystems.XRCameraSubsystem, UnityEngine.XR.ARSubsystems.XRCameraSubsystemDescriptor, UnityEngine.XR.ARSubsystems.XRCameraSubsystem/Provider>:OnEnable () (at ./Library/PackageCache/com.unity.xr.arfoundation/Runtime/ARFoundation/SubsystemLifecycleManager.cs:59)
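
For reference, a small diagnostic script like the following sketch can confirm whether any XRObjectTrackingSubsystem provider is registered in the current runtime at all:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public class ObjectTrackingDiagnostics : MonoBehaviour
{
    private void Start()
    {
        // Query every object tracking provider known to this runtime.
        var descriptors = new List<XRObjectTrackingSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);

        if (descriptors.Count == 0)
            Debug.LogWarning("No XRObjectTrackingSubsystem provider is registered in this runtime.");
        else
            foreach (var descriptor in descriptors)
                Debug.Log($"Object tracking provider found: {descriptor.id}");
    }
}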

Hi Pegassy, we don't support Object Tracking over Play to Device yet. It is on our roadmap to support it eventually.

Thank you for your response. It helps a lot to know that, so I can plan accordingly. Unless I missed it, it may help other folks to have that noted in the documentation.

Looking forward to new updates on the object tracking and entitlement front, and we appreciate all of your team's work.