Is there an example in the PolySpatial samples which uses World Anchors?
Hi there! Yes, the main scene in the com.unity.xr.visionos package samples (select the Apple visionOS XR Provider package and navigate to Samples) includes an ARAnchorManager and a test script that places an anchor either where the user’s gaze intersects with a collider, or at the position of the pinch.
Please note that there are currently some issues with world anchors that can prevent them from being placed immediately. Sometimes you have to repeat the tap in quick succession (a sort of “double-click”) to get an anchor to actually register.
Quick clarification on this: it turns out the World Anchors feature was actually working as intended. The sample was to blame; it created an empty GameObject when placing the anchor, so it only seemed like the anchor wasn’t getting placed properly. Just using ARAnchorManager and tagging a (visible) GameObject with ARAnchor should do the trick.
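For anyone following along, here is a minimal sketch of that approach (AR Foundation 4.x style; the class and method names are just illustrative, not part of the sample):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaceAnchoredMarker : MonoBehaviour
{
    // Assumes the scene already contains an ARSession and an ARAnchorManager.
    public void PlaceAt(Pose pose)
    {
        // Use a visible primitive so it is obvious whether the anchor "sticks".
        var marker = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        marker.transform.localScale = Vector3.one * 0.05f;
        marker.transform.SetPositionAndRotation(pose.position, pose.rotation);

        // Adding the ARAnchor component registers the GameObject with the
        // anchor subsystem; the provider then keeps its transform updated.
        marker.AddComponent<ARAnchor>();
    }
}
```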
We have tried the samples in the com.unity.xr.visionos package (the Main scene inside the folder Assets/Samples/Apple visionOS XR Plugin/0.7.1/VR Sample - URP/Scenes), but there are several problems:
1. The app crashed right after running → we found this was because the scene does not contain the VolumeCamera game object → already solved this.
2. The logic to detect the pinch was not working; after some workarounds, we decided not to use it and instead just added the ARAnchor component to the game object after the ARSession initialized (roughly as in the sketch after this list).
3. After the workarounds for #1 and #2, we successfully loaded the scene, and AR features like plane detection, hand tracking, etc. worked.
4. But the next problem is that the object was not anchored at all, although we followed the instructions in this link: AR anchor manager | AR Foundation | 4.1.13. After the ARAnchor component was added, we tried to remove the device and wear it again; the application resumed, but the object (with the ARAnchor component attached) did not stay at the position it had before we removed the device.
5. One interesting thing to notice was that the anchor prefab (which is set up in the ARAnchorManager component) worked perfectly. It seems like any time we attach the ARAnchor component to our object, the anchor prefab is instantiated and becomes permanently persistent, even when we close and reopen the app.
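Roughly, the workaround in #2 looks like this (a simplified sketch; the class name and the choice of waiting on ARSession.state are just for illustration):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Simplified sketch of the workaround in #2: wait until the ARSession is
// tracking, then attach an ARAnchor component to the target object.
public class AddAnchorAfterSessionInit : MonoBehaviour
{
    public GameObject target; // the object we want anchored

    IEnumerator Start()
    {
        // ARSession.state reports the session lifecycle; wait until tracking.
        while (ARSession.state < ARSessionState.SessionTracking)
            yield return null;

        target.AddComponent<ARAnchor>();
    }
}
```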
Are you aware of this issue, or do you have any successful samples? We have tried many things, but nothing seems to be working for now. We intend to set the position of our object to follow the ARAnchor component, but that seems like a very tricky approach, so we’re not comfortable with it.
We hope you have a solution, or at least an explanation that makes this clear.
Thank you and have a good day!
Hey there! Happy new year, and sorry to hear you’re having trouble. Let’s see if we can get you sorted.
Please note that the samples in com.unity.xr.visionos are intended to be built for fully immersive VR apps, not MR. This is why the scene does not include a volume camera, and it may not function as expected in Mixed Reality with PolySpatial. Out of curiosity, what was the specific behavior you were seeing? It’s surprising to hear that the app would crash, but it may simply start up and not do anything when built for mixed reality.
The MixedReality sample scene from the package samples in com.unity.polyspatial is a better starting point for AR features in Mixed Reality, although it doesn’t include an example of AR Anchors. I’ve created an internal ticket for us to add a specific example for anchors in MR.
It sounds like you may have been testing this in a configuration that doesn’t support ARKit features. The pinch-detection script depends on ARKit hand tracking, which is my best guess for why it wasn’t working. Was it still not working after you saw plane detection and hand tracking?
Just so we’re on the same page, it sounds like the issue is that AR anchors aren’t correctly updating their position during the session in which they are first placed. On subsequent runs of the app, these anchors do appear where you expect them to, but the same anchor comes back in the wrong place if you take off the headset and put it back on without closing the app?
This sounds plausible, although I haven’t seen this particular behavior myself. We are chasing down a number of issues related to Unity apps going into the background and coming back to the foreground. Out of curiosity, are other AR trackables (planes, meshes, etc.) lining up properly when you resume the app? It’s possible we don’t properly resume the AR session when the app resumes, which would cause all of these features to fail.
Thank you for your response and sorry for the late reply.
Yes, the anchor sometimes worked (the ball appeared at the previous position when I took off the glasses) but sometimes didn’t, and it seemed random, without any clue as to why.
The plane seems to be anchored very well (rendered by a material with a white-dot texture), and so is the anchor prefab which is referenced in the ARAnchorManager component. However, the ball was not anchored in half of the cases, which makes it difficult to debug.
Never mind this, I figured out that I forgot to call EnhancedTouchSupport.Enable() in the Start function.
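In case anyone else hits the same thing, this is roughly all it took (the class name here is just illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;

public class TouchSetup : MonoBehaviour
{
    void Start()
    {
        // Enhanced touch data (Touch.activeTouches, finger events) is not
        // reported until enhanced touch support is explicitly enabled.
        EnhancedTouchSupport.Enable();
    }
}
```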
Ah! The green sphere is not the anchor! That one is attached to the Ray Interactor used by the XR Interaction Toolkit, just to show you where we think your hand is. Or it could be the TargetIndicator used by the InputTester script, which is also a green sphere (room for improvement there…). These green spheres just happen to show up in the same location where we place anchors, but the actual anchor is the caltrop-shaped gray object. That object should show up in the same location when you return to your session.
If I add an AR Anchor component to a “Pivot” GameObject that will be the parent of a certain number of 3D models, will all the models be “anchored” at the correct position even after I take off the Vision Pro and wear it again later, since the Pivot would technically be anchored to its position?
At the same time, what happens if I add an ARAnchor to a GameObject and also to a child of that same GameObject?
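To make the first question concrete, this is the kind of setup I mean (a rough sketch; the names and the AnchorHere method are just for illustration):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Rough sketch of the "Pivot" setup described above: one anchored parent
// whose child models are expected to follow it. Assumes an ARSession and
// ARAnchorManager already exist in the scene.
public class AnchoredPivot : MonoBehaviour
{
    public Transform[] models; // the 3D models to keep together

    public void AnchorHere()
    {
        // Parent every model under the pivot so they all inherit whatever
        // pose corrections the anchor receives.
        foreach (var model in models)
            model.SetParent(transform, worldPositionStays: true);

        // A single ARAnchor on the pivot; children simply follow its transform.
        gameObject.AddComponent<ARAnchor>();
    }
}
```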