📌 visionOS Template Update

Using the template project, I'm now getting just a blank screen. I'm not sure if that's related to the licensing issue I raised in a support ticket.

I don’t think these issues are related to the template; it would be easier to move this discussion to a new topic / thread.

Does this template not work in the simulator? I get a “Linker command failed” error while building in Xcode.

It should work in the simulator, but you won’t be able to use any of the ARKit features. Can you try building to a clean build folder?

Hi @DanMillerU3D. Is there a sample that works with XR Hands for hands-on pickup, drop, etc., not just indirect touch? I’ve tried building my own by taking a Volume Camera and pulling in the XR hand interaction demo, but no hands appear when running on device. Of course, I could have done a few things wrong along the way. So is there a “blessed” XR Hands on visionOS sample?

Not exactly, no. The template does allow you to direct-pinch objects, but that still uses the spatial gesture. See SpatialTapSelectFilter.cs. You could limit the interaction to only work on a DirectPinch by modifying this script (see the sketch after the code below).

// Namespaces assumed from the PolySpatial input support and the XR Interaction Toolkit packages.
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Filtering;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class SpatialTapSelectFilter : MonoBehaviour, IXRSelectFilter
{
    public bool Process(IXRSelectInteractor interactor, IXRSelectInteractable interactable)
    {
        var activeTouches = Touch.activeTouches;
        if (activeTouches.Count > 0)
        {
            // Resolve the spatial pointer state for the primary touch.
            var primaryTouchData = EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

            // Allow selection only for pinch gestures (gaze-and-pinch or direct pinch).
            return primaryTouchData.Kind == SpatialPointerKind.IndirectPinch ||
                   primaryTouchData.Kind == SpatialPointerKind.DirectPinch;
        }

        return false;
    }

    public bool canProcess => isActiveAndEnabled;
}
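For example, to limit selection to direct pinches only, the Process method above could be reduced to check for DirectPinch alone. A minimal sketch, reusing the same namespaces and types as the script above:

    // Variant of Process that only allows selection on a direct pinch.
    // Sketch only; assumes the same class and usings as the script above.
    public bool Process(IXRSelectInteractor interactor, IXRSelectInteractable interactable)
    {
        var activeTouches = Touch.activeTouches;
        if (activeTouches.Count == 0)
            return false;

        var primaryTouchData = EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

        // Only DirectPinch passes; gaze-and-pinch (IndirectPinch) is filtered out.
        return primaryTouchData.Kind == SpatialPointerKind.DirectPinch;
    }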

I’d start by checking out the Mixed Reality scene in the Package Samples; there’s a hand visualizer in there that shows all the tracked joints, as well as a script for detecting a custom pinch gesture. You could modify that for grabbing objects (a rough sketch of that idea follows below). At this time we don’t have bindings for the XR Interaction Toolkit that leverage ARKit hands, just the spatial tap gesture.
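As an illustration of what a custom pinch check with the XR Hands package could look like (a sketch, not the sample’s actual script; the component name and distance threshold are my own assumptions), something like this measures the distance between the thumb tip and index tip of the right hand:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Hypothetical helper (not part of the samples): reports a pinch when the
    // thumb tip and index tip of the right hand are closer than a threshold.
    public class SimplePinchDetector : MonoBehaviour
    {
        [SerializeField]
        float m_PinchThreshold = 0.02f; // meters; tune to taste

        XRHandSubsystem m_HandSubsystem;

        void Update()
        {
            if (m_HandSubsystem == null || !m_HandSubsystem.running)
            {
                var subsystems = new List<XRHandSubsystem>();
                SubsystemManager.GetSubsystems(subsystems);
                m_HandSubsystem = subsystems.Count > 0 ? subsystems[0] : null;
                if (m_HandSubsystem == null)
                    return;
            }

            var hand = m_HandSubsystem.rightHand;
            if (!hand.isTracked)
                return;

            var thumbTip = hand.GetJoint(XRHandJointID.ThumbTip);
            var indexTip = hand.GetJoint(XRHandJointID.IndexTip);

            if (thumbTip.TryGetPose(out var thumbPose) && indexTip.TryGetPose(out var indexPose))
            {
                if (Vector3.Distance(thumbPose.position, indexPose.position) < m_PinchThreshold)
                {
                    // Hook your grab/drop logic here, e.g. pick up the nearest object.
                    Debug.Log("Pinch detected on right hand");
                }
            }
        }
    }

The joint poses are reported relative to the XR origin, which is fine here since only the distance between two joints on the same hand is used.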

OH… IC! Thanks for the info @DanMillerU3D. Have a good weekend!


I’m getting started with Apple Vision Pro development and I’ve gone through the visionOSTemplate. I’m having trouble getting Play to Device to work at all. I’ve only gotten it to work once, even when manually entering the IP address of my AVP. It simply doesn’t recognize it at all. The versions match, and I’m on the same network.

There are some known issues we’re working through with Play to Device that should be addressed in the next release.


I would like to build an experience where you anchor bounded volumes to the environment in an XR project. Is this possible? If not, do you have any recommendations on how to have multiple objects anchored to the environment while still having other apps (like Safari, Mail, etc.) open in the environment?

visionOS allows you to reposition volumes and windows using the handle on the bottom, but the concept of “anchoring” is only available in an immersive space, where you get access to ARKit data. The volumes and windows should stay in roughly the same position but will go away once the device is reset or the app is closed.

This requires the app to be in the shared space. Currently we only support a single volume, so you could only position content within the size of that volume. visionOS native APIs (SwiftUI and RealityKit) allow a single app to create multiple windows or volumes, so you could achieve this without using Unity.


I noticed that the Object Manipulation example in 1.0.3 is not as smooth as it was in earlier versions. It stutters a little bit every time you pick up a block (tested on device). Any idea what might cause this?

Edit: ah, the PolySpatial samples are not part of the template project but of the package.