LookAt in PolySpatial?

Hey, I have this simple script to make an object face the camera, but it doesn’t work in PolySpatial?

using UnityEngine;

public class LookAtTarget : MonoBehaviour
{
    private Transform target;

    protected virtual void Start()
    {
        // Cache the main camera's transform as the thing to face.
        target = Camera.main.transform;
    }

    protected virtual void Update()
    {
        // LookAt aims the forward (+Z) axis at the given point; this point lies on
        // the far side of the object from the camera, so +Z ends up pointing away
        // from the camera.
        Vector3 dirToTarget = (target.position - transform.position).normalized;
        transform.LookAt(transform.position - dirToTarget, Vector3.up);
    }
}

(Not sure why the markup is like this; I didn’t see a code tag, but something got applied anyhow.)


I think you can change the Update to just

protected virtual void Update()
{
    transform.LookAt(target, Vector3.up);
}

As per Unity - Scripting API: Transform.LookAt (unity3d.com), Unity should do the right thing for you. I haven’t tested this out directly, but that would be my first thing to try, just to isolate that it’s not the math.

I think the overload that takes a vector just needs the world position of the thing you want to look at, so passing target.position should work as well.
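
For example, a minimal sketch of the Vector3 overload (assuming target is the camera transform cached in Start, as in the original script):

protected virtual void Update()
{
    // This overload of LookAt takes the world-space point to look at.
    transform.LookAt(target.position, Vector3.up);
}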

Thanks, I’m familiar with LookAt, but the problem is: how do I get hold of the camera/user on visionOS?
In Play to Device mode, there is nothing in the Unity scene that seems to represent the user’s head pose, so there is no transform to look at…
Any idea how to face the user in PolySpatial/visionOS?

Ideally/normally, Camera.main would return the user’s head pose…

There is a Mixed Reality scene in the samples that you can import that should provide a good starting point for your purposes. With that, you can use the camera that is hanging off the XR Origin that is expressly designed to track the user’s head. This would be the way to do this for any XR Application in general and not just visionOS. Give that a try and let me know how it goes.
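
For reference, here’s a rough sketch of that approach, assuming the sample’s XR Origin is in the scene and the XR Core Utils package (Unity.XR.CoreUtils) is installed; the class name is just for illustration:

using Unity.XR.CoreUtils;
using UnityEngine;

public class FaceHeadCamera : MonoBehaviour
{
    private Transform head;

    private void Start()
    {
        // Prefer the camera the XR Origin drives with the head pose;
        // fall back to Camera.main if no XR Origin is found.
        var origin = FindObjectOfType<XROrigin>();
        head = origin != null ? origin.Camera.transform : Camera.main.transform;
    }

    private void Update()
    {
        if (head != null)
            transform.LookAt(head, Vector3.up);
    }
}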

As @joejo suggests, using the Scene Camera from the Mixed Reality template as the LookAt target (and without an AR Session component in the scene) tracks the position of the player’s head in the Unity editor. Walk from the front to the back of the room and the game object faces the new camera position in Unity.

However, in the Xcode visionOS simulator the object continues to face the original camera position regardless of the player’s position in the room.

Using the AR Session component as the camera (no Scene Camera), LookAt rotates the object to an arbitrary point along the negative X axis regardless of the player’s position in the room, in both the Unity Editor and visionOS.

Unity 2022.3.13, PS 6.3, Xcode beta 1.3

Is there a fix for this? Facing the camera (and, in our case, picking objects up and putting them down) is an important feature most games have.

Hi,

To get head tracking working, you will need a Camera (with a properly set up TrackedPoseDriver component) and an AR Session component in the scene. Ensure that the volume camera has been set to unbounded mode; there’s no head tracking with bounded volume cameras.

I’ve attached a quick pic of how my scene hierarchy was set up. I started with the MixedReality scene and just deleted a couple of things. The cube is set to just LookAt MainCamera every frame.
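
If it still doesn’t track, a quick runtime check along these lines can confirm the pieces are actually present (a sketch, assuming AR Foundation and the Input System TrackedPoseDriver are installed):

using UnityEngine;
using UnityEngine.InputSystem.XR;
using UnityEngine.XR.ARFoundation;

public class HeadTrackingSanityCheck : MonoBehaviour
{
    private void Start()
    {
        // Head tracking needs an ARSession somewhere in the scene...
        if (FindObjectOfType<ARSession>() == null)
            Debug.LogWarning("No ARSession found - head tracking will not run.");

        // ...and a TrackedPoseDriver on the main camera so it follows the head pose.
        var cam = Camera.main;
        if (cam == null || cam.GetComponent<TrackedPoseDriver>() == null)
            Debug.LogWarning("Main Camera is missing a TrackedPoseDriver.");
    }
}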

Edit: See update in comment below. It now works in builds…

It’s so strange. I have the same XR rig (created from GameObject > XR > XR Origin) and an AR Session.
But in Play to Device and in a build, the camera is not moving.
When inspecting the scene during Play to Device, there is nothing that moves when I move around in the simulator.
I also attached a test cube to the camera, which should be head locked, but in the simulator it just stays in the same world position all the time…
What is going on? How do I get a camera-facing script working like this? :slight_smile:

Settings:

Scene in Play to Device. Observe how there is no camera icon in the scene view that matches the camera position in the simulator…

Unbounded is also active:

v 0.7.1

Ah nevermind. It now works in the build. Not sure what changed…

Hi!

I don’t think head tracking is supported in PlayToDevice right now, so that makes sense. AR features (image tracking, head tracking, etc.) will only work if you build the project, and are currently not expected to work with PlayToDevice.

We had luck getting a game object to hang off the player’s head-tracked camera in the visionOS simulator.

  1. To face the camera: LookAt needs the right target to function: XROrigin/CameraOffset/Main Camera (see the sketch after this list).

  2. To hang a reticle, raycast, or other object off the head-tracked camera, two additional things must be true:

a. The Camera Offset position must be 0,0,0 (otherwise the game object only tracks left/right and forward/back correctly). Or keep Camera Offset, Main Camera, XR Origin, etc. all at 0,0,0.

b. To be visible, the game object must be approximately 1 meter away from the camera (approximately Unity z = 0.9). In the visionOS simulator, objects close to the camera become transparent.

Update:
c. You may need to add a Volume Camera to your scene. Unity PolySpatial automatically adds one, but we found head pose tracking worked better in the simulator if we added one to the hierarchy ourselves. We copy/pasted a Volume Camera from a sample scene.

2b was a head-scratcher for a while. Hope this helps someone.
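
To make 1 and 2b concrete, here’s a rough sketch of a head-locked reticle, assuming the XR Origin’s camera is tagged MainCamera; the 0.9 m distance is the value from 2b, and the class name is just for illustration:

using UnityEngine;

public class HeadLockedReticle : MonoBehaviour
{
    [SerializeField] private float distance = 0.9f;
    private Transform head;

    private void Start()
    {
        head = Camera.main != null ? Camera.main.transform : null;
    }

    private void LateUpdate()
    {
        if (head == null)
            return;

        // Park the reticle roughly a meter in front of the head pose so it stays
        // visible in the visionOS simulator (see 2b), then aim its forward (+Z)
        // axis away from the camera, the same convention the original LookAtTarget
        // script uses.
        transform.position = head.position + head.forward * distance;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
    }
}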


Does LookAt still work? I have tried using the Scene Camera and also the XR Rig’s Main Camera, but it seems the

Does XRI eye gaze work?