I have a game where I hunt ghosts using the AR camera on Android. I recently worked with Relay and Unity Netcode for GameObjects, and I wanted to add multiplayer where two players can help each other hunt ghosts. Currently the Relay connection works fine, but it seems the location of the AR camera is not synced across clients, even though I have added a NetworkTransform to the prefab (testing it PC to PC seems to work, as the scale is synced). Is there anything else I need to adjust/edit/add to make the locations of the AR players sync across clients?
Well it depends upon how you are handling the Unity world space to “real space” synchronization. As long as each client has a player object spawned and each player object has a NetworkTransform then you have the Unity world space synchronization portion handled. Where I have questions is whether you are wanting to know where each player’s Unity camera is relative to other players or where each player’s Android camera is relative to other players.
If you want each player’s AR camera’s view frustum to be visualized on other players, then you will want to get the yaw, pitch, and roll values from each player’s Android device and apply them to the local client’s player object so they will be synchronized via the NetworkTransform. If you are using a server-authoritative motion model (i.e. the server controls each client’s player transform), then you could use a NetworkVariable with owner write permissions so the local player (client side) can update it; the server would subscribe to that NetworkVariable’s change notification and, upon each update, make the adjustments to the transform. If you are using an owner-authoritative motion model, then each client applies the updates to its local player, and that is then synchronized with the other clients and/or server.
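A rough sketch of the server-authoritative variant described above (the class and field names are my own, assuming NGO 1.x; the owner reports its AR camera rotation, and the server applies it so NetworkTransform can replicate it):

```csharp
using Unity.Netcode;
using UnityEngine;

public class ArPoseReporter : NetworkBehaviour
{
    // Owner writes its local AR pose; everyone can read it.
    private readonly NetworkVariable<Quaternion> _arRotation =
        new NetworkVariable<Quaternion>(
            Quaternion.identity,
            NetworkVariableReadPermission.Everyone,
            NetworkVariableWritePermission.Owner);

    public override void OnNetworkSpawn()
    {
        if (IsServer)
        {
            // Server applies the owner-reported rotation to the player
            // transform; NetworkTransform then synchronizes it to all clients.
            _arRotation.OnValueChanged += (_, newRot) => transform.rotation = newRot;
        }
    }

    private void Update()
    {
        if (IsOwner)
        {
            // e.g. taken from the local AR camera / device attitude.
            _arRotation.Value = Camera.main.transform.rotation;
        }
    }
}
```

For an owner-authoritative setup you would instead set the transform directly on the owner and let an owner-authoritative NetworkTransform (e.g. ClientNetworkTransform) handle the replication.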
If none of this helps… then could you provide more details on what you are trying to accomplish and I could help you figure out how to accomplish it.
Hello, thanks for the reply. Right now the game is P2P using Relay, and for now I don’t even want to sync the planes found by AR. I just want to sync the AR player location, and also maybe have each player see the other’s location.
I am now looking at how I can access the Android pitch/yaw etc. sensors, as I am not quite sure how it’s done.
Just to add to what Stephens said about “real space”. There’s nothing in NGO that will synchronize the relative position of the two devices - the real world devices. They may be one meter apart in real world space, or on the other side of the globe. That’s the position offset you need to figure out, and it’s done outside of Unity networking.
AR technology can provide you with anchors. One way of making two devices aware of their relative offset to each other is to calibrate them to the same anchor. It’s been 5+ years since I last invested time in those AR scenarios, and my knowledge is still roughly at the level of this article from 6 years ago.
You can use this as a starting point to search for what today’s technology offers. It may or may not have advanced beyond this point. At most I would expect some of the tech that used to be an external paid service to have become integrated into the Google/Apple SDKs, so be sure to check their manuals first.
OK, it’s quite hard for me to understand all of this, but I will try. Just to make my idea clearer: I want the host to be the anchor, so the host starts scanning the area and then clients join the same scene. So far my AR setup just scans the ground to create a base, then the walls (I can remove those). The client is at the same location as the host (maybe a little to the side), most of the movement is rotation only, and if there is a position change it’s at most 10 cm.
To summarize:
99% of the movement is rotation.
AR only detects the floor and uses it as a baseline; after that, finding new planes stops.
I want players to see each other.
If there is movement, it’s at most maybe 10-15 cm.
The game I am making is just standing in one spot and looking around while hunting ghosts coming at you, seeing your room or environment through the Android camera.
I have read about anchors, and I think this is the solution I am looking for, but I am kind of confused about how to implement them.
I want the host to spawn an anchor at its position, or place an anchor on the detected plane once tracking starts (automatically), and then have the clients use that anchor to determine their positions relative to each other. Is that the right approach? How would you do it?
Thanks for the great advice and help, it is much appreciated.
You’d have to have the client synchronize to the same spot (anchor) as the host. This means both devices use the same reference point with a potentially huge margin of error depending on how accurate the player picks the spot, as well as any hardware limitations. So it’s never going to be precise.
I would have both player objects track the device position relative to the anchor. You can treat the anchor as the origin of the shared world space. One device may be offset at 1,1,1 and another at -1,1,-1, e.g. facing each other, each 1 m away from the anchor along every axis.
These offsets would be the player objects’ positions, updating continuously since the devices are going to move around in space. This tracking offset needs to come from the AR framework.
Now if one player were to touch the screen where the anchor is to spawn a networked object, this object should appear at about the same location close to the anchor for both devices.
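The anchor-as-origin idea above might look something like this (a sketch only; `ARAnchor` comes from AR Foundation, and how `sharedAnchor` gets resolved on both devices is assumed to happen elsewhere):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorRelativeTracker : MonoBehaviour
{
    public ARAnchor sharedAnchor;  // the common reference point both devices calibrated to
    public Transform arCamera;     // this device's AR camera (tracked pose)

    private void Update()
    {
        if (sharedAnchor == null) return;

        // Express the device pose in the anchor's local space. Because both
        // devices calibrated to the same physical anchor, this offset is a
        // position in the shared world space.
        Vector3 relativePos =
            sharedAnchor.transform.InverseTransformPoint(arCamera.position);
        Quaternion relativeRot =
            Quaternion.Inverse(sharedAnchor.transform.rotation) * arCamera.rotation;

        // Apply the offset to the local player object and let
        // NetworkTransform replicate it to the other clients.
        transform.SetPositionAndRotation(relativePos, relativeRot);
    }
}
```

With this, a networked object spawned at (0,0,0) in the shared space would appear at the anchor on both devices, subject to the tracking drift mentioned below.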
Unfortunately, tracking of the anchor deviates over time and with movement of the device so the more and faster the devices move (plus light conditions etc) the more deviation you will notice.
It seems I hit a different roadblock. When I start the host and then let a client join, I get the following error, and the first device that spawns controls both players:
An input device ARFoundationRemoteInputDevice with the TrackedDevice characteristic was registered but the ARPoseDriver is already consuming data from ARFoundationRemoteInputDevice.
UnityEngine.Object:Instantiate<UnityEngine.GameObject> (UnityEngine.GameObject)
Unity.Netcode.NetworkSpawnManager:GetNetworkObjectToSpawn (uint,ulong,UnityEngine.Vector3,UnityEngine.Quaternion,bool) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Spawning/NetworkSpawnManager.cs:478)
Unity.Netcode.NetworkSpawnManager:CreateLocalNetworkObject (Unity.Netcode.NetworkObject/SceneObject) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Spawning/NetworkSpawnManager.cs:507)
Unity.Netcode.NetworkConnectionManager:HandleConnectionApproval (ulong,Unity.Netcode.NetworkManager/ConnectionApprovalResponse) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Connection/NetworkConnectionManager.cs:762)
Unity.Netcode.ConnectionRequestMessage:Handle (Unity.Netcode.NetworkContext&) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Messaging/Messages/ConnectionRequestMessage.cs:156)
Unity.Netcode.NetworkMessageManager:ReceiveMessage<Unity.Netcode.ConnectionRequestMessage> (Unity.Netcode.FastBufferReader,Unity.Netcode.NetworkContext&,Unity.Netcode.NetworkMessageManager) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Messaging/NetworkMessageManager.cs:582)
Unity.Netcode.NetworkMessageManager:HandleMessage (Unity.Netcode.NetworkMessageHeader&,Unity.Netcode.FastBufferReader,ulong,single,int) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Messaging/NetworkMessageManager.cs:446)
Unity.Netcode.NetworkMessageManager:ProcessIncomingMessageQueue () (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Messaging/NetworkMessageManager.cs:472)
Unity.Netcode.NetworkManager:NetworkUpdate (Unity.Netcode.NetworkUpdateStage) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Core/NetworkManager.cs:49)
Unity.Netcode.NetworkUpdateLoop:RunNetworkUpdateStage (Unity.Netcode.NetworkUpdateStage) (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Core/NetworkUpdateLoop.cs:192)
Unity.Netcode.NetworkUpdateLoop/NetworkEarlyUpdate/<>c:<CreateLoopSystem>b__0_0 () (at ./Library/PackageCache/com.unity.netcode.gameobjects@1.11.0/Runtime/Core/NetworkUpdateLoop.cs:215)
Edit: I managed to fix the error, but the first player to connect still controls all cameras.
Here again, still facing the issue of the host controlling all players’ prefabs. Is there a way to edit the tracked pose driver so that each player only controls the prefab spawned for them?
If you are using ARCore then there are ARCore Cloud Anchors which you can use to share an anchor between devices, however those are static locations - for a dynamic location you’d need to share the device offset from the anchor via netcode or some other method.
We are actually working on support for Cloud Anchors in AR Foundation.
As for tracked pose driver, that might not be what you want but I think I need to spend some more time understanding what you are doing.
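One common NGO pattern for the “first player controls all cameras” symptom is to make sure pose-driving components only run on the owning client; whether it applies to the ARFoundationRemote input device specifically is something you’d have to verify. A hedged sketch, assuming the player prefab carries a `TrackedPoseDriver` (Input System), a camera, and an audio listener:

```csharp
using Unity.Netcode;
using UnityEngine;
using UnityEngine.InputSystem.XR;

public class OwnerOnlyPoseDriver : NetworkBehaviour
{
    public override void OnNetworkSpawn()
    {
        // Only the owning client should let the tracked pose driver (and its
        // camera/audio listener) drive this player object. Without this, the
        // local device's pose data drives every spawned player prefab.
        if (!IsOwner)
        {
            var poseDriver = GetComponentInChildren<TrackedPoseDriver>();
            if (poseDriver != null) poseDriver.enabled = false;

            var cam = GetComponentInChildren<Camera>();
            if (cam != null) cam.enabled = false;

            var listener = GetComponentInChildren<AudioListener>();
            if (listener != null) listener.enabled = false;
        }
    }
}
```

The same `IsOwner` gate applies to any other input or tracking component on the prefab.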