Hello, I am completely stumped on this.
I am working on a university project with a team. We are essentially creating an AR experience for the Pokemon trading card game. My current responsibility is getting image tracking working for the Pokemon cards, but I cannot figure it out.
I do not personally own a headset, so I've had to rely on XR Simulation for testing. This has caused a great deal of stress because nothing appears to work the way I believe it should.
My understanding of XR Simulation is that it feeds the program the same information it would receive from a real headset (e.g. camera pixel data and depth). However, this does not seem to be the case.
To begin my testing I am using the basic Unity default simulated environment. I have an AR Session and an XR Origin in my scene. On the XR Origin I have an AR Tracked Image Manager with a reference to my reference image library, which consists of just the Unity logo; this reference image is named TestIcon. The AR Tracked Image Manager also has a tracked-image prefab assigned, with the same name, TestIcon.
No matter how I interact with the scene, the prefab is never instantiated, even though the exact image from the reference library exists in the simulated environment. I have no clue why this is.
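One thing I tried to narrow this down (sketch only; the class name is mine, and I'm assuming AR Foundation 5.x, where the manager exposes a trackedImagesChanged event) is logging every tracking event, to see whether detection fires at all or whether the problem is only with the instantiated prefab:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to the same GameObject as the ARTrackedImageManager (the XR Origin).
// Logs every image-tracking event so you can tell whether XR Simulation
// ever detects the reference image, independent of prefab instantiation.
public class ImageTrackingDebugger : MonoBehaviour
{
    ARTrackedImageManager manager;

    void Awake() => manager = GetComponent<ARTrackedImageManager>();

    void OnEnable() => manager.trackedImagesChanged += OnChanged;
    void OnDisable() => manager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var img in args.added)
            Debug.Log($"Added: {img.referenceImage.name}, state: {img.trackingState}");
        foreach (var img in args.updated)
            Debug.Log($"Updated: {img.referenceImage.name}, state: {img.trackingState}");
    }
}
```

With this attached I see no "Added" logs at all, which makes me think the simulated camera never actually sees the image, rather than the prefab lookup failing.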
I did manage to get tracking working once before; however, the XR Simulation camera would move around independently of the scene camera, resulting in the prefab not sitting on the tracked image. Since then, I have not been able to get the prefab to instantiate even once.
Please let me know whether my understanding of XR Simulation and AR Foundation is correct, and point me in the right direction.
My apologies for not uploading images; my account must be too new to do so.
Thanks