I’m creating an AR game where I need to tap a 3D object in my scene and shoot at its position. That’s essentially my problem.
Yes, as I understand from the ARFoundation documentation, you can only raycast against planes. I mean, currently, you can’t detect other 3D objects in the scene. Please, if I’m wrong, do you have a code example of how to detect an object by its tag, the way you would with a regular raycast outside of AR?
Now, I’m pretty sure that by now there must be people who need to raycast against a 3D object (not a plane). They probably bend the rules. Do you have an example of that you could share? What I’m looking for is to detect an object by its tag via raycasting, using ARFoundation.
A simple example of raycasting against a 3D object looks something like this:
void Update()
{
    RaycastHit hit;

    // Cast a ray forward from this object and inspect whatever collider it hits.
    if (Physics.Raycast(transform.position, transform.TransformDirection(Vector3.forward), out hit, Mathf.Infinity))
    {
        Debug.Log("Raycasting Detected: " + hit.collider.tag);

        if (hit.collider.CompareTag("DroneStoreItem"))
        {
            // Notify the item that it was hit.
            DroneStoreItem item = hit.collider.GetComponent<DroneStoreItem>();
            if (item != null)
            {
                item.DroneStoreBoxTouch();
            }
        }
    }
}
Now, how can I translate the previous code to use ARSessionOrigin, Pose, and a TrackableType (NOT a Plane or FeaturePoint)?
Thanks for your suggestion. But, as I understand it, IPointer only detects clicks with the mouse, not taps in an AR scene. Please, if I’m wrong, do you have example code of what you are suggesting?
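For what it’s worth, the pointer-event interfaces are not mouse-only: with an EventSystem in the scene and a PhysicsRaycaster on the AR camera, they also fire for touch taps on any object with a collider. A minimal sketch, with a placeholder class name:

using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical example component; attach it to an object that has a collider.
// Requires an EventSystem in the scene and a PhysicsRaycaster on the camera.
public class TapTarget : MonoBehaviour, IPointerClickHandler
{
    // Fires for touch taps as well as mouse clicks, because the EventSystem
    // converts touches into pointer events.
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Tapped: " + gameObject.name);
    }
}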
THANK YOU greenmachine. Your suggestion, as simple as it was, hit right in the center. It worked perfectly. Sometimes we entangle ourselves in complicated solutions when the obvious answer is right in front of us. Thank you for giving me a new and simple perspective.
It sounds like you’ve found an answer that suits you, but I was curious about your question:
The AR raycast only raycasts against things that have been detected by the device. Currently, handheld AR (ARCore & ARKit) only detect feature points (points in the point cloud) and planes. Are you trying to use the AR raycast as a general raycaster for (virtual) content in your scene?
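If the goal is hitting virtual content, a regular physics raycast from the tap position is usually enough. A minimal sketch of that idea, assuming the standard touch input API (the class name is just a placeholder):

using UnityEngine;

public class TapOnVirtualObject : MonoBehaviour   // hypothetical example class
{
    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Build a ray from the tap position through the camera.
        Ray ray = Camera.main.ScreenPointToRay(touch.position);

        // A regular physics raycast hits any collider in the scene, including
        // virtual 3D objects, which the AR raycast (trackables only) cannot do.
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, Mathf.Infinity))
        {
            Debug.Log("Tapped virtual object with tag: " + hit.collider.tag);
        }
    }
}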
Yes. I really think ARFoundation is awesome, but it needs a lot more documentation and examples. We’re working on several projects with it, but we have spent a lot of time trying to understand simple things. I hope you take that into consideration for the next updates.
Thanks for your support.
When you launched the experimental ARInterface, you created a lot of different cool examples. What I suggest is having those same examples for this official release.
Thank you for the documentation. By now, I almost know it by heart. But nothing beats an example.
Hello tonOnwu, I really need to know the details of how the above solution worked for you. I am new to Unity and I am working on a project I need to submit really soon!! Thanks for any help in advance.
Ray ray = Camera.main.ScreenPointToRay(touch.position);
raycastManager.Raycast(ray, hits, TrackableType.FeaturePoint);
which is okay, but unnecessary. The Raycast interface accepts screen points too.
I’m not sure what you mean by “wrong results”, but I noticed you aren’t checking whether the raycast actually hits anything, so I imagine it’s probably throwing an IndexOutOfRangeException when you unguardedly access hits[0].
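Putting both points together, a minimal sketch, assuming raycastManager, hits, and touch are the same variables as in the snippet above:

// Use the screen-point overload directly and only read hits[0] when the raycast reports a hit.
if (raycastManager.Raycast(touch.position, hits, TrackableType.FeaturePoint))
{
    // hits is sorted by distance, so hits[0] is the closest feature point that was hit.
    Pose hitPose = hits[0].pose;
    Debug.Log("Feature point hit at: " + hitPose.position);
}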
Thanks for the reply. I made it work. The issue was that the feature point pose is in local space. I converted it into world space and everything works fine.
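For anyone landing here later, a minimal sketch of one way to do that conversion, assuming the pose is relative to the ARSessionOrigin’s trackables parent (sessionOrigin and localPose are placeholder names; your setup may differ):

// Hypothetical conversion from a session/local-space pose to world space.
Transform parent = sessionOrigin.trackablesParent;
Vector3 worldPosition = parent.TransformPoint(localPose.position);
Quaternion worldRotation = parent.rotation * localPose.rotation;
Pose worldPose = new Pose(worldPosition, worldRotation);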