I’ve developed numerous AR apps and I’m about to start a new one. My question is: what is the preferred development preview that lets me test in Play Mode in the Editor without having to make a build?
I typically start with iOS development, and I found AR Foundation Remote to be indispensable. It works exactly as I would like, but I try to avoid add-ons if at all possible. At some point in development I need to delete/remove AR Foundation Remote to keep my builds smaller, and that’s a bit inconvenient.
I don’t really get Unity MARS. I don’t want a preview on my computer; I want to test with a device and point the device at my actual floor in my room. I can do this with a build. AR Foundation Remote lets me do this in the Editor without having to make a build. Is AR Foundation Remote still the best option?
Thanks! I got this working and it does seem to be a decent substitute for running on the device. It took me a while to find the ‘Simulator’ dropdown in the Game view and realize I can simulate various devices. Nice!
I was able to write a script to place an object on a plane:
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using EnhancedTouch = UnityEngine.InputSystem.EnhancedTouch;

// Attach to the XR Origin; RequireComponent guarantees the AR managers are present.
[RequireComponent(typeof(ARPlaneManager), typeof(ARRaycastManager))]
public class ObjectCreator : MonoBehaviour
{
    [SerializeField] private GameObject m_prefab;

    private ARRaycastManager _arRaycastManager;
    private ARPlaneManager _arPlaneManager;
    private List<ARRaycastHit> _hits = new List<ARRaycastHit>();

    private void Awake()
    {
        _arPlaneManager = GetComponent<ARPlaneManager>();
        _arRaycastManager = GetComponent<ARRaycastManager>();
    }

    private void OnEnable()
    {
        // TouchSimulation maps mouse clicks to touches so this also works in the Editor.
        EnhancedTouch.TouchSimulation.Enable();
        EnhancedTouch.EnhancedTouchSupport.Enable();
        EnhancedTouch.Touch.onFingerDown += FingerDown;
    }

    private void OnDisable()
    {
        EnhancedTouch.Touch.onFingerDown -= FingerDown;
        EnhancedTouch.TouchSimulation.Disable();
        EnhancedTouch.EnhancedTouchSupport.Disable();
    }

    private void FingerDown(EnhancedTouch.Finger finger)
    {
        // Only react to the first finger.
        if (finger.index != 0) return;

        // Raycast against detected planes and spawn the prefab at each hit pose.
        if (_arRaycastManager.Raycast(finger.currentTouch.screenPosition, _hits, TrackableType.PlaneWithinPolygon))
        {
            foreach (ARRaycastHit hit in _hits)
            {
                Pose pose = hit.pose;
                Instantiate(m_prefab, pose.position, pose.rotation);
            }
        }
    }
}
However, I would like to use the XR Interaction Toolkit because it has a nice feature for translating gestures into moving and scaling objects. I can’t seem to find how to enable TouchSimulation using a built-in Toolkit script. Does anyone have experience setting up TouchSimulation with the XR Interaction Toolkit so that it works with XR Simulation?
Sorry if it’s obvious, but there seems to be little information on using all of these XR tools together!
In summary, it’s not very easy to use XR Interaction Toolkit (XRI) with XR Simulation currently, but it is possible. XRI 2.5 will include support for XR Simulation. (No release date available for XRI 2.5 but likely coming sometime in Q3.)
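In the meantime, one workaround is to enable the Input System’s touch simulation yourself from a small component that lives alongside your XRI setup, rather than relying on a built-in Toolkit script. Here’s a minimal sketch (the class name is mine, and it assumes your project uses the new Input System):

using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;

// Minimal sketch: keeps EnhancedTouch and TouchSimulation enabled for the scene
// so mouse clicks in the Editor are delivered as touches to anything listening.
public class EditorTouchSimulation : MonoBehaviour
{
    private void OnEnable()
    {
        EnhancedTouchSupport.Enable();
        TouchSimulation.Enable();
    }

    private void OnDisable()
    {
        TouchSimulation.Disable();
        EnhancedTouchSupport.Disable();
    }
}

This doesn’t solve the XRI gesture integration by itself, but it at least gets simulated touches flowing in the Editor until 2.5 ships.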
Thanks! The template on GitHub provided in that thread worked (sorta) for me. I can enable touch to place and move objects, but then the right-click camera navigation doesn’t work. So I can disable touch, rotate and move my camera, and then enable it again to place an object. That should be just barely enough for me to start on my project.
I’ll be looking forward to a true fix for this, and of course it would be nice to simulate two-finger gestures like pinch and twirl. I am hoping that a swipe gesture will at least work with this technique.
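For anyone stuck on the same thing, here’s a rough sketch of how I’m toggling touch on and off with a key instead of doing it by hand: with touch off, right-click camera navigation works; with it on, I can tap to place objects. (The class name and key binding are mine; it assumes the new Input System.)

using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.EnhancedTouch;

// Rough sketch: toggles mouse-to-touch simulation with a key so XR Simulation's
// right-click camera navigation still works while touch simulation is off.
public class TouchSimulationToggle : MonoBehaviour
{
    [SerializeField] private Key m_toggleKey = Key.T;
    private bool m_touchEnabled;

    private void Update()
    {
        if (Keyboard.current == null || !Keyboard.current[m_toggleKey].wasPressedThisFrame)
            return;

        m_touchEnabled = !m_touchEnabled;
        if (m_touchEnabled)
        {
            EnhancedTouchSupport.Enable();
            TouchSimulation.Enable();
        }
        else
        {
            TouchSimulation.Disable();
            EnhancedTouchSupport.Disable();
        }
    }
}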
I’ve been making AR apps for at least 10 years, and honestly the development workflow has pretty much always been a nightmare without a working in-editor solution for common tasks like these. AR Foundation Remote works quite well, but it’s a pretty complex installation that I always have to remove at some point, thus losing my ability to troubleshoot in the editor. XRI solves one problem area, but having it work with XR Simulation will help a lot!