AR Foundation Remote | Test and debug your AR project in the Editor

AR Foundation Remote 2.0 is now available on the Asset Store!
[Read the blog post](https://discussions.unity.com/t/792788 page-13#post-7419692)

AR Foundation Editor Remote (v1.0)

AR Foundation Editor Remote (v1.0) is not going anywhere and will remain an essential AR debugging tool for years to come. Existing customers will receive the same high-quality support as before and can upgrade to v2.0 anytime for the price difference between the two versions.

Plugin blog posts: #1 #2 [#3](https://discussions.unity.com/t/792788 page-2#post-5986280) [#4](https://discussions.unity.com/t/792788 page-2#post-6009191) [#5](https://discussions.unity.com/t/792788 page-2#post-6056636) [#6](https://discussions.unity.com/t/792788 page-3#post-6141561) [#7](https://discussions.unity.com/t/792788 page-5#post-6309288) [#8](https://discussions.unity.com/t/792788 page-5#post-6374664) [#9](https://discussions.unity.com/t/792788 page-6#post-6421008) [#10](https://discussions.unity.com/t/792788 page-7#post-6525710) [#11](https://discussions.unity.com/t/792788 page-10#post-7046131) [#12](https://discussions.unity.com/t/792788 page-13#post-7419692) [#13](https://discussions.unity.com/t/792788 page-18#post-8194743) [#14](https://discussions.unity.com/t/792788 page-19#post-8277219) [#15](https://discussions.unity.com/t/792788 page-20#post-9006808)

Debugging any AR project is a nightmare. Currently, you’re required to wait for a build after any minor code change. This is annoying and unproductive.

:hourglass: Fast iterations are crucial for development, but currently you’re required to make a new build after any minor change, and builds take a long time even for small projects. Now you have the solution!

AR Foundation Editor Remote is an Editor extension that transmits AR data from an AR device to the Unity Editor. Run and debug AR projects right in the Unity Editor!

Current workflow with AR Foundation

  1. Make a change to your AR project.
  2. Build project to a real AR device.
  3. Wait for the build to complete.
  4. Wait.
  5. Wait a little bit more.
  6. Test your app on a real device using only Debug.Log().

Improved workflow with AR Foundation Editor Remote

  1. Set up the AR Companion app once. The setup takes only a few minutes.
  2. Make a change to your AR project.
  3. Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!

:zap: Features :zap:

• Precisely replicates the behavior of a real AR device in Editor.
• Supports all AR Foundation platforms. Extensively tested with ARKit and ARCore.
• Plug-and-play: no additional scene setup is needed, just run your AR scene in Editor with AR Companion running. Extensively tested with scenes from AR Foundation Samples repository.
• Streams video from Editor to real AR device so you can see how your app looks on it without making a build (see Limitations).
• Multi-touch input remoting: test multi-touch input or simulate touch using the mouse in Editor (see Limitations).
• Written in pure C# with no third-party libraries or native code. Adds no performance overhead in production. Full source code is available.
• Connect any AR device to a Windows PC or a Mac over Wi-Fi: iOS + Windows PC, Android + macOS… any combination you can imagine!
• Supports a wired connection on iOS + macOS.

:zap: Supported AR subsystems :zap:

• Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.
• Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).
• Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
• Body Tracking: ARKit 2D/3D body tracking, scale estimation.
• Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
• Image Tracking: supports mutable image library and replacement of image library at runtime.
• Depth Tracking (ARPointCloudManager): feature points, raycast support.
• Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.
• Camera CPU images: ARCameraManager.TryAcquireLatestCpuImage(), XRCpuImage.Convert(), XRCpuImage.ConvertAsync() (see Limitations).
• Anchors (ARAnchorManager): add/remove anchors, attach anchors to detected planes.
• Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.
• Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.
• Raycast subsystem: raycast against detected planes and cloud points (see Limitations).
• Object Tracking: ARKit object detection after scanning with the scanning app (see Limitations).
• ARKit World Map: full support of ARWorldMap. Serialize current world map, deserialize saved world map, and apply it to the current session.
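These subsystems are consumed through the standard AR Foundation API, so your scripts stay unchanged in Editor play mode. As one illustration, here is a minimal sketch (assuming an ARCameraManager in the scene; class and field names are mine) that reads the Light Estimation data listed above:

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class LightEstimationLogger : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;

        void OnEnable() { cameraManager.frameReceived += OnFrameReceived; }
        void OnDisable() { cameraManager.frameReceived -= OnFrameReceived; }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            // Values are nullable: they are only present if the device/platform provides them.
            var light = args.lightEstimation;
            if (light.averageBrightness.HasValue)
                Debug.Log("Average brightness: " + light.averageBrightness.Value);
            if (light.mainLightDirection.HasValue)
                Debug.Log("Main light direction: " + light.mainLightDirection.Value);
        }
    }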

Requirements

• Unity >= 2019.2.
• AR Device (iPhone with ARKit support, Android with ARCore support, etc.).
• AR Device and Unity Editor should be on the same Wi-Fi network (a wired connection is supported on iOS + macOS).
• AR Foundation >= 3.0.1.

Limitations

• Please check that your AR device supports the AR feature you want to test in Editor. For example, to test Meshing in Editor, your AR device should support Meshing.

• Video streaming and occlusion textures:

  • Are supported only if Editor Graphics API is set to Direct3D11 or Metal.
  • Default resolution scale is 0.33. You can increase the resolution in the plugin’s Settings, but this will result in higher latency and lower framerate.
  • Windows Unity Editor 2019.2: video and occlusion are not supported.

• Raycast subsystem: ARRaycastManager is implemented on top of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast(). Please add ARPlaneManager to your scene to raycast against detected planes and ARPointCloudManager to raycast against detected cloud points.
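For illustration, a minimal sketch of that requirement (class and field names are mine, not the plugin’s): raycasts against planes only return hits in the Editor when an ARPlaneManager is present and enabled.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class CenterScreenRaycaster : MonoBehaviour
    {
        [SerializeField] ARRaycastManager raycastManager;
        [SerializeField] ARPlaneManager planeManager;
        static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

        void Update()
        {
            // Without an ARPlaneManager in the scene, plane raycasts cannot return hits.
            if (planeManager == null || !planeManager.enabled)
                return;

            var screenCenter = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
            if (raycastManager.Raycast(screenCenter, hits, TrackableType.Planes))
                Debug.Log("Plane hit at " + hits[0].pose.position);
        }
    }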

• Touch input remoting and simulation:

  • Only Input Manager is supported (UnityEngine.Input).
  • Unity 2019.2: please add this line on top of every script that uses UnityEngine.Input:
    using Input = ARFoundationRemote.Input;
  • Unity 2019.2: UI system will not respond to touch events. Please use your mouse to test UI in Editor.
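For example, an existing touch-handling script only needs that alias added at the top; the rest of the code stays unchanged. A minimal sketch, assuming ARFoundationRemote.Input mirrors the UnityEngine.Input members your script already uses:

    using UnityEngine;
    using Input = ARFoundationRemote.Input; // Unity 2019.2 only: route Input calls through the plugin

    public class TapLogger : MonoBehaviour
    {
        void Update()
        {
            // Same Input API as before, now fed by touches streamed from the AR device.
            if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
                Debug.Log("Tap at " + Input.GetTouch(0).position);
        }
    }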

• ARKit Object Tracking:

  • Adding a new object reference library requires a new build of the AR Companion app.
  • Setting arSession.enabled = false after arSession.Reset() will on rare occasions crash the AR Companion app because of this bug.

• Camera CPU images:

  • ARCameraManager.TryAcquireLatestCpuImage() is not synchronized with the latest camera position.
  • Only one XRCpuImage.ConvertAsync() conversion is supported at a time.
  • CPU image conversions produce successful results, but only after a delay of several frames.
  • Occlusion CPU images (TryAcquireHumanStencilCpuImage, TryAcquireHumanDepthCpuImage, TryAcquireEnvironmentDepthCpuImage, TryAcquireEnvironmentDepthConfidenceCpuImage) are NOT supported. As an alternative, you can use Graphics.Blit() to copy textures and access them on CPU (see Texture2DSerializable.cs for example).
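As a sketch of that Graphics.Blit() alternative (this is the general idea only, not the plugin’s Texture2DSerializable.cs): blit the GPU texture, e.g. AROcclusionManager.humanStencilTexture, into a temporary RenderTexture and read it back into a Texture2D.

    using UnityEngine;

    public static class TextureReadback
    {
        public static Texture2D ToReadableTexture2D(Texture source)
        {
            // Copy the (possibly non-readable) GPU texture into a temporary RenderTexture.
            var rt = RenderTexture.GetTemporary(source.width, source.height, 0, RenderTextureFormat.ARGB32);
            var previous = RenderTexture.active;
            Graphics.Blit(source, rt);
            RenderTexture.active = rt;

            // Read the RenderTexture back into a CPU-accessible Texture2D.
            var readable = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
            readable.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            readable.Apply();

            RenderTexture.active = previous;
            RenderTexture.ReleaseTemporary(rt);
            return readable;
        }
    }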

Video review from Dilmer Valecillos:


Works well. No hassles in integration. This is an incredible time saver.

AR is particularly difficult to iterate on and desperately needs a good, high-frequency remote. The update rate of this remote is very high and suitable for any kind of AR work. I did not see significant latency either.

Real-time video playback would be nice to have, but the tool is totally functional without it.

Hello All,
this asset works like a charm and it’s a big time saver for me. With this tool you will never need to build the whole application just to test AR.
I had a small issue with Git (I didn’t have it installed), and the plugin needs Git to download the package.
The author’s support is also great: I had a problem with my code (the raycasting part) and got a lot of advice on how to solve it.
Radek Hart

Installation was straightforward to get it working on both Android and iOS, but I did come across a few bugs, or maybe features that have not yet been implemented.
For instance, I had some mobile touch inputs that spawn game objects; these were not working when running remotely: I would tap on my phone screen and nothing would spawn. I then changed them to PC controls (on click) and they worked, but only in the Unity Editor and not on the phone. The camera was also not being displayed in the Unity Editor, just a black background; it did, however, show on the phone.
Nevertheless, these were minor inconveniences. Once I changed the interaction to PC clicks I was able to debug and program much quicker and without hassle, so thank you for that, sir!

Huge thanks for the feedback!

I’m sorry about the Git part, I didn’t realize the Unity Package Manager doesn’t already include it.

Input methods for the Editor and mobile are different, and you should handle them differently. My plugin manages only the remoting part. I’m glad you managed to fix the issue in the end.

UPDATE:
Touch input remoting is already available starting from version 3.5.1:
https://discussions.unity.com/t/792788 page-2#post-5986280

The camera background is not currently implemented. In my experience, the camera background is not a crucial part of Editor testing.

UPDATE:
Camera background is supported starting from version 3.8.9:
https://discussions.unity.com/t/792788 page-2#post-6056636


It seems like supporting the input methods that are already coded into our apps would be important. Is there a way to automatically capture the mobile touch events and spoof the equivalent desktop clicks? Otherwise users of your remote will need to write a bunch of extra code to make their app’s interactions testable in the remote.


+1


This idea is great! But, unfortunately, there is no simple way to simulate touches in the Editor.

  1. Unity already maps touches to Input.GetMouseButtonDown(0), Input.GetMouseButton(0), and Input.GetMouseButtonUp(0) on mobile. You can use these methods if your app does not handle multiple touches at once.

  2. There is no way to substitute Input.touches with another implementation in the Editor. While I could write an InputWrapper class that translates mouse events into touch gestures, users would be required to replace all usages of the Input class with InputWrapper. This is not a great solution because your code would then depend on my plugin. In addition, this method would still not be able to simulate multitouch.

  3. If your app needs to handle multitouch correctly on all platforms, there is a free TouchScript plugin. It works perfectly and can simulate multitouch in the Editor. If you look into its source code, you’ll understand that touch simulation in Editor is far from an easy task :slight_smile:

To sum it all up, I wanted my plugin to solve one problem and solve it well.
I will think about how to write the InputWrapper class so it has the smallest impact on existing projects.

UPDATE:
Touch input remoting is already available starting from version 3.5.1:
https://discussions.unity.com/t/792788 page-2#post-5986280

Here is the script to handle touches in Editor. It simulates only one-finger gestures.
I’ll add it to the next version of the plugin.

It is the cleanest solution, requires the least amount of changes to your codebase, and needs only this line on top of all scripts that use UnityEngine.Input:
using Input = ARFoundationRemote.Input;

Attachment: SimulateTouchWithMouse.cs (4.02 KB)
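For reference, the idea behind such a wrapper looks roughly like this (a hand-written sketch, not the attached SimulateTouchWithMouse.cs and not the plugin’s ARFoundationRemote.Input; the class name is hypothetical): expose a single simulated touch built from the mouse and forward everything else to UnityEngine.Input.

    using UnityEngine;

    public static class MouseTouchInput // hypothetical name; the plugin ships ARFoundationRemote.Input
    {
        public static int touchCount =>
            UnityEngine.Input.GetMouseButton(0) || UnityEngine.Input.GetMouseButtonUp(0) ? 1 : 0;

        public static Touch GetTouch(int index)
        {
            // Map the mouse button state to a one-finger touch phase.
            return new Touch {
                fingerId = 0,
                position = UnityEngine.Input.mousePosition,
                phase = UnityEngine.Input.GetMouseButtonDown(0) ? TouchPhase.Began
                      : UnityEngine.Input.GetMouseButtonUp(0) ? TouchPhase.Ended
                      : TouchPhase.Moved
            };
        }

        // Everything else falls through to the regular input class.
        public static Vector3 mousePosition => UnityEngine.Input.mousePosition;
        public static bool GetMouseButtonDown(int button) => UnityEngine.Input.GetMouseButtonDown(button);
    }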

Hi, What about Face Tracking?

Face tracking is on the roadmap. It seems like it’s the most requested feature.


Great, just what I needed. Dunno why Unity dropped AR Remote in 2019… I am just starting AR for a new project and I think this is essential for the dev/build/test cycle.

PLUGIN UPDATE POST #1

Hello again, AR remoters!
The new AR subsystem has arrived!

I spent the whole of last week developing Face Tracking Remote, and here it is in action:

Supported features:

  • face mesh
  • face pose
  • eye tracking

Please tell me what other subsystems you would like to see in the future :slight_smile:

All current users will receive the update with Face Tracking for free.

PLUGIN UPDATE POST #2


Cool. And what about Android?


The short answer: Android is supported :slight_smile:

The long answer: the plugin is platform-agnostic so if your Android device supports face tracking, then Face Tracking will work in Unity Editor.


This is really awesome. Would it be possible to add the ARKit Blendshape feature? It’s the only AR feature we really need, so if you add it, I’ll definitely buy it.

zulaman, yes, iOS Blendshapes are possible! If everything goes well, I’ll release them tomorrow. They are already working, I just need to make minor refinements.


I’m having trouble with raycasting too. Could you elaborate a bit on your case?
Basically, I’m just trying to cast a ray from the center of the screen to a plane and have something on that location. Nothing major I would assume.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class PlacementIndicatorController : MonoBehaviour // class name is illustrative
    {
        public GameObject placementIndicator;
        public Camera currentCam;

        private ARRaycastManager _arRaycastManager;
        private Vector2 screenCenter;

        static List<ARRaycastHit> hits = new List<ARRaycastHit>();

        private void Awake()
        {
            _arRaycastManager = GetComponent<ARRaycastManager>();
        }

        void Start()
        {
            // Cache the screen-space center point used for every raycast.
            screenCenter = currentCam.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
        }

        void Update()
        {
            // Raycast from the screen center against detected planes and move the indicator to the first hit.
            if (_arRaycastManager.Raycast(screenCenter, hits, TrackableType.Planes))
            {
                var hitPose = hits[0].pose;
                placementIndicator.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            }
        }
    }

This works when I build the app in the usual manner, but I can’t seem to receive the raycast back from my device. How would I go about this using this plugin?

Thanks in advance!

@jpvanmuijen , ARRaycastManager is a manager for XRRaycastSubsystem, and XRRaycastSubsystem is not currently implemented in the plugin.

You can find MultiTouchRaycastExample.cs included in the plugin, which shows how to raycast against tracked planes and cloud points with the help of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast().

Please tell me if this helps.

UPDATE:
The plugin already supports ARRaycastManager. Please find the updated example attached.

Attachment: MultiTouchRaycastExample.cs (2.96 KB)

Thanks for your quick reply!
I had a look at your script before, but couldn’t quite figure out how to shape it the way I wanted.
After your post, I tweaked it a bit here and there and now it’s working like a charm. I mainly had to change the return type to a Pose and make the object rotate along with the camera’s y-axis.
Here’s my code, if anyone is interested. The lockUnlockObject method locks the object in its current position via a button.
Suggestions are welcome, by the way.

    using System.Linq;
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class LockableObjectPlacer : MonoBehaviour // class name is illustrative
    {
        [SerializeField] bool hitPlanes = true;

        public GameObject objectToPlace;
        public Camera currentCam;

        private ARPlaneManager _arPlaneManager;
        private Vector2 screenCenter;
        private Pose? hitPose;
        private bool objectLocked;

        private void Awake()
        {
            _arPlaneManager = GetComponent<ARPlaneManager>();
        }

        void Start()
        {
            screenCenter = currentCam.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
        }

        void Update()
        {
            var ray = currentCam.ScreenPointToRay(screenCenter);
            if (objectLocked == false)
            {
                var pose = tryHitPlanes(ray);
                if (pose.HasValue)
                {
                    // Keep the object upright, rotating only around the camera's y-axis.
                    objectToPlace.transform.SetPositionAndRotation(pose.Value.position, Quaternion.Euler(0, currentCam.transform.eulerAngles.y, 0));
                }
            }
        }

        // Raycasts against detected planes and returns the most recent hit pose, if any.
        Pose? tryHitPlanes(Ray ray)
        {
            if (hitPlanes && _arPlaneManager != null)
            {
                using (var hits = _arPlaneManager.Raycast(ray, TrackableType.Planes, Allocator.Temp))
                {
                    if (hits.IsCreated && hits.Any())
                    {
                        hitPose = hits.First().pose;
                    }
                }
            }
            return hitPose;
        }

        // Hooked up to a UI button: locks/unlocks the object in its current position.
        public void lockUnlockObject()
        {
            objectLocked = !objectLocked;
        }
    }

Thanks again, this saves me quite a bit of time!
