AR Foundation Remote 2.0 is available now on Asset Store!
[Read the blog post]( AR Foundation Remote | Test and debug your AR project in the Editor page-13#post-7419692)
AR Foundation Editor Remote (v1.0)
AR Foundation Editor Remote (v1.0) is not going anywhere: it remains an essential AR debugging tool and will stay supported for years to come. Existing customers will receive the same high-quality support as before and can upgrade to v2.0 anytime for the price difference between the two versions.
Plugin blog posts: ** #1 #2 [#3]( AR Foundation Remote | Test and debug your AR project in the Editor page-2#post-5986280) [#4]( AR Foundation Remote | Test and debug your AR project in the Editor page-2#post-6009191) [#5]( AR Foundation Remote | Test and debug your AR project in the Editor page-2#post-6056636) [#6]( AR Foundation Remote | Test and debug your AR project in the Editor page-3#post-6141561) [#7]( AR Foundation Remote | Test and debug your AR project in the Editor page-5#post-6309288) [#8]( AR Foundation Remote | Test and debug your AR project in the Editor page-5#post-6374664) [#9]( AR Foundation Remote | Test and debug your AR project in the Editor page-6#post-6421008)** [#10]( AR Foundation Remote | Test and debug your AR project in the Editor page-7#post-6525710) [#11]( AR Foundation Remote | Test and debug your AR project in the Editor page-10#post-7046131) [#12]( AR Foundation Remote | Test and debug your AR project in the Editor page-13#post-7419692) [#13]( AR Foundation Remote | Test and debug your AR project in the Editor page-18#post-8194743) [#14 ]( AR Foundation Remote | Test and debug your AR project in the Editor page-19#post-8277219)[#15]( AR Foundation Remote | Test and debug your AR project in the Editor page-20#post-9006808)
Debugging any AR project is a nightmare: after any minor code change you have to wait for a new build. This is annoying and unproductive.
Fast iteration is crucial for development, but builds take a looooong time even for small projects. Now you have the solution!
AR Foundation Editor Remote is an Editor extension that transmits AR data from an AR device to the Unity Editor. Run and debug AR projects right in the Unity Editor!
Current workflow with AR Foundation
- Make a change to your AR project.
- Build project to a real AR device.
- Wait for the build to complete.
- Wait.
- Wait a little bit more.
- Test your app on a real device using only Debug.Log().
Improved workflow with AR Foundation Editor Remote
- Set up the AR Companion app once. The setup takes only a few minutes.
- Make a change to your AR project.
- Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!
Features
• Precisely replicates the behavior of a real AR device in Editor.
• Supports all AR Foundation platforms. Extensively tested with ARKit and ARCore.
• Plug-and-play: no additional scene setup is needed; just run your AR scene in the Editor with the AR Companion running. Extensively tested with scenes from the AR Foundation Samples repository.
• Streams video from Editor to real AR device so you can see how your app looks on it without making a build (see Limitations).
• Multi-touch input remoting: test multi-touch input or simulate touch using the mouse in Editor (see Limitations).
• Written in pure C# with no third-party libraries or native code. Adds no performance overhead in production. Full source code is available.
• Connect any AR Device to Windows PC or macOS via Wi-Fi: iOS + Windows PC, Android + macOS… any variation you can imagine!
• Supports wired connection on iOS + macOS.
Supported AR subsystems
• Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.
• Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).
• Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
• Body Tracking: ARKit 2D/3D body tracking, scale estimation.
• Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
• Image Tracking: supports mutable image library and replacement of image library at runtime.
• Depth Tracking (ARPointCloudManager): feature points, raycast support.
• Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.
• Camera CPU images: ARCameraManager.TryAcquireLatestCpuImage(), XRCpuImage.Convert(), XRCpuImage.ConvertAsync() (see Limitations).
• Anchors (ARAnchorManager): add/remove anchors, attach anchors to detected planes.
• Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.
• Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.
• Raycast subsystem: raycast against detected planes and cloud points (see Limitations).
• Object Tracking: ARKit object detection after scanning with the scanning app (see Limitations).
• ARKit World Map: full support of ARWorldMap. Serialize current world map, deserialize saved world map, and apply it to the current session.
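To illustrate the raycast support listed above, here is a minimal sketch using plain AR Foundation APIs (the script name `TapToRaycast` is hypothetical). It raycasts against detected planes from a screen touch; an ARRaycastManager must be present in the scene, plus an ARPlaneManager to supply the planes:

```csharp
// Minimal sketch: raycast against detected planes from a screen tap.
// Assumes ARRaycastManager and ARPlaneManager exist in the scene.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToRaycast : MonoBehaviour {
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update() {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon)) {
            // hits[0] is the nearest hit; its pose lies on the detected plane.
            Pose pose = hits[0].pose;
            Debug.Log($"Hit plane at {pose.position}");
        }
    }
}
```

On Unity 2019.2, remember the `using Input = ARFoundationRemote.Input;` alias described in Limitations below so the touch input reaches the script in the Editor.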
Requirements
• Unity >= 2019.2.
• AR Device (iPhone with ARKit support, Android with ARCore support, etc.).
• AR Device and Unity Editor should be on the same Wi-Fi network (a wired connection is supported on iOS + macOS).
• AR Foundation >= 3.0.1.
Limitations
• Please check that your AR device supports the AR feature you want to test in Editor. For example, to test Meshing in Editor, your AR device should support Meshing.
• Video streaming and occlusion textures:
- Are supported only if Editor Graphics API is set to Direct3D11 or Metal.
- Default resolution scale is 0.33. You can increase the resolution in the plugin’s Settings, but this will result in higher latency and lower framerate.
- Windows Unity Editor 2019.2: video and occlusion are not supported.
• Raycast subsystem: ARRaycastManager is implemented on top of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast(). Please add ARPlaneManager to your scene to raycast against detected planes and ARPointCloudManager to raycast against detected cloud points.
• Touch input remoting and simulation:
- Only Input Manager is supported (UnityEngine.Input).
- Unity 2019.2: please add this line at the top of every script that uses UnityEngine.Input:

```csharp
using Input = ARFoundationRemote.Input;
```

- Unity 2019.2: the UI system will not respond to touch events. Please use your mouse to test UI in the Editor.
• ARKit Object Tracking:
- Adding a new object reference library requires a new build of the AR Companion app.
- Setting arSession.enabled = false after arSession.Reset() will on rare occasions crash the AR Companion app because of this bug.
• Camera CPU images:
- ARCameraManager.TryAcquireLatestCpuImage() is not synchronized with the latest camera position.
- Only one XRCpuImage.ConvertAsync() conversion is supported at a time.
- CPU image conversions do succeed, but with a delay of several frames.
- Occlusion CPU images (TryAcquireHumanStencilCpuImage, TryAcquireHumanDepthCpuImage, TryAcquireEnvironmentDepthCpuImage, TryAcquireEnvironmentDepthConfidenceCpuImage) are NOT supported. As an alternative, you can use Graphics.Blit() to copy textures and access them on CPU (see Texture2DSerializable.cs for example).
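For reference, here is a minimal sketch of the supported CPU image path, using plain AR Foundation APIs: ARCameraManager.TryAcquireLatestCpuImage() plus a synchronous XRCpuImage.Convert(). The script name `CpuImageReader` is hypothetical, and unsafe code must be enabled in the project:

```csharp
// Minimal sketch: read the latest camera image on the CPU.
// Note the limitations above: results arrive with a few frames of delay,
// and only one ConvertAsync() conversion can be in flight at a time.
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageReader : MonoBehaviour {
    [SerializeField] ARCameraManager cameraManager;

    unsafe void Update() {
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image)) return;

        var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);
        int size = image.GetConvertedDataSize(conversionParams);
        var buffer = new NativeArray<byte>(size, Allocator.Temp);

        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
        image.Dispose(); // always dispose the XRCpuImage promptly

        // ...use 'buffer' here (e.g. load it into a Texture2D)...
        buffer.Dispose();
    }
}
```

For occlusion textures, which have no CPU image support in the plugin, the Graphics.Blit() workaround mentioned above (see Texture2DSerializable.cs) is the way to get the pixels onto the CPU.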
Video review from Dilmer Valecillos: