I was trying out the new ARKit plugin.
I'm using Unity 2017.1 and an iPhone 7 Plus with iOS 11.1 beta.
ARKit Remote debugging is working nicely.
Problem: I want to run apps in the Editor and use the iPhone for the AR capabilities.
When I run the example projects, they don't work, because they depend on touch input.
So I basically rewrote those scripts to use mouse input instead.
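The change amounts to a small input wrapper. Here's a minimal sketch of that kind of wrapper (the class and method names are my own, not part of the ARKit plugin): in the Editor it treats a left mouse click as a tap, on device it reads the first touch.

```csharp
using UnityEngine;

// Hypothetical helper, not from the plugin: lets the same script
// handle "taps" both in the Editor and on the device.
public static class CrossPlatformInput
{
    // Returns true (and the screen position) when a tap began this frame.
    public static bool TryGetTapPosition(out Vector2 screenPos)
    {
#if UNITY_EDITOR
        // In the Editor there are no touches, so fall back to the mouse.
        if (Input.GetMouseButtonDown(0))
        {
            screenPos = Input.mousePosition;
            return true;
        }
#else
        // On device, use the first touch that just began.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            screenPos = Input.GetTouch(0).position;
            return true;
        }
#endif
        screenPos = Vector2.zero;
        return false;
    }
}
```

Then the example scripts just call `CrossPlatformInput.TryGetTapPosition(...)` instead of reading `Input.GetTouch(0)` directly.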
I also looked at UnityARSessionNativeInterface. A lot of the code does nothing in UNITY_EDITOR mode; for example, the HitTest API just returns an empty result without any notice. So the ARKit example is useless in the Editor, even though the ARKit Remote plugin is running.
Am I missing something?
How should I develop an app without manually deploying it to the device with Xcode every time?
What's the point of the ARKit Remote plugin if I can't use ARKit in the Editor?
(updated to answer the question, which I missed the first time)
I think what you are looking for is ARKit Remote plus hit testing in the Editor. Both are included in the ARKit Unity package. I have worked with it for the last few months, and even though it isn't perfect, it is sometimes nice to debug with AR capabilities in the Editor. I'd say ARKit Remote has its rough edges, but it's a lot more convenient than building the app every time.
You can simulate touch input with the mouse in ARKit Remote: basically, clicking in the Editor simulates a touch on the device. EditorHitTest.cs is what makes this possible. The only odd thing is that you click in the Game view in Unity, not on the actual device :)
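For the idea, here is a simplified sketch of what EditorHitTest.cs does (field names assumed to mirror the ARKit example scripts): on a mouse click in the Editor, raycast into the scene against the plane colliders that ARKit Remote generates, and place the target object at the hit point.

```csharp
using UnityEngine;

// Simplified sketch of EditorHitTest.cs from the ARKit Unity package;
// not a verbatim copy of the shipped script.
public class EditorHitTestSketch : MonoBehaviour
{
    public Transform m_HitTransform; // object to place, e.g. the HitCube

    void Update()
    {
#if UNITY_EDITOR
        if (Input.GetMouseButtonDown(0))
        {
            // Cast a ray from the Game view click into the scene.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            // The generated ARKit planes need colliders for this to hit anything.
            if (Physics.Raycast(ray, out hit, 30f))
            {
                m_HitTransform.position = hit.point;
                m_HitTransform.rotation = hit.transform.rotation;
            }
        }
#endif
    }
}
```

Note the whole body is wrapped in UNITY_EDITOR, so on device the real ARKit hit test runs instead; this physics raycast is just the Editor-side stand-in.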
Link to the blog post introducing ARKit Remote:
Hope it helps! And that the updated answer actually answers the question.