ARFoundation Preview 17 Published

Hi folks,

We published ARFoundation 1.0.0-preview.17 earlier this week. See the changelog here. (Note the last public version was 1.0.0-preview.14.)

Cheers,
Tim

Still no in-editor emulation :frowning:


Hey @tdmowrer! I'm testing out the latest ARFoundation API / ARKit XR plugin. I've set the Camera property on the ARSessionOrigin to the same Camera I've assigned to a 'screen space canvas' for my various UI things, and Bad Things happen when the AR system moves the camera.

The UI canvas jiggles a bit, and the rect region of the screen-space canvas is cropped in a bit. Curious if you've seen this before… it feels like the update order is out of whack, since the 2D UI elements are subtly jiggling, but it might be something else, because the UI is also cropped in.

My workaround right now is to use an entirely different camera for the 2D UI than for AR, which is probably less than ideal.

Using a different camera for the 2D UI isn't necessarily a bad idea. It's probably the way I would go. But here's why Bad Things happen in your current setup:

The default update option on the TrackedPoseDriver is "Update and Before Render":
(screenshot: TrackedPoseDriver inspector showing the Update Type setting)
ā€œBefore Renderā€ means it will sample the device pose and update the cameraā€™s transform at the last possible moment before rendering. This produces the lowest possible latency and provides the best visual result to make the virtual objects line up with the real world. However, if you draw things in an Update function, then they will lag by a frame. One option is to subscribe to Application.onBeforeRender and do your UI update there.

But, as I mentioned, a second UI-only camera isn't a bad idea.
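
If it helps, a rough sketch of what that second-camera setup could look like in code; the component, field names, and the assumption that the UI lives on a "UI" layer are illustrative, not taken from this thread:

```csharp
using UnityEngine;

// Hypothetical setup: render the screen-space UI with a dedicated camera
// that is not driven by the TrackedPoseDriver, composited over the AR camera.
public class UICameraSetup : MonoBehaviour
{
    [SerializeField] Camera uiCamera;   // a second camera, not the AR camera
    [SerializeField] Canvas uiCanvas;   // the screen-space canvas for 2D UI

    void Awake()
    {
        // Only render the UI layer with this camera, and draw it on top
        // of the AR camera's output without clearing its color.
        uiCamera.clearFlags = CameraClearFlags.Depth;
        uiCamera.cullingMask = LayerMask.GetMask("UI");
        uiCamera.depth = 10; // higher depth renders after the AR camera

        // Bind the canvas to the UI-only camera instead of the AR camera.
        uiCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        uiCanvas.worldCamera = uiCamera;
    }
}
```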

Thanks Tim, that's very interesting. Is the 'low latency rendering' something new to the XR API integration? (Does that not update the same way / as efficiently in the ARKit plugin on Bitbucket?)

Also, how would I do an update in the way you mention on a UI Canvas? Is there something similar to Camera.Render() on a disabled camera that exists for a Canvas? I saw Canvas.ForceUpdateCanvases, but that's probably not the update loop I'm looking for.

It's not really low latency rendering; it's just that the camera's position and orientation are updated at the last possible moment before rendering, providing low latency pose information.

On rereading your question, I'm not sure I fully understand the problem, so I suggest you submit a bug with a small project that reproduces the issue. In the meantime, if using a second UI-only camera works for you, then I would go with that.

I'll use a second camera for now, but there are reasons why I might want to use the same camera as the AR Camera in the future. Submitted a super simple bug report, case #1082782, that uses ARFoundation's SampleScene.unity and just adds some canvas elements.

Any news regarding this issue?


Just downloaded it to have a play around, and I'm hitting the same black-screen issue as reported on GitHub. Tried rolling back ARCore and ARFoundation to 16, still just a black screen. Nokia 7+.

Same here on Pixel… Has anyone figured it out?