Hi folks,
We published ARFoundation 1.0.0-preview.17 earlier this week. See the changelog here. (Note the last public version was 1.0.0-preview.14.)
Cheers,
Tim
Still no in-editor emulation
Hey @tdmowrer! I'm testing out the latest ARFoundation API / ARKit XR plugin. I set the Camera property on the ARSessionOrigin to the same Camera I have assigned to a "screen space canvas" for my various UI things, and Bad Things happen when the AR system moves the camera.
The UI canvas jiggles a bit, and the rect region of the screen space canvas is cropped in slightly. Curious if you've seen this before... it feels like the update order is out of whack, since the 2D UI elements are subtly jiggling, but it might be something else because the UI is also cropped in.
My workaround right now is to use an entirely different camera for the 2D UI than for my AR camera, which is probably less than ideal.
Using a different camera for the 2D UI isn't necessarily a bad idea. It's probably the way I would go. But here's why Bad Things happen in your current setup:
The default Update Type on the TrackedPoseDriver is "Update and Before Render":
"Before Render" means it will sample the device pose and update the camera's transform at the last possible moment before rendering. This produces the lowest possible latency and the best visual result, keeping virtual objects lined up with the real world. However, if you draw things in an Update function, they will lag by a frame. One option is to subscribe to Application.onBeforeRender and do your UI update there.
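A minimal sketch of that suggestion, assuming a MonoBehaviour attached to the UI object (`RepositionUI` is a hypothetical placeholder for whatever camera-dependent layout code you currently run in Update):

```csharp
using UnityEngine;

// Moves camera-dependent UI work from Update() into Application.onBeforeRender,
// so it runs in the same pre-render phase where the TrackedPoseDriver updates
// the camera's transform. Note that ordering among onBeforeRender subscribers
// is not guaranteed without extra care, so this is a sketch, not a guarantee.
public class LateUIUpdater : MonoBehaviour
{
    void OnEnable()
    {
        Application.onBeforeRender += RepositionUI;
    }

    void OnDisable()
    {
        Application.onBeforeRender -= RepositionUI;
    }

    void RepositionUI()
    {
        // Hypothetical placeholder: update any UI elements that depend on the
        // camera's position/orientation here instead of in Update().
    }
}
```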
But, as I mentioned, a second UI-only camera isn't a bad idea.
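For the two-camera route, a rough setup sketch (assumptions: your UI objects sit on Unity's built-in "UI" layer, and the UI camera should draw on top of the AR camera's image):

```csharp
using UnityEngine;

// One-time configuration sketch for a separate UI-only camera.
// The same settings can also be applied by hand in the Inspector.
public class UICameraSetup : MonoBehaviour
{
    public Camera uiCamera;

    void Start()
    {
        // Don't wipe the AR camera's image; only clear depth before drawing UI.
        uiCamera.clearFlags = CameraClearFlags.Depth;
        // Render only objects on the "UI" layer.
        uiCamera.cullingMask = 1 << LayerMask.NameToLayer("UI");
        // Draw after the AR camera (which defaults to depth 0).
        uiCamera.depth = 1;
    }
}
```

The screen space canvas would then reference this camera instead of the AR camera, so the TrackedPoseDriver never touches the transform the canvas depends on.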
Thanks Tim, that's very interesting. Is the "low latency rendering" something new to the XR API integration? (Does that not update the same way, or as efficiently, in the ARKit plugin on Bitbucket?)
Also, how would I do an update in the way you mention on a UI Canvas? Is there something similar to Camera.Render() on a disabled camera that exists for a Canvas? I saw Canvas.ForceUpdateCanvases, but that's probably not the update loop I'm looking for.
It's not really low latency rendering; it's just that the camera's position and orientation are updated at the last possible moment before rendering, providing low latency pose information.
On rereading your question, I'm not sure I fully understand the problem, so I suggest you submit a bug with a small project that reproduces the issue. In the meantime, if using a second UI-only camera works for you, then I would go with that.
I'll use a second camera for now, but there are reasons why I might want to use the same camera as the AR Camera in the future. Submitted a super simple bug report, case #1082782, that uses ARFoundation's SampleScene.unity and just adds some canvas elements.
Any news regarding this issue?
Just downloaded it to have a play around; same black-screen issue as reported on GitHub. Tried rolling back ARCore and ARFoundation to preview.16, still just a black screen. Nokia 7 Plus.
Same here on Pixel... Has anyone figured it out?