I noticed what I believe is a bug with the Poke interaction in XRI.
We have a UI that appears by starting very small (e.g. 0.0001 scale) and then tweening up to full size; within that UI are poke buttons. A script controls this UI: on enable, it activates the UI's GameObject, sets the scale to the small value, and starts the tween.
Because the Poke button's first activation happens while the UI is scaled down, it calculates interactionAxisLength from its world-space bounds, which are very small at that point, and as far as I can tell there is no way to force a recalculation.
Our workaround is to activate the UI at full scale initially, then delay a frame before scaling it down. A better fix would be a change to XRI to either not use the world bounds, or to allow setting a fixed interactionAxisLength.
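For reference, a minimal sketch of our frame-delay workaround. The component and field names here are illustrative (not from XRI), and the tween is a plain Lerp standing in for whatever tween library is actually in use:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical helper: show the UI at full scale for one frame so the
// poke components compute their bounds correctly, then shrink and tween up.
public class PokeSafeUIShow : MonoBehaviour
{
    [SerializeField] GameObject uiRoot;       // root of the poke-button UI
    [SerializeField] float smallScale = 0.0001f;
    [SerializeField] float tweenDuration = 0.25f;

    void OnEnable()
    {
        // Activate at full scale so Start (and the poke setup that runs
        // there) sees the correct world bounds.
        uiRoot.transform.localScale = Vector3.one;
        uiRoot.SetActive(true);
        StartCoroutine(ScaleDownNextFrameThenTween());
    }

    IEnumerator ScaleDownNextFrameThenTween()
    {
        // Wait one frame so Start has run on the newly activated objects.
        yield return null;
        float t = 0f;
        while (t < tweenDuration)
        {
            t += Time.deltaTime;
            float s = Mathf.Lerp(smallScale, 1f, t / tweenDuration);
            uiRoot.transform.localScale = Vector3.one * s;
            yield return null;
        }
        uiRoot.transform.localScale = Vector3.one;
    }
}
```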
The relevant code within the package: XRPokeLogic.Initialize calls XRPokeLogic.ComputeBounds without passing a targetSpace argument, so it defaults to Space.World, and then passes the result to XRPokeLogic.ComputeInteractionAxisLength.
The reason we cannot simply activate the UI and then scale it down in the same frame (without the frame delay) is that XRPokeFilter does its Setup during Start, which does not happen synchronously when we activate the GameObject. We may also be able to work around it by setting the pokeCollider property, which is public and calls Setup.
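That alternative workaround could look roughly like the sketch below: once the UI has reached full scale, re-assign each filter's pokeCollider to its current value so Setup runs again against the correct bounds. This is untested and assumes the XRI 2.x namespace for XRPokeFilter:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Filtering;

// Hypothetical helper: force XRPokeFilter to recompute its poke bounds
// by re-assigning the public pokeCollider property (which calls Setup).
public static class PokeFilterRefresher
{
    public static void Refresh(GameObject uiRoot)
    {
        foreach (var filter in uiRoot.GetComponentsInChildren<XRPokeFilter>(true))
        {
            // Self-assignment is deliberate: the setter re-runs Setup,
            // so interactionAxisLength is recomputed at the current scale.
            filter.pokeCollider = filter.pokeCollider;
        }
    }
}
```

Call Refresh after the UI reaches full scale; this avoids having to delay the scale-down by a frame, at the cost of depending on an implementation detail of the property setter.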