to subscribe to a /raw_image topic and can successfully see the stream when I create a UI → Raw Image object in Unity. Although I’m thrilled that I can at least see the camera feed, I am struggling to bring this stream into the MRTK/HoloLens scope of things. Ideally I would like to add the image texture to something like MRTK’s Slate prefab to make it manipulable. Any help, guidance, tutorials, or direction would be greatly appreciated. I’m still new to all of this and still learning.
Hi, I won’t be able to give you a full answer as my experience with VR/AR is limited and I have no experience with HoloLens. The only things I can think of that might help you right now are these:
If you’re already able to stream the messages to a UI → Raw Image object that’s inside a canvas, make sure to change the Render Mode of that canvas to World Space (or create a new canvas for this).
If it’s in screen space, it is (as far as I know) ignored by the HMDs. If it’s in world space, you can place the canvas anywhere in your scene and you’ll see it in VR/AR. You’ll be able to do things like make the canvas always rotate towards the user, or even attach it to your virtual hand. You can use it the same way as any other canvas and display any UI on it.
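For example, the “always rotate towards the user” behaviour can be done with a small script attached to the World Space canvas. This is just a minimal sketch, assuming `Camera.main` returns the headset camera (true by default in Unity XR rigs); the class name `FaceUser` is purely illustrative:

```csharp
using UnityEngine;

// Attach this to the World Space canvas so it always faces the user.
public class FaceUser : MonoBehaviour
{
    void LateUpdate()
    {
        Camera cam = Camera.main;  // assumed to be the HMD camera
        if (cam == null) return;

        // Point the canvas away from the camera so its front face
        // (the Raw Image) stays readable to the user.
        transform.rotation = Quaternion.LookRotation(
            transform.position - cam.transform.position);
    }
}
```

MRTK also ships its own billboard/solver components that do this (and more) without custom code, so it may be worth checking those before rolling your own.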
If you’re still struggling, it might help to post in the AR/VR (XR) Discussion forum, as people there should be able to help with this more.