Struggling to Stream /raw_image msgs to an MRTK Prefab

Hi,

I am currently developing an app for the HoloLens 2 and struggling to find a GameObject to attach a Raw Image to so I can stream a camera feed from a Raspberry Pi into the app. I followed @maciejw94's code found here:
Question - How to Place ROS Image Messages in Scene and View in HMD? - Unity Forum

to subscribe to a /raw_image topic, and I can successfully see the stream when I create a UI → Raw Image object in Unity. Although I'm thrilled that I can at least see the camera feed, I am struggling to bring this stream into the MRTK/HoloLens scope of things. Ideally I would like to apply the image texture to something like MRTK's Slate prefab so it can be manipulated. Any help, guidance, tutorials, or direction would be greatly appreciated. I'm still new to all of this and still learning 🙂
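
For context, this is roughly the shape of what I have working right now (a simplified sketch, assuming the ROS-TCP-Connector package and an rgb8-encoded image; the class and field names are just placeholders, not the exact code from the linked thread):

```csharp
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;
using UnityEngine;
using UnityEngine.UI;

public class RawImageStreamSubscriber : MonoBehaviour
{
    [SerializeField] RawImage targetImage;   // the UI -> Raw Image component
    Texture2D texture;

    void Start()
    {
        // Subscribe to the camera topic published by the Raspberry Pi
        ROSConnection.GetOrCreateInstance().Subscribe<ImageMsg>("/raw_image", OnImage);
    }

    void OnImage(ImageMsg msg)
    {
        // Assumes the incoming encoding is "rgb8"; other encodings would need conversion.
        // (The feed may also appear vertically flipped, which can be fixed by flipping
        // the RawImage's Y scale.)
        if (texture == null || texture.width != (int)msg.width || texture.height != (int)msg.height)
        {
            texture = new Texture2D((int)msg.width, (int)msg.height, TextureFormat.RGB24, false);
            targetImage.texture = texture;
        }

        texture.LoadRawTextureData(msg.data);
        texture.Apply();
    }
}
```

The RawImage reference is just dragged in from the Inspector, and the script sits on any GameObject in the scene.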

Hi, I won’t be able to give you a full answer as my experience with VR/AR is limited and I have no experience with HoloLens. The only things I can think of that might help you right now are these:

  1. If you’re already able to stream the messages to a UI → Raw Image object that’s inside a canvas, make sure to change the Render Mode of that canvas to World Space (or create a new canvas for this); see the sketch after this list.
    (screenshot: the Canvas component’s Render Mode dropdown set to World Space in the Inspector)
    If it’s in screen space, it is (as far as I know) ignored by the HMDs. If it’s in world space, you can place the canvas anywhere in your scene and you’ll see it in VR/AR. You’ll be able to do things like make the canvas always rotate towards the user, or even attach it to your virtual hand. You can use it the same way as any other canvas and display any UI.

  2. If you’re still struggling, it might help to post in the AR/VR (XR) Discussion forum, as people there should be able to help with this more.
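
As a rough illustration of point 1, the canvas can also be switched to world space and kept facing the user from code. This is just a generic Unity sketch (I haven't tested it on a HoloLens, and MRTK may already ship a billboard/solver component that does this better):

```csharp
using UnityEngine;

// Switches the canvas to World Space and keeps it turned towards the main camera
// (the user's head in VR/AR). Attach to the Canvas GameObject.
public class FaceUserCanvas : MonoBehaviour
{
    void Start()
    {
        var canvas = GetComponent<Canvas>();
        if (canvas != null)
        {
            // Same as picking "World Space" in the Inspector dropdown
            canvas.renderMode = RenderMode.WorldSpace;
        }
    }

    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam == null) return;

        // Rotate so the canvas front stays pointed at the camera
        transform.rotation = Quaternion.LookRotation(transform.position - cam.transform.position);
    }
}
```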

Good luck!

Thanks again for taking the time to help out, @Envilon!!!