5.5.0b5 Holographic Remoting from Build

I installed 5.5.0b5 for HoloLens and have recently been able to get Holographic Remoting working with my HoloLens.

However, this does not work in a build. I'm guessing this is because we need to tell the app to connect via code? I haven't been able to find anything about this in the documentation so far.

Has anyone managed to get this to work in a build?

I am also interested to know whether there are any limitations on the number of clients that can be streamed to from a single app (1-to-1 or 1-to-many), and whether you can still access things like the HoloLens camera while remoting.

Thanks!


Hello,

Can you clarify what you are trying to do here?
"However this does not work in a build. I'm guessing this is because we need to tell the app to connect via code?"

Holographic Simulation and Remoting only work in the editor; I don't think they are intended for use in a holographic app.

Right now you can only connect the editor to 1 device at a time.

Thank you,
Wesley

Thanks for the reply. Hopefully I can clarify our problem.

At the moment we are running the visuals entirely on the HoloLens, with a server doing the bulk of the processing. The new beta introduced the idea of streaming to the HoloLens, which massively expands the capabilities. In our case we are processing the output from a music synth, adding hand controls via a camera system, and then networking everything to the HoloLens.

The idea of processing everything on a PC and streaming the video is huge, and means we can add many more features and incorporate more hardware. We were hoping that, since it is possible to connect to the HoloLens from the editor, the same feature would also be made available in the Unity API for use at runtime.

Thanks!

PS: It won't let me post from the other account.


That is a really amazing way to use the HoloLens, and the project looks like an awesome experience. Right now I'm not sure how to answer your question in its entirety; I think the feature is only exposed through the editor UI.

I will go ask the engineers to see if they can shed some more light on this.

I first want to reiterate what Wesley said – that is an amazingly cool use of the technology!

Remoting is currently a 1:1 feature in the Editor designed to aid in development. The idea of creating a standalone app that can connect to a remote device is interesting, and certainly something we are considering. I’m not sure how this would work in a 1:many scenario, however, since multiple devices would present conflicting spatial data. You could, of course, simply relay head tracking information from multiple device clients to a server right now (once you established a shared world anchor), but I guess you want more than that, such as gesture information. Still, I think some of this could be accomplished via sharing data over a net connection. What this wouldn’t give you is the ability to stream video and audio from the host, which is what I suspect you want.


Thanks Wesley and pfreese for the reply.

The lack of audio would not affect us too much, as at the moment the user listens to the synth via speakers and only hears interaction audio prompts when touching AR menus.

I think you're right that having a single application stream video to multiple HoloLens HMDs would be very difficult; I was just curious about the possibility so that we could remove networking from our current system. Currently we network all the data from a server PC to each HoloLens, so we can have multiple HoloLenses connected and sharing an environment. We would probably expand on our current system and have each network client run on a PC, which would then use remoting to stream the video to its HoloLens. This would give us a performance boost and scope for adding more features and visual effects, and we would still be able to handle the audio.

We would definitely be interested in trying this feature if/when it becomes available in builds.


@DeanStanfield @bajeo88 Yes, this is a possibility that Microsoft told us about. There is some material available, but information is very scarce and I haven't been able to achieve this yet.

Here is what I know:

- On NuGet there is a Holographic Remoting package with native DLLs.
- Microsoft has a small article on how to use those DLLs in a barebones C++/CX and D3D-based app, which lets you render a small rotating cube on your Windows 10 PC and remote it to the HoloLens. This is exactly the technology that is integrated into the Unity editor (as Unity itself is written in C++).
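For reference, the core of that Microsoft article boils down to a few calls on `HolographicStreamerHelpers`. Below is a hedged C++/CX sketch of the connection flow as I understand it from the article; the surrounding app and device-resource plumbing is omitted, the member and handler names are illustrative, and it only builds in a UWP C++/CX project with the Holographic Remoting NuGet package installed:

```cpp
#include <HolographicStreamerHelpers.h>  // from the Holographic Remoting NuGet package

using namespace Microsoft::Holographic;

// Illustrative member on the app's main class:
HolographicStreamerHelpers^ m_streamerHelpers;

void ConnectToDevice(ID3D11Device* d3dDevice, Platform::String^ ipAddress)
{
    m_streamerHelpers = ref new HolographicStreamerHelpers();
    m_streamerHelpers->CreateStreamer(d3dDevice);

    // Tune the stream before connecting.
    m_streamerHelpers->SetVideoFrameSize(1280, 720);
    m_streamerHelpers->SetMaxBitrate(99999);

    m_streamerHelpers->OnConnected += ref new ConnectedEvent(
        []() { /* connected: start rendering into the remote HolographicSpace */ });
    m_streamerHelpers->OnDisconnected += ref new DisconnectedEvent(
        [](HolographicStreamerConnectionFailureReason reason) { /* handle drop / retry */ });

    // The Holographic Remoting Player on the HoloLens listens on port 8001.
    m_streamerHelpers->Connect(ipAddress->Data(), 8001);
}
```

Once connected, the app is expected to render into the HolographicSpace exposed by the helpers rather than a locally created one, which is precisely where the ownership conflict with Unity's generated projects comes from.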

Now, the issue for us is that Unity generates a C#/XAML or C#/D3D project for HoloLens, and that NuGet package doesn't work with a C# project. I also tried an IL2CPP build, which gave me a C++/CX UWP D3D project from Unity. However, for Unity to run, it generates specific code governing how it structures the app and controls the resources. This directly conflicts with how the Microsoft article uses the native DLLs, as that system also needs a degree of control over the resources.

I have been struggling to make it work, but I am not sure it's possible at this point.

If, however, you are not using Unity and are creating a C++/CX project from scratch with Direct3D, then you are in luck, since you can use those native DLLs for remoting.


You can definitely use native DLLs with Unity, and you can even access rendering resources from them, such as textures and render textures of the D3D11 device. What exactly does that package need access to? Here's an example of how to access the graphics resources I mentioned from a C++ plugin:

https://bitbucket.org/Unity-Technologies/graphicsdemos/src/548c5251ddbe82129b2584992a0f50caa4c34c6c/NativeRenderingPlugin/?at=default
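To make the linked demo concrete, here is a minimal sketch of how a native plugin gets hold of Unity's D3D11 device. It assumes the plugin API headers that ship with Unity and with that demo (`IUnityInterface.h`, `IUnityGraphics.h`, `IUnityGraphicsD3D11.h`), so it only compiles as a Windows plugin DLL against those headers; the static variable names are illustrative:

```cpp
#include "IUnityInterface.h"
#include "IUnityGraphics.h"
#include "IUnityGraphicsD3D11.h"
#include <d3d11.h>

static IUnityInterfaces* s_UnityInterfaces = nullptr;
static ID3D11Device*     s_Device = nullptr;

// Unity calls this automatically when the plugin DLL is loaded.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    s_UnityInterfaces = unityInterfaces;

    // Grab the D3D11 device Unity is rendering with (null on non-D3D11 backends).
    IUnityGraphicsD3D11* d3d = unityInterfaces->Get<IUnityGraphicsD3D11>();
    if (d3d != nullptr)
        s_Device = d3d->GetDevice();
}

// Unity calls this before the plugin DLL is unloaded.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginUnload()
{
    s_Device = nullptr;
    s_UnityInterfaces = nullptr;
}
```

In principle, a device pointer obtained this way could be handed to native remoting code; whether the remoting DLLs tolerate a device they did not create themselves is exactly the open question in this thread.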


@bajeo88 I was blown away when I saw that video last week. Very cool stuff!

There are some issues with remoting. But we were able to create a companion app running on a desktop and a HoloLens together using xTool (the Sharing service in the HoloToolkit), though there is some work involved in getting cross-platform input supported properly.

In the end it was worth doing the extra work for a companion app, but I can definitely see why you'd like to tap into the extra horsepower of a PC.

Hi, this thread is quite old, but I'm currently trying to do the same. Is the Sharing service in the HoloToolkit the way to get it working?

@mrwellmann Hi,

It is possible to use Holographic Remoting; however, it is not through Unity.

Here is the link

Thanks for the reply, @Unity_Wesley. I found that info too, but was curious whether anyone has succeeded in adding remoting support to Unity-built applications, maybe with an IL2CPP build, the Sharing service, or the solution Tautvydas-Zilys suggested.

I would totally prefer Unity over pure c++ :smiley:

@mrwellmann Despite the age of this thread, this project is still being worked on. I believe this requested feature is currently in development at Unity and may be available in a build later this year.

I was trying to run Holographic Remoting alongside another application doing image analysis on the HoloLens, but as I understand it, that is not possible. I really need to keep the Unity content on a desktop synced with the content on the HoloLens.

Does anybody have thoughts on the best way to do that? Could I use Add holographic remoting - Mixed Reality | Microsoft Learn to implement my own holographic remoting in Unity?

Thanks for the help!

Hello. We are also very interested in using Holographic Remoting with a Unity application. We need to use PC computing power for the Unity application and stream the holographic content to the HoloLens. However, we also need to preserve the native functions of the HoloLens, such as spatial mapping, speech, and sound. Has anyone made progress on this problem, or does anyone have an idea how to solve it?
Thank you.