Windows Mixed Reality Headset - Information?

Hi Uniteers,

I am doing research on Windows Mixed Reality headsets (WMRH) as well as the integration process into Unity, if there is anything available yet. I am trying to identify all the aspects of these devices that can affect the development process. I already have experience developing for the HoloLens, so I believe I know some of the process already.

If you know of any other factors that can affect development one way or another, I would love to hear it.


I’m honestly curious to have all of these questions answered as well.

I am looking for answers related to WMRH as well!

I have been trying to get my minimal Unity app working on an Acer Mixed Reality HMD (https://www.microsoft.com/en-us/store/d/acer-windows-mixed-reality-headset-developer-edition/8pb4twx13m2n) for a while now, but I cannot run my app on the device. I have been able to build the app in Unity, then deploy the resulting solution (which builds a UWP app, which is then available in my Win 10 start menu). Running the app from there just runs it on the desktop. Trying to run it from the Mixed Reality Portal does not really run the app - it just stays white after placing the “app board” in the virtual world (just like apps are placed on HoloLens in the real world).

Does anyone know if there are any tutorials for these new Mixed Reality devices? The page linked above just redirects to the Microsoft Dev Center main page, and all the Mixed Reality tutorials available there seem to be for HoloLens, which apply only partly to devices like the Acer Mixed Reality HMD.

I’m in the same boat. I received the HP version of the Mixed Reality Headset two days ago but I don’t really know where to start. I’ve been doing some SteamVR (with VRTK) development in Unity so I was hoping for a similar kind of asset and examples to get started.

I’ve even searched the Microsoft site for any tutorials related to this, but so far I have not seen any. It’s rather disappointing how little support Microsoft is giving this product (even if it’s a dev edition). Someone should be doing a better job at product management - especially since the Vive and Rift have so much market penetration already!

They really need some good examples to capture hearts and minds.

You can start here: Install the tools - Mixed Reality | Microsoft Learn and follow the instructions.

To summarize, you’ll need:

  • Windows 10 Insider preview build
  • Unity 2017.2 Beta - or Unity MRTP (MR Technical Preview) if you have access to it, with Windows Store .NET Scripting Backend
  • Windows 10 Creator Update SDK, Visual Studio

There is an official forum and Slack too.
The docs and scattered info are quite a mess at the moment, in my opinion - hang on :).
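If you prefer to script the project setup, here is a hypothetical editor helper (all names illustrative, assuming Unity 2017.2’s UnityEditor API) that mirrors the manual steps - switching to the UWP build target and ticking “Virtual Reality Supported”:

```csharp
// Hypothetical editor helper - all names illustrative, assuming Unity
// 2017.2's UnityEditor API. Mirrors the manual setup: switch to the UWP
// build target and tick "Virtual Reality Supported" in Player Settings.
using UnityEditor;

public static class MixedRealitySetup
{
    [MenuItem("Tools/Enable VR for UWP")]
    public static void EnableVrForUwp()
    {
        // Switch the active build target to Windows Store (UWP).
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.WSA, BuildTarget.WSAPlayer);

        // Equivalent to ticking "Virtual Reality Supported".
        PlayerSettings.virtualRealitySupported = true;
    }
}
```

You still need the Windows Store .NET Scripting Backend selected in Player Settings, as listed above.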


Thanks Gruguir! I’ll give that a try tonight.

I received my Acer headset and created my first VR game.
The quality is amazing - better than my HTC Vive, Oculus, and Gear VR + S8.
The only problem I have is that sometimes when I rotate my head, the camera doesn’t move correctly.

I eventually got it to work on Windows Insider build 16257.1.

Does anyone know of an example where it uses the two front cameras on the headset?

Also, how does it actually track the location of the headset? I initially thought it used the glow of the screen as a reference point, but since I can turn around, not face the screen at all, and still be tracked, that doesn’t seem to be the case. Does it take pictures of features inside the room and use object recognition to figure out orientation? Is there a digital compass or accelerometer?

It uses object recognition, more or less, with edge detection, but it also uses a compass, accelerometer, and gyroscope.

Yes, to paraphrase @zugsoft:

It has SLAM mapping, which generates 3D point cloud data from corners/edges seen in the stereo pair; this is done in dedicated hardware, I think. The SLAM is enough on its own to give a solid position and rotation approximation, although with relatively high latency. I’ve heard you should put posters on large blank walls: give the SLAM something to look at.

This latency is reduced way down by fusing in high-performance accelerometer/gyro signals.

So you get the best of both worlds: solid worldspace tracking with low latency. Plus the system is aware of real-world colliders, so you can skin the real world.
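To make the fusion idea concrete, here is a purely conceptual sketch - NOT the actual WMR tracker, and every name in it is illustrative - of a simple complementary filter combining a fast, drifting gyro signal with slow but absolute SLAM corrections:

```csharp
// Purely conceptual sketch of complementary-filter fusion - NOT the actual
// WMR tracker; all names here are illustrative. The high-rate gyro delta
// gives low latency; the slow but drift-free SLAM pose pulls drift out.
using UnityEngine;

public class OrientationFusion : MonoBehaviour
{
    [Range(0f, 1f)]
    public float slamWeight = 0.02f; // how strongly SLAM corrects per update

    private Quaternion fused = Quaternion.identity;

    // Call once per frame with the latest estimate from each source.
    public Quaternion Fuse(Quaternion gyroDelta, Quaternion slamOrientation)
    {
        // Integrate the high-rate gyro signal for low latency...
        fused = fused * gyroDelta;

        // ...then nudge the result toward the drift-free SLAM estimate.
        fused = Quaternion.Slerp(fused, slamOrientation, slamWeight);
        return fused;
    }
}
```

The small `slamWeight` is what keeps the correction gentle: the gyro dominates frame to frame, while the SLAM pose slowly wins over time.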

So who’s gonna be the first to turn any living room into a fully skinned space station etc…?

You just need to write this function:

private void SkinCurrentEnvironment(string[] asTextureTilesToUse)
{
    // SOME CODE HERE
}
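More seriously, here is a rough sketch of how that skinning could start, assuming Unity 2017.2’s built-in SpatialMappingRenderer component for Windows MR (the material itself is an assumption you would supply):

```csharp
// A rough sketch, not a finished implementation: Unity 2017.2 ships a
// SpatialMappingRenderer component for Windows MR that can draw the
// scanned room mesh with a material of your choice. The material field
// is a hypothetical placeholder.
using UnityEngine;
using UnityEngine.XR.WSA;

public class RoomSkinner : MonoBehaviour
{
    public Material spaceStationMaterial; // hypothetical sci-fi tile material

    void Start()
    {
        // Render the spatial-mapping mesh with our custom material.
        var mappingRenderer = gameObject.AddComponent<SpatialMappingRenderer>();
        mappingRenderer.renderState =
            SpatialMappingRenderer.RenderState.Visualization;
        mappingRenderer.visualMaterial = spaceStationMaterial;
    }
}
```

Generating proper UVs so the tiles line up with walls and furniture is the hard part - that is the real body of `SkinCurrentEnvironment`.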

Here is a very easy tutorial for mixed reality with Unity 5.6.x and Vuforia.


I have faced tracking issues a lot (the boundary getting lost) when debugging. Any solutions for this? Also, there is the Win+Y key to switch from headset to desktop and vice versa; it becomes so annoying. Any fixes for this?

I’m trying to create a seated experience using the HP headset, but XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary) returns false.
Does anyone know if the headset supports only room-scale experiences?
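For reference, a minimal sketch of the setup pattern involved, assuming Unity 2017.2’s UnityEngine.XR API, with a fallback for when the stationary request is refused:

```csharp
// Minimal sketch, assuming Unity 2017.2's UnityEngine.XR API: request a
// stationary (seated) tracking origin and fall back to room scale if the
// runtime refuses the request.
using UnityEngine;
using UnityEngine.XR;

public class SeatedSetup : MonoBehaviour
{
    void Start()
    {
        if (XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary))
        {
            // Make the current head pose the new origin.
            InputTracking.Recenter();
        }
        else
        {
            // The runtime refused - possibly it was set up for
            // room scale only during the Mixed Reality Portal setup.
            Debug.LogWarning("Stationary tracking space unavailable; " +
                             "falling back to RoomScale.");
            XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale);
        }
    }
}
```

If the stationary request always fails, it may be worth re-running the Mixed Reality Portal setup and choosing the seated (“desk”) option there.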