I am doing research on Windows Mixed Reality Headset (WMRH) devices, as well as the Unity integration process if anything is available yet. I am trying to identify all the aspects of these devices that can affect the development process. I already have experience developing for the HoloLens, so I believe I know some of the process already.
Will the build process be similar to HoloLens? Create/update the Visual Studio project and deploy to the device from there?
The WMRH platform allows hardware manufacturers to roll out their own headsets and controllers, currently counting Acer and HP. From a developer's point of view, will there be any technical differences we should take into account? Judging from their corresponding product pages, their specifications look nearly identical.
Because a WMRH is technically just an ordinary VR headset but with inside-out tracking, similar to the HoloLens, is the idea still to use “WorldAnchors” in the project to anchor objects?
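For reference, this is roughly how WorldAnchors are used on HoloLens today; a minimal sketch assuming Unity 2017.2's `UnityEngine.XR.WSA` namespace (earlier versions exposed the same class under `UnityEngine.VR.WSA`):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA; // Windows holographic (WSA) APIs, Unity 2017.2

// Attach to any object that should stay locked to a real-world position.
public class AnchorOnStart : MonoBehaviour
{
    void Start()
    {
        // Adding a WorldAnchor locks this object's transform to the
        // device's understanding of the physical space.
        gameObject.AddComponent<WorldAnchor>();
    }

    // An anchored transform can't be moved directly: remove the anchor,
    // move the object, then re-anchor it.
    public void MoveTo(Vector3 position)
    {
        var anchor = GetComponent<WorldAnchor>();
        if (anchor != null)
            DestroyImmediate(anchor);

        transform.position = position;
        gameObject.AddComponent<WorldAnchor>();
    }
}
```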
If you know of any other factors that can affect development one way or another, I would love to hear about them.
I have been trying to get my minimal Unity app working on an Acer Mixed Reality HMD (https://www.microsoft.com/en-us/store/d/acer-windows-mixed-reality-headset-developer-edition/8pb4twx13m2n) for a while now, but I cannot run my app on the device. I have been able to build the app in Unity, then deploy the resulting solution (which builds a UWP app that is then available in my Win 10 start menu). Running the app from there just runs it on the desktop. Trying to run it from the Mixed Reality Portal does not really run the app - it just stays white after placing the “app board” in the virtual world (just as apps are placed in the real world on HoloLens).
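One thing that may be worth checking is whether the app actually starts in immersive mode at all; if Unity's Player Settings don't have VR support enabled with the Windows Mixed Reality SDK selected, the build runs as a flat UWP app, which would match the desktop-only behavior. A quick diagnostic sketch, assuming Unity 2017.2's `UnityEngine.XR` API:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs whether the app actually started in immersive (holographic) mode.
// If no XR device is reported, the build is running as a flat UWP app.
public class VrModeCheck : MonoBehaviour
{
    void Start()
    {
        Debug.LogFormat("XR enabled: {0}, loaded device: '{1}'",
            XRSettings.enabled, XRSettings.loadedDeviceName);
    }
}
```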
Does anyone know if there are any tutorials for these new Mixed Reality devices? The page linked above just directs to the Microsoft Dev Center main page, and all the Mixed Reality tutorials available there seem to be for HoloLens, which only partly apply to devices like the Acer Mixed Reality HMD.
I’m in the same boat. I received the HP version of the Mixed Reality Headset two days ago but I don’t really know where to start. I’ve been doing some SteamVR (with VRTK) development in Unity so I was hoping for a similar kind of asset and examples to get started.
I’ve even searched the Microsoft site for any tutorials related to this but so far have not seen any. It’s rather disappointing how little support Microsoft is giving this product (even if it’s a dev edition). Someone should be doing a better job at product management, especially since the Vive and Rift have so much market penetration already!
They really need some good examples to capture hearts and minds.
I received my Acer headset, and created my first VR game.
The quality is amazing, better than my HTC Vive, Oculus, and Gear VR + S8.
The only problem I have is that sometimes, when I rotate my head, the camera doesn’t move correctly.
I eventually got it to work on Windows Insider build 16257.1.
Does anyone know of an example that uses the two front cameras on the headset?
Also, how does it actually track the location of the headset? I initially thought it used the glow of the screen as a reference point, but since I can turn around, not face the screen at all, and still be tracked, that doesn’t seem to be the case. Does it take pictures of the features inside the room and use object recognition to figure out orientation? Is there a digital compass or accelerometer?
It has SLAM mapping, which generates 3D point-cloud data from corners/edges seen in the stereo pair; I think this is done in dedicated hardware. The SLAM is enough on its own to give a solid position and rotation approximation, although with relatively high latency. I’ve heard you should put posters on large blank walls: give SLAM something to look at.
This latency is reduced way down by fusing in high-rate accelerometer/gyro signals.
So you get the best of both worlds: solid world-space tracking with low latency. Plus the system is aware of real-world colliders, so you can skin the real world.
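The fusion idea can be illustrated with a toy complementary filter: integrate the IMU for a fast but drifting estimate, and let each (slower) SLAM pose pull it back toward the drift-free value. This is just a sketch of the concept, not what the headset firmware actually runs:

```csharp
using UnityEngine;

// Toy illustration of the fusion described above: a fast but drifting IMU
// estimate is corrected by slower, drift-free SLAM poses (a simple
// complementary filter).
public class PoseFusion
{
    // How strongly each SLAM update pulls the estimate back (0..1).
    const float SlamCorrection = 0.1f;

    Vector3 velocity = Vector3.zero;
    public Vector3 Position { get; private set; }

    // Called at IMU rate (hundreds of Hz): integrate acceleration for a
    // low-latency but drifting position estimate.
    public void OnImuSample(Vector3 worldAcceleration, float dt)
    {
        velocity += worldAcceleration * dt;
        Position += velocity * dt;
    }

    // Called at SLAM rate (tens of Hz): blend toward the drift-free pose.
    public void OnSlamPose(Vector3 slamPosition)
    {
        Position = Vector3.Lerp(Position, slamPosition, SlamCorrection);
    }
}
```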
So who’s gonna be the first to turn any living room into a fully skinned space station etc…?
I have faced tracking issues a lot (the boundary getting lost) when debugging. Any solutions to this? And there is this Win+Y key to switch from headset to desktop and vice versa; it becomes so annoying. Any fixes for this?
I’m trying to create a seated experience using the HP headset, but XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary) returns false.
Does anyone know if the headset supports only room-scale experiences?
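For what it’s worth, here is a minimal sketch (assuming Unity 2017.2's `UnityEngine.XR` API) of requesting a stationary tracking space and falling back gracefully when the runtime refuses the request:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Requests a seated/stationary tracking space; if the runtime refuses,
// stays in room scale and logs a warning instead of failing silently.
public class SeatedSetup : MonoBehaviour
{
    void Start()
    {
        if (!XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary))
        {
            Debug.LogWarning("Stationary tracking space refused; staying in room scale.");
        }

        // Recenter so the current head pose becomes the origin; mostly
        // meaningful in a stationary space.
        InputTracking.Recenter();
    }
}
```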