The Oculus Rift experience - Tell us what you think!

Hey there Oculus Rift users!

If you happen to be an Oculus Rift user, we would like to hear your thoughts about its current state! What do you think is great about it? Where do you think it could use optimization or enhancements? We would love for you to share your thoughts. Please understand that this is not a rant thread, but rather a thread to raise awareness of current issues and limitations. So if you feel like something needs fixing, let us know!

Unity Tech. officials will be keeping an eye on this thread so be precise and polite!

Thank you!

Thomas Pasieka

To start off, I will bring up the “Skybox” issue here. Currently, the standard skybox won’t work right out of the box in Unity: you will see “double”, and it will make you a bit sick after a while. Over on the Oculus forum, many people have created hacks and workarounds, but I don’t think those are the perfect solution either. The member “Cybereality” on the Oculus forums, who seems to be the official spokesperson, said this about the skybox issue:

“The skybox thing is a bug with Unity as far as I can tell. Basically their built-in skybox does not take a custom projection matrix into account and just draws directly into the viewport.”

While it is easily possible to create your own Skybox, I’d rather see a fix that would allow for proper use of Unity’s skybox implementation.
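To make the quoted explanation concrete, here is a rough Python sketch (not Unity code; the offset numbers are made up, the real values come from the SDK) of why a skybox drawn straight into the viewport breaks in stereo: each eye gets its own off-centre projection matrix, and anything that ignores those matrices renders identically for both eyes.

```python
import math

def perspective(fov_y_deg, aspect, near, far, h_offset=0.0):
    """Build a 4x4 perspective matrix (row-major here for clarity).

    h_offset shifts the projection centre horizontally, which is
    roughly what the Rift does per eye to match the lens centres.
    """
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    m = [[0.0] * 4 for _ in range(4)]
    m[0][0] = f / aspect
    m[1][1] = f
    m[0][2] = h_offset                        # off-centre term: differs per eye
    m[2][2] = (far + near) / (near - far)
    m[2][3] = (2.0 * far * near) / (near - far)
    m[3][2] = -1.0
    return m

# Hypothetical per-eye offsets; real values come from the HMD description.
left_eye  = perspective(110, 0.8, 0.1, 1000, h_offset=+0.15)
right_eye = perspective(110, 0.8, 0.1, 1000, h_offset=-0.15)

# A skybox drawn directly into the viewport ignores these matrices,
# so both eyes get the same image and stereo fusion breaks ("double").
```

The two matrices are identical except for that one off-centre term, which is exactly the part the built-in skybox apparently skips.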

It needs a longer cable… we use it with a Kinect and need more room to move around. Wireless would be more than epic.

I think he specifically means the Unity integration.

My main gripe here is the doubling of all draw calls. I’m not strictly sure it’s necessary to repeat all the work from the first eye for the second; perhaps there are some optimisations to be had here, and as we know, speed is a problem with the Rift: go slower than 60 fps and it looks rough.

Some things may be possible to optimise, such as not performing sorting and culling again and skipping various redundant operations, since the scene is essentially going to look almost the same both times. The CPU saving here is probably more important for bigger Rift games. Perhaps even combine mesh draw calls: draw all meshes twice, once offset, in the same go, halving draw calls? That would be the biggest win, but it may be quite tricky.
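The reason sharing that work between eyes is plausible: the two cameras are only a few centimetres apart. A tiny Python sketch (IPD value is a typical assumption, not from the SDK) of how both eye positions derive from one head pose:

```python
IPD = 0.064  # interpupillary distance in metres (typical value; an assumption)

def eye_position(head_pos, right_axis, sign):
    """Offset the head position by half the IPD along the head's right axis."""
    half = IPD / 2.0
    return tuple(h + sign * half * r for h, r in zip(head_pos, right_axis))

head  = (0.0, 1.7, 0.0)   # hypothetical head position
right = (1.0, 0.0, 0.0)   # head's local right axis

left_eye  = eye_position(head, right, -1)
right_eye = eye_position(head, right, +1)

# The eyes end up only ~6.4 cm apart, so sort orders and culling
# results are nearly identical between them; in principle that CPU
# work could be done once and reused for the second eye.
```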

Given that Epic is taking it so seriously and helping out, I think it would be good for the world’s most accessible game development platform (Unity) to lend a helping hand here too. The Rift is being noticed because Carmack now works there, and the industry never ignores Carmack.

There need to be some multi-camera or Rift-specific optimizations on the back end to help alleviate the double rendering hit. Heavy post effects are especially rough… Lots of assets rely on a post effect to process things, and with the Rift it doubles the load, making it hard to optimize without major quality impacts.

Optimization is suddenly really important and very challenging when working with the Rift. Not only do you need to maintain 60+ fps to avoid visual lag, you have to do it with double the rendering load.

Skyboxes not working and other little things here and there are trivial compared to the performance problems.

The problem is that to get the best quality you will need to do most of the rendering twice, because the two cameras see the world from different angles, so things like culling and post effects need to be done twice to get a perfect picture.
The skybox issue should be pretty simple to fix, I guess, if they just gave it some priority.

Other than that I have not seen too many issues; shadows seem to work fine as far as I have seen, both in deferred and forward mode, but I have not done any extensive testing on that matter.

One thing I think they should add, though, is binaural audio. It might not be directly related, but it would add a lot to the immersion and also give them an extra edge over the competition.

One of the more jarring issues is when there are things that are only enabled in one eye, or that get culled prematurely based on the view frustum:

  • Terrain trees
  • Point lights
  • Shadows

These are things that can fall out of the view frustum of one eye while still being visible in the other eye.
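A minimal top-down Python sketch of the problem (FOV and eye positions are assumptions, and it's 2D only): an object can sit inside one eye's frustum but outside the other's, so culling against a single eye's frustum pops it out of view for the other eye.

```python
import math

def in_fov(eye_x, obj, half_fov_deg):
    """2D top-down test: is obj inside a symmetric FOV wedge
    centred on an eye at (eye_x, 0) looking down +z?"""
    dx, dz = obj[0] - eye_x, obj[1]
    if dz <= 0:
        return False
    return abs(math.degrees(math.atan2(dx, dz))) <= half_fov_deg

HALF_FOV = 45.0                # assumed per-eye half FOV
LEFT, RIGHT = -0.032, 0.032    # eye x positions, half an IPD apart

obj = (-1.0, 1.0)              # object off to the left side

left_sees  = in_fov(LEFT,  obj, HALF_FOV)   # left eye still sees it
right_sees = in_fov(RIGHT, obj, HALF_FOV)   # right eye does not

# Culling against only one eye's frustum would wrongly discard this
# object; the safe approach is to cull against the union of both
# frusta (or a single widened frustum covering both eyes).
cull = not (left_sees or right_sees)
```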

I have only used my Oculus Rift briefly with Unity3D, but so far it has been a pretty good experience. My biggest gripe is with testing, since it’s very annoying having to lower my screen resolution to fit the Rift so I can try it in the editor, and even when I do that it cannot take up the full screen. I don’t know if there are any fixes out there for that, but it’s very annoying that I can’t seem to make it go full screen on the Rift when testing in the editor.

  • It would be nice if it worked without Pro
  • GUI stuff doesn’t really work with it…

The full screen game view borders should definitely go away if possible. Generally I make a build to test anything precise and deal with the wonkiness if I’m testing something quick in the editor.

I would agree on that aspect. As it stands, I have to cut down on a lot of features. I don’t even use any post effects due to the mentioned issue (on a side note, I find the integrated post FX to be slow in general). Shadows are also a big performance hog in my tests. There has got to be a way to reduce the double rendering of almost everything when using the Rift (or at least I hope that UT is working on some sort of fix/better integration).

Will this eventually be enabled in Unity Free? Carmack has said that he doesn’t have time to get up to date on the Unity code base, but I guess it’s pretty limiting not to have it in Unity Free. I suspect that sometime soon Unity Free will get plugin support.

I know there are many on the Unity team who want the Rift to work with Free, so I guess we just have to wait and see : )

OVR = the namespace they use; I’m guessing it stands for Oculus VR.

Take a look at how UI is implemented in the OVR Unity integration kit… it’s hell.

So here’s my big idea:

  • Make Unity’s new GUI… VR Friendly!

With GUI, we’re finding the best solution to be rendering the GUI to a plane (flat or curved; curved looks best) right in front of the Rift cameras and adding a custom mouse cursor. Someone has made a VRGUI system, but I haven’t looked into it. If you want GUI in screen space, you have to operate in a very small portion of the screen and mirror everything between the eyes perfectly. That’s why the world-space solution seems better: you can avoid all that.
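For the curved-plane idea, here's a rough Python sketch (geometry only, not Unity code; radius, arc and tessellation values are arbitrary assumptions) of generating a cylindrical GUI panel that wraps around the viewer:

```python
import math

def curved_panel(radius, arc_deg, height, cols, rows):
    """Generate vertices for a cylindrical section centred on a viewer
    at the origin: every vertex sits exactly `radius` away."""
    verts = []
    for r in range(rows + 1):
        y = (r / rows - 0.5) * height
        for c in range(cols + 1):
            ang = math.radians((c / cols - 0.5) * arc_deg)
            x = radius * math.sin(ang)
            z = radius * math.cos(ang)   # constant distance from the viewer
            verts.append((x, y, z))
    return verts

# A 60-degree panel one metre in front of the head (assumed values).
verts = curved_panel(radius=1.0, arc_deg=60, height=0.6, cols=8, rows=4)
```

Because every vertex is the same distance from the head, the panel edges don't recede like a flat plane's do, which is presumably why the curved version looks best in the Rift.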

You guys may want to cross-read between this thread here and the one I started on the Oculus forum:

Link to Oculus Rift Forum: https://developer.oculusvr.com/forums/viewtopic.php?f=37&t=6338

Has this been submitted as a bug? If so does anyone know the bug number for this?

I don’t think so. It seems like everyone assumed it was a known issue; the workaround is even in the known issues section for the Tuscany OVR demo in the Rift developer docs.

Does someone need to file a bug report?

Yeah, if this can go into the Unity bug database, then it can be fixed. If it’s a known issue with a workaround, that’s great, but to get it fixed it really needs to be in the bug database. That’s where the developers work from.
Some info about reporting bugs here: http://answers.unity3d.com/questions/578888/how-can-i-report-a-bug-most-effectively.html

Dave,

At the risk of sounding presumptuous, is there / was there an active collaboration between Unity Tech. and Oculus VR?

If such an ‘obvious’ issue was simply ignored by the community, OVR Inc., and Unity Tech., it makes me wonder what Unity’s plans are for OVR and VR in general.

Should users expect, de facto, that OVR is going to be Unity-supported, or is Unity Tech. staying neutral and opening itself up to any VR platform?

Come to think of it, I can also see a VR package being sold as an extra license by Unity in the future.

But I digress…

The technical side of Unity and OVR integration is the main subject, but I think a larger conversation about official VR support in Unity, with clear engagement from Unity to support it, is required. Although there is still some risk in predicting VR’s success by the end of 2014, all indications point to a massive revolution (à la smartphones) once the first headsets hit the consumer market. Seeing that developers often plan their projects 6-12 months in advance, I think it’s fair to ask those questions now.