VRTK or Native Unity VR for a project starting development in 2019

Hi all,

I’ve been experimenting with VRTK and Unity’s native VR support via their LWRP VR Pipeline project template for a new VR project starting soon. Having both in the project doesn’t seem to cause any clashes, however…

  • VRTK has a ton of stuff already built in, like interactable objects (grab/throw), raycasting, and interaction with UI via a pointer.

  • Unity’s native VR may be better supported by Unity over time; however, I’m either going to be making most interactable and UI elements from scratch or finding a way to bridge Unity’s XR Rig with the VRTK scripts somehow.

What have other devs found the best starting point around Unity 2018.3~ releases for VR development?

I tried VRTK, but it didn’t meet my needs, so I went with the native stuff… and haven’t regretted it. Yes, you have to write (or ask someone to share) a bit of code for UI interaction, and your own grab/throw/locomotion code (which is likely to be somewhat unique to your game anyway). But these aren’t hard, and they’re a very minor part of the total development time.
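For anyone wondering what “write your own grab/throw” actually involves, here’s a minimal sketch in Unity C#. All names are hypothetical and it’s a simplification (re-parenting while held, velocity applied on release), not a drop-in implementation:

```csharp
using UnityEngine;

// Hypothetical minimal grab/throw: parent the object to the hand on grab,
// then apply the hand's recent velocity on release so the throw feels natural.
public class SimpleGrabber : MonoBehaviour
{
    Rigidbody held;
    Vector3 lastPos;
    Vector3 handVelocity;

    void Update()
    {
        // Track the hand's velocity each frame so a released object keeps momentum.
        handVelocity = (transform.position - lastPos) / Time.deltaTime;
        lastPos = transform.position;
    }

    public void Grab(Rigidbody target)
    {
        held = target;
        held.isKinematic = true;              // suspend physics while held
        held.transform.SetParent(transform);  // follow the hand
    }

    public void Release()
    {
        if (held == null) return;
        held.transform.SetParent(null);
        held.isKinematic = false;
        held.velocity = handVelocity;         // throw
        held = null;
    }
}
```

You’d call Grab/Release from whatever input code you have; a smoother feel usually needs velocity averaging over a few frames, but the core really is this small.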

1 Like

Probably makes sense for me to get a head start and begin following some tutorials on basic VR mechanics I can build myself with the native stuff. Heck, I might even be able to butcher in some of the VRTK scripts with a bit of investigation.

It’s a relief that the native stuff is good enough on its own, as I’d rather not be too dependent on other SDKs for a project I will be working on. Thanks Joe!

1 Like

Every time I look into native XR, all I see is some basic input mappings and device tracking. There doesn’t seem to be much info about native XR on Google either when you want to solve a problem.

SteamVR etc. do a ton of heavy lifting. Sadly, VRTK is moving to a new version, and it looks like they have a long way to go.

Guess it depends on your platform goals though.

1 Like

Going to go ahead with VRTK, I think. I tried MRTK, but it didn’t seem too user-friendly, and the presets were mostly useless for a generalist artist like me with only a bit of coding knowledge.

Personally, I would never use VRTK again. But then again, I am a software engineer.
However, I think what VRTK gives you in dev speed at the beginning of your project, it will cost you at the end. The complexity is really just moved from code into dozens of settings and a bunch of bloated components. And it still limits what you can do. Once you need something very specific (and you probably will), you are blocked by it. But again, this might just be my personal preference.

Have you considered SteamVR? The 2.0 version is not half as bad as the one before.

Anyway, best of luck with your project!

3 Likes

Opting to use SteamVR 2.2 for a demo of a scene I am building now, as VRTK is undergoing heavy development at the moment for the Oculus-funded v4, and documentation is pretty light for that so far.

Further down the line I will have to decide on the SDK for the app’s chosen platform, and I wish there were something as comprehensive as SteamVR for cross-platform work (I can only dream!).

If not, the native Unity XR Rig seems to work with every headset I have tried so far, so as JoeStrout mentioned, I could just work on input mapping for that and basic locomotion (which I’m writing myself anyway) and go from there.

2 Likes

I’ve been researching this very topic quite a bit recently. :slight_smile:

It really depends on what functionality you need. Do you want one package that does everything, or do you want to assemble your own custom solution out of separate pieces?

If you’re looking for an all-in-one solution, there are only two choices that I’ve found.

VRTK is a “swiss army knife” that does everything. I’ve used VRTK version 3 on a project, and actually wrote a book chapter on creating multi-user social environments built with VRTK and PUN.

However, I agree with @OleJuergensen that VRTK 3 is fine if you do things their way, but if you ever need to do things differently then it will fight you every step of the way. I ended up spending several days ripping VRTK out of my project and writing my own. Much happier with the result.

You should also know (and I’m sure you do already) that VRTK 3 was abandoned by the developer, and since it doesn’t support the current version of SteamVR it’s really a dead end. Oculus has paid that developer to spend six months developing VRTK 4, but it’s still missing a lot of features and has very limited documentation. There’s also the question of what happens when the money runs out.

The SteamVR Interaction Toolkit is awesome. It’s well-designed, and appears to be inspired by NewtonVR. I’ve used it on one small project, and was very happy. It’s also being supported by Valve, so it won’t be going away anytime soon. However, the downside is that it’s tightly integrated with SteamVR 2. If you’re developing for another platform (e.g. Quest) it’s simply not an option. The amount of work you’d have to do to generalize it is just not worth it.

The trouble with these all-in-one solutions is that they’re hard to customize for your game or port to new platforms. So, if you’re looking to assemble something yourself, what parts do you need? Here’s a list…

Input Abstraction. You want to be able to move your code from platform to platform without having to completely rewrite everything. That’s the idea behind the Unity XR input system, which is pretty good. However, it doesn’t seem to play nice with SteamVR. If you make even one call to SteamVR 2, the Unity Input system stops recognizing your device inputs. Not sure if that’s Unity’s fault or Valve’s, but it’s definitely annoying.
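To illustrate what that input abstraction looks like in practice, here’s a small sketch using the Unity XR input API as it existed around 2019.1 (InputDevices / CommonUsages); exact names may differ slightly between Unity versions:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Reading device-agnostic input through Unity's XR input system:
// the same code works for any headset that maps these common usages.
public class XRInputExample : MonoBehaviour
{
    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // Trigger as a boolean press.
        if (rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
            Debug.Log("Right trigger pressed");

        // Thumbstick / touchpad as a 2D axis.
        if (rightHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            Debug.Log($"Thumbstick: {stick}");
    }
}
```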

Object Manipulation. The code for grabbing and throwing virtual objects is just complicated enough that you don’t want to have to implement it yourself if there’s a good off-the-shelf option. There are several methods of picking up an object (re-parenting it to your hand, or using a physics joint, or applying velocities to have the object follow your hand). Ideally, you want a library that lets you choose between them, possibly even on a per-object basis.

There are several candidates in the object manipulation category, and they all work pretty much the same way – there’s a component you add to each of your controllers, and a component that you add to each object you want to interact with. Works well.
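The two-component pattern described above can be sketched like this. This is a hypothetical skeleton of the pattern, not any particular toolkit’s API:

```csharp
using UnityEngine;

// Hypothetical sketch of the common interaction pattern:
// an Interactor on each controller, an Interactable on each grabbable object.
public class Interactable : MonoBehaviour
{
    public virtual void OnGrabbed(Transform hand) { transform.SetParent(hand); }
    public virtual void OnReleased()              { transform.SetParent(null); }
}

public class Interactor : MonoBehaviour
{
    Interactable touching;  // the interactable our trigger volume overlaps

    void OnTriggerEnter(Collider other)
    {
        var i = other.GetComponent<Interactable>();
        if (i != null) touching = i;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.GetComponent<Interactable>() == touching) touching = null;
    }

    // Call these from your input code on grip press/release.
    public void TryGrab()    { if (touching != null) touching.OnGrabbed(transform); }
    public void TryRelease() { if (touching != null) touching.OnReleased(); }
}
```

The per-object component is what lets a library offer different pickup methods (parenting, joint, velocity-follow) on a per-object basis: each Interactable subclass just overrides OnGrabbed/OnReleased.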

However, almost all of the toolkits that provide this functionality have been abandoned and have no support for SteamVR 2. In this category are Newton VR, ViveGrip and EasyGrab. The only one that seems to be actively supported is VR Interaction. VR Interaction has its own input abstraction component called VR Input, which has support for Oculus, Steam VR 1 and partial support for Steam VR 2. I’m experimenting with it, and so far it looks okay. The developer is very responsive, which is nice.

Some also offer haptic feedback, and some can automatically generate sounds when objects collide. All very useful.

Higher-Level Interactables. Things like doors, drawers, knobs, levers, sliders, buttons and so on. A package that provides these can be a real time-saver. All the packages I’ve mentioned above have support for these, except (unfortunately) VR Interaction, which is the only “lightweight” package that’s actively supported.

User Interface. Interacting with traditional Unity interface elements, usually with some kind of virtual laser pointer. I haven’t researched the options yet, but I’ve tried a number of solutions and it doesn’t seem like a complex problem. There’s one called VR UIKit that uses the VRInput abstraction layer from VR Interaction, but I haven’t tried it yet.

Locomotion. This tends to be so game-specific that you’re better off rolling your own. Obviously there’s simple controller-based movement that you can implement yourself in an hour or two. There are also more interesting things like arm-swinging and walking-in-place. It’s really up to you, and there are several packages in the Unity asset store to use as a jumping-off point. Some packages also offer motion sickness mitigation techniques, such as vignetting.
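As a reference point for “implement it yourself in an hour or two”, here’s a sketch of simple thumbstick locomotion (hypothetical field names; uses the Unity XR input API mentioned earlier):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Simple smooth locomotion: move the rig on the ground plane
// in the direction the headset is facing.
public class SmoothLocomotion : MonoBehaviour
{
    public Transform head;    // the HMD camera transform
    public float speed = 2f;  // metres per second

    void Update()
    {
        var left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (!left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            return;

        // Project the head's orientation onto the ground so looking
        // up or down doesn't make you fly or burrow.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(head.right,   Vector3.up).normalized;

        transform.position += (forward * stick.y + right * stick.x) * speed * Time.deltaTime;
    }
}
```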

Teleportation. If your game uses it, it’s a pain to implement yourself. There are several packages in the asset store, including an Arc Teleporter that I’ve had good success with. It also uses the VR Input abstraction layer from VR Interaction.

Miscellaneous. In this category are things like fading the scene to black and back again during level transitions, or detecting when the user has put on or removed their headset. You can either roll your own or find small packages that do the job. For screen fading, there’s one called simply Screen Fade that’s cheap, works well and is completely cross-platform.
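Screen fading is one of the easier ones to roll yourself. A minimal sketch, assuming a full-screen black UI Image parented to the camera (field names are hypothetical):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Roll-your-own screen fade: animate the alpha of a black Image
// that covers the camera's view during level transitions.
public class ScreenFader : MonoBehaviour
{
    public Image fadeImage;        // full-screen black Image on a camera-attached canvas
    public float duration = 0.5f;  // seconds for a full fade

    public IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float a = Mathf.Lerp(from, to, t / duration);
            fadeImage.color = new Color(0f, 0f, 0f, a);
            yield return null;  // wait one frame
        }
        fadeImage.color = new Color(0f, 0f, 0f, to);
    }
}

// Usage: StartCoroutine(fader.Fade(0f, 1f));  // fade out to black
```

One VR-specific caveat: a screen-space overlay canvas won’t render in the headset, which is why the image needs to live on a canvas attached to the camera.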

Anyway, good luck. A lot of us are all looking for similar tools, and fortunately things are maturing quickly.

16 Likes

Thank you so much for putting in the effort to write this up. It mirrors a lot of the things I have been finding myself, and the object manipulation recommendations are exactly what I needed.

For the purposes of my project, I am using VRTK v4, as it hasn’t been interfering with anything else I have been coding for a prototype application. However, later down the line I am going to need to think about either sticking to a platform SDK or ripping out VRTK v4 and going fully native Unity, with minimal features that could interfere with current/future platform features.

Just wishing the decision of what to use was easier, or that there were a more long-term solution for when VRTK v4 funding could cease and support be dropped again like v3 :open_mouth:

Fired up Unreal Engine recently to check out their VR setup. They have preset teleportation and locomotion setups that work out of the box with any headset. Very impressed, and a bit disappointed that Unity doesn’t have a simple VR setup for new projects like it!

2 Likes

Even a simple blank demo scene with just locomotion and VR hands, using XR, would be a very welcome thing.

Hey guys,

Have a similar query, you can read a bit of some of my fumblings here…

After reading the options you’ve laid out, @BernieRoehl, I’m a bit sad. I had hoped to use VRTK as a good reference, but I did notice the v3/SteamVR and beta v4 situation, which has made me wonder if it’s the best way to go.

And reading through the rest, it seems like there will be a fair bit of fumbling for me as a newbie to get up and running. I wasn’t expecting such a challenge to get some basic mechanics to learn from and adapt using the built-in stuff; maybe my expectations are wrong?

I went through the new ‘Unity Learn’ trial and was excited to see a whole XR section; I thought it would take me through and get me up to speed. If you go into it, you’ll find a best-practices section, which is good, and then a bunch of stuff around VRTK (v4 beta), but nothing on their own native Unity XR and how to use it?!

What’s not helping is that, like @ROBYER1_1, I’ve fired up Unreal, and within 5 minutes I had a VR template up and running with nice mechanics based on their built-in VR support, zero issues.

Maybe Unity will get things a little more mature this year. I really would like to use it, but I don’t have unlimited time, and things just work out of the box with Unreal. I’m not saying Unreal will be easier or better for my projects, and as it stands I know far more about Unity, so I suspect that if I run into issues I’ll be able to solve them more easily in Unity. But who knows.

Choices…choices…

Found this, which, reading through the docs, can use either the Oculus / Steam / GVR integrations OR Unity’s native XR support, and might be a good reference point for a lot of VR things? Very recently updated and looks well supported at present?

1 Like

@Matt_D_work do your team have any thoughts on this? Getting some useful user feedback on VR templates here!

Some more feedback for you @Matt_D_work

Right now as a lone dev with some experience with Unity the learning process has gone like this for me…

  1. Look at the official manual and note that Unity has built-in XR support, so no need for 3rd-party bits from Oculus or Valve. Great, now I have my HMD tracking by ticking a single box, and that is awesome, to be honest.

  2. I want to get my controllers involved, so I read the XR input section and find the input mappings. After some googling around, I stumble on someone who shows how the mappings work, as it’s not that obvious (remember, I have some experience, not lots). Now I can get a button working. Awesome.

  3. Now, how can I track my controllers? More googling, as I can’t see anything in the manual. I find older articles and videos showing something called the ‘Tracked Pose Driver’. I download an example project, and all references to Tracked Pose Driver can’t be resolved. More googling, and I find out that for some reason it’s in a package called ‘XR Legacy Input Helpers’. That can’t be right, as it’s ‘legacy’; what’s the current supported way?

Come into the forum, read around for an hour or two, and start to realise it is the current way to track devices, but for some reason it’s called legacy. Ahhhh, that must be because of the new input system I keep hearing about. Great, I’ll use that, except it’s not production ready, nor are there helpful docs of any kind that make sense to me anyway.
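For anyone else who hits this wall: the Tracked Pose Driver can also be attached from code. A sketch, assuming the component as shipped in the ‘XR Legacy Input Helpers’ package around this time (verify the names against your package version):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;  // from the "XR Legacy Input Helpers" package

// Attach a Tracked Pose Driver so this GameObject (e.g. a controller
// model) follows the physical right-hand controller's pose.
public class ControllerTracking : MonoBehaviour
{
    void Start()
    {
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                             TrackedPoseDriver.TrackedPose.RightPose);
    }
}
```

The same component with TrackedPose.Center drives the HMD camera, which is what the “tick a single box” setup adds for you implicitly.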

  4. Take a step back and think: OK, for the moment maybe I should go with the official Oculus integration to make life a bit smoother while I’m learning; they should have good docs etc. I download and import the asset/integration and open the locomotion example, and it’s not working: I have no hands/controllers in VR. More googling, and I find out I need to enter Avatar IDs, which isn’t mentioned anywhere I could find.

So now I have hands, great. I start trying to teleport and instantly fall through the floor. More googling, and I discover a snippet of advice about syncing the physics timestep with your HMD refresh rate. I do this, and now I’m not falling through the floor. Result! Except now the teleporting randomly doesn’t work sometimes and just gets stuck, and there’s a weird delay before it. More messing with the locomotion component, and I think I’ve partially fixed things, but it’s still not perfect.
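For reference, the timestep advice mentioned above boils down to one line. A sketch using the XRDevice API of that era (refresh rate reporting can vary by headset and SDK, so the guard matters):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sync the physics timestep to the HMD refresh rate so physics steps
// line up with rendered frames (e.g. 1/90 s on a 90 Hz headset).
public class SyncPhysicsTimestep : MonoBehaviour
{
    void Start()
    {
        float hz = XRDevice.refreshRate;   // 0 if the device doesn't report it
        if (hz > 0f)
            Time.fixedDeltaTime = 1f / hz;
    }
}
```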

And that’s the story so far; I feel like I’m fighting just to get to the start line. I’ve been a dev in one way or another for years, have C# experience (albeit a bit old), and am not new to the Unity editor and so on (but no expert either; 3D is new for me, 2D work previously), so I totally understand that problem solving is part of things, but I wasn’t expecting this, to be honest.

I also read that the way SDKs power the built-in XR support is now changing and I need to set things up in a new way. Well, I THINK that’s what it was saying (XR Plugins and Subsystems); it just felt like another confusing part of things.

In between, I’ve been trying out UE4, which I had zero experience with, and have had a much better experience. I fire up the included VR template and I’ve got height control, locomotion, basic interaction etc. in Blueprints, nicely commented. I read this works out of the box for all major HMDs with no effort. The UE4 VR editor works fine as well; I did try Unity’s VR editor, but it failed to even start, and reading around there are lots of issues. I know Unity’s VR editor is experimental, but UE4’s is marked the same yet works out of the box. The manual is well written for VR in UE4. Also, there are forum threads like this…

https://forums.unrealengine.com/development-discussion/vr-ar-development/15238-getting-started-with-vr

I’ve also been shown a free educators’ guide for VR in UE4 containing best practices and setup that is exceptionally well written. And, while not something Epic have done directly, there’s an up-to-date book on VR in UE4 as well. I couldn’t find the same quality for Unity (and of the things you do find, some want you to use the 3rd-party SDKs; there’s not a lot, if anything, for the built-in stuff).

I’m not trying to point out how wonderful things are with UE4, and I’m sure there are lots of trade-offs and its fair share of problems, but these are things that, if Unity had them, would really help us get going. For me and my VR projects at the moment, I’ve paused with Unity and am seeing how I get on with UE4. Might be a terrible mistake, but so far that’s not been the case.

** EDIT ** I also tried to take the survey here (Unity Product Survey - AR/VR), as you’re asking for XR feedback specifically, but it doesn’t work and tells me it’s ‘invalid’.

** MORE EDITS ** I created a VR template from scratch in Unity to see how that went. I learned a lot by clearing out everything I was expecting, and I’m enjoying things a lot more using the built-in Unity XR features. UE4 progress is slow, and now that I’m over the initial bump with Unity, I have to say I’m progressing nicely and enjoying things.

1 Like

Fully agree with the above. I also had a pretty tough time trying to get started with VR using XR. It’s a nightmare trying to deal with sitting/room-scale recentering of cameras, fighting with cameras that transform themselves, and so on. The actual API isn’t so bad, and once it’s up and running it’s actually quite easy, but there are some horrible time wastes and trash to wade through just to get there. I thought the point of a game engine was to support us in developing? This is one area where Unity clearly fails in that ambition.

1 Like

@Innovine Good to know I’m not alone in the struggle but not good to go through it all.

I suppose what really dented my confidence with Unity is that I expect things like the built-in VR support and the Oculus Integration (for example) to work out of the box. When you immediately run into issues in an example scene (from the Oculus Integration, made to demo a concept), the first thing that starts running through my mind is: if this basic example scene has issues, how many other problems will I have? I had the same thoughts about the built-in Unity stuff; if there doesn’t seem to be a coherent approach, what state is everything in?

To counter that, I see many awesome things being made with VR and Unity, but is it tons of pain first as you stumble through what could be a much smoother process? And I thought exactly what you put about the engine helping support us. It’s doing some amazing things, but it’s almost like nobody’s given any thought to the process as a whole.

I don’t want to dismiss Unity as an option, and I invite any of the Unity team to make contact so I can explain things to help improve the on-boarding process, at least from a newcomer’s perspective. I’ll keep tinkering with Unity in any free time, but for now I’m more committed to trying out UE4, and in the spirit of fairness I will report back on that as well; there’s nothing to say something can’t be learned and applied across engines.

I can only agree with what you guys are saying; Unity seems strangely inactive / not bothered with VR. EditorXR seems dead. LWRP was recently made “production ready”, but I swear it wasn’t working in VR properly at the time? I’d appreciate the stuff Unreal has, but seeing as Unity has semi-abandoned the 3rd-person character controller they were working on, that doesn’t give me a lot of hope.

I am though hopeful for XRTK (cross platform fork of MRTK). I’m in the discord and they’re working super hard on it every day.

Virtuoso VSDK (based on VRTK 3) also seems interesting, but it’s just been released and I haven’t tried it yet.

I think there’s a whole bunch of Unity features that got pushed out half-baked and then mostly abandoned in the last two years. I’m wondering if they’re now driven by interesting-sounding ideas on paper, rather than serving us a quality platform to work with. Where’s all the ECS stuff? Where’s the SRP for production? Where’s camera stacking? Nested prefabs are a bit half-baked. Multiplayer? Gonna rewrite the input system againnn?? And XR, of course.

Yes, LWRP completely fails out of the box with VR, to the point that the editor itself renders in stereo on your desktop by the look of it. I’ve seen a few people report the same. I mean, since one of LWRP’s main use cases is VR (as I understand it), did no one test it? The fix is to change the stereo rendering mode to Single Pass. Here’s someone with the issue…

All I’ve learned so far is how to fix issues surrounding Unity and the Oculus Integration, some of it useful but most just silly things I didn’t want or need to know right now. I had hoped to be making basic stuff in VR and growing those skills.