I’ve been researching this very topic quite a bit recently. 
It really depends on what functionality you need. Do you want one package that does everything, or do you want to assemble your own custom solution out of separate pieces?
If you’re looking for an all-in-one solution, there are only two choices that I’ve found.
VRTK is a “swiss army knife” that does everything. I’ve used VRTK version 3 on a project, and actually wrote a book chapter on creating multi-user social environments built with VRTK and PUN.
However, I agree with @OleJuergensen that VRTK 3 is fine if you do things their way, but if you ever need to do things differently then it will fight you every step of the way. I ended up spending several days ripping VRTK out of my project and writing my own interaction code. I'm much happier with the result.
You should also know (and I’m sure you do already) that VRTK 3 was abandoned by the developer, and since it doesn’t support the current version of SteamVR it’s really a dead end. Oculus has paid that developer to spend six months developing VRTK 4, but it’s still missing a lot of features and has very limited documentation. There’s also the question of what happens when the money runs out.
The SteamVR Interaction Toolkit is awesome. It’s well-designed, and appears to be inspired by NewtonVR. I’ve used it on one small project, and was very happy. It’s also being supported by Valve, so it won’t be going away anytime soon. However, the downside is that it’s tightly integrated with SteamVR 2. If you’re developing for another platform (e.g. Quest) it’s simply not an option. The amount of work you’d have to do to generalize it is just not worth it.
The trouble with these all-in-one solutions is that they’re hard to customize for your game or port to new platforms. So, if you’re looking to assemble something yourself, what parts do you need? Here’s a list…
Input Abstraction. You want to be able to move your code from platform to platform without having to completely rewrite everything. That’s the idea behind the Unity XR input system, which is pretty good. However, it doesn’t seem to play nice with SteamVR. If you make even one call to SteamVR 2, the Unity Input system stops recognizing your device inputs. Not sure if that’s Unity’s fault or Valve’s, but it’s definitely annoying.
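For reference, polling a controller through the Unity XR input system is only a few lines. Here's a minimal sketch (the class name is mine; the `InputDevices` / `CommonUsages` API is Unity's):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: poll the right-hand controller's trigger through
// Unity's XR input abstraction. This is platform-independent -- but
// note the SteamVR 2 caveat above.
public class TriggerPoll : MonoBehaviour
{
    void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.isValid &&
            rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
            pressed)
        {
            Debug.Log("Trigger pressed");
        }
    }
}
```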
Object Manipulation. The code for grabbing and throwing virtual objects is just complicated enough that you don’t want to have to implement it yourself if there’s a good off-the-shelf option. There are several methods of picking up an object (re-parenting it to your hand, or using a physics joint, or applying velocities to have the object follow your hand). Ideally, you want a library that lets you choose between them, possibly even on a per-object basis.
There are several candidates in the object manipulation category, and they all work pretty much the same way – there’s a component you add to each of your controllers, and a component that you add to each object you want to interact with. Works well.
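To give you a feel for the velocity-following approach (the one that plays nicest with physics), here's a rough sketch of the core idea. The class and field names are mine, not from any of these packages:

```csharp
using UnityEngine;

// Rough sketch of a velocity-following grab: instead of re-parenting
// the object, each FixedUpdate we set the rigidbody's velocities so
// it chases the hand. Collisions with the world still resolve naturally.
public class VelocityFollowGrab : MonoBehaviour
{
    public Transform hand;          // the controller transform (assigned on grab)
    Rigidbody rb;

    void Awake() { rb = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        if (hand == null) return;

        // Linear velocity toward the hand position.
        Vector3 delta = hand.position - rb.position;
        rb.velocity = delta / Time.fixedDeltaTime;

        // Angular velocity toward the hand rotation.
        Quaternion deltaRot = hand.rotation * Quaternion.Inverse(rb.rotation);
        deltaRot.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        rb.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```

A nice side effect of this method is that releasing the grab (clearing `hand`) leaves the rigidbody with its current velocities, which gives you throwing more or less for free.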
However, almost all of the toolkits that provide this functionality have been abandoned and have no support for SteamVR 2. In this category are Newton VR, ViveGrip and EasyGrab. The only one that seems to be actively supported is VR Interaction. VR Interaction has its own input abstraction component called VR Input, which has support for Oculus, Steam VR 1 and partial support for Steam VR 2. I’m experimenting with it, and so far it looks okay. The developer is very responsive, which is nice.
Some also offer haptic feedback, and some can automatically generate sounds when objects collide. All very useful.
Higher-Level Interactables. Things like doors, drawers, knobs, levers, sliders, buttons and so on. A package that provides these can be a real time-saver. All the packages I’ve mentioned above have support for these, except (unfortunately) VR Interaction, which is the only “lightweight” package that’s actively supported.
User Interface. Interacting with traditional Unity interface elements, usually with some kind of virtual laser pointer. I haven't researched the options in depth, but from the solutions I've tried it doesn't seem like a complex problem. There's one called VR UIKit that uses the VRInput abstraction layer from VR Interaction, but I haven't tried it yet.
Locomotion. This tends to be so game-specific that you’re better off rolling your own. Obviously there’s simple controller-based movement that you can implement yourself in an hour or two. There are also more interesting things like arm-swinging and walking-in-place. It’s really up to you, and there are several packages in the Unity asset store to use as a jumping-off point. Some packages also offer motion sickness mitigation techniques, such as vignetting.
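As an example of how simple the basic version is, here's a sketch of head-relative thumbstick movement. It assumes a `CharacterController` on the rig and the Unity XR input API; the names are mine:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of simple controller-based locomotion: move along the ground
// in the direction the left thumbstick points, relative to where the
// headset is facing.
public class StickLocomotion : MonoBehaviour
{
    public Transform head;               // the camera / HMD transform
    public float speed = 2f;             // metres per second
    CharacterController body;

    void Awake() { body = GetComponent<CharacterController>(); }

    void Update()
    {
        InputDevice left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (!left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            return;

        // Project the head's facing onto the ground plane so looking
        // up or down doesn't change your speed.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);

        body.SimpleMove((forward * stick.y + right * stick.x) * speed);
    }
}
```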
Teleportation. If your game uses it, it's a pain to implement yourself. There are several packages in the asset store, including an Arc Teleporter that I've had good success with. It also uses the VR Input abstraction layer from VR Interaction.
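If you do end up rolling your own, the arc itself is just projectile motion sampled into segments. A sketch of the core (constants and names are mine, not from any package):

```csharp
using UnityEngine;

// Sketch of the core of an arc teleporter: step a projectile path
// forward from the controller and stop at the first surface hit.
public static class TeleportArc
{
    public static bool Cast(Vector3 origin, Vector3 direction,
                            float launchSpeed, out Vector3 landing)
    {
        const float step = 0.05f;        // seconds per sample
        const int maxSteps = 100;

        Vector3 pos = origin;
        Vector3 vel = direction.normalized * launchSpeed;

        for (int i = 0; i < maxSteps; i++)
        {
            Vector3 next = pos + vel * step;
            if (Physics.Linecast(pos, next, out RaycastHit hit))
            {
                landing = hit.point;     // e.g. check hit.normal for valid floors
                return true;
            }
            vel += Physics.gravity * step;
            pos = next;
        }

        landing = pos;
        return false;
    }
}
```

The same sampled points double as the vertices of a `LineRenderer` for drawing the arc.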
Miscellaneous. In this category are things like fading the scene to black and back again during level transitions, or detecting when the user has put on or removed their headset. You can either roll your own or find small packages that do the job. For screen fading, there's one called simply Screen Fade that's cheap, works well and is completely cross-platform.
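Rolling your own fade is also only a screenful of code: a full-screen black Image on a canvas attached to the camera, with a coroutine lerping its alpha. A sketch (wiring up `fadeImage` in the scene is up to you):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch of a cross-platform screen fade: a full-screen black Image
// on a canvas parented to the camera, alpha-lerped by a coroutine.
// No platform-specific compositor calls needed.
public class ScreenFader : MonoBehaviour
{
    public Image fadeImage;              // full-screen black Image, alpha starts at 0

    public IEnumerator FadeTo(float targetAlpha, float duration)
    {
        float start = fadeImage.color.a;
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float a = Mathf.Lerp(start, targetAlpha, t / duration);
            fadeImage.color = new Color(0f, 0f, 0f, a);
            yield return null;           // wait one frame
        }
        fadeImage.color = new Color(0f, 0f, 0f, targetAlpha);
    }
}

// Usage: StartCoroutine(fader.FadeTo(1f, 0.3f));  // fade out
//        // ...load level...
//        StartCoroutine(fader.FadeTo(0f, 0.3f));  // fade in
```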
Anyway, good luck. A lot of us are looking for similar tools, and fortunately things are maturing quickly.