(I made a similar post on the OculusVR forum too, sorry if you’re seeing it twice… haven’t had any responses there yet)
Hi, I’ve been working on my VR game for a few months. I started out using the Unity/Oculus “design, develop, and deploy for VR” tutorial as a base, and it’s been pretty good for setting up teleporting and grabbing objects. So far, like the tutorial, I’ve used a mix of Oculus SDK objects/prefabs and some from VRTK 3.3.
But now I want to add more things and I’m not sure which is the best approach going forward. For instance, one thing I’m trying to do is distance grabbing of objects. I notice there are distance grabbing examples in both the Oculus and the VRTK samples, but also in the XR plug-in system. I have managed to get most of what I need using the Oculus distance grabbing, but I’m a little concerned about being over-committed to one platform (I’m developing for Quest at the moment).
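For context, this is roughly the kind of selection logic I mean. It's a simplified sketch I put together to describe the behaviour, not the actual Oculus DistanceGrabber code, and the field names (handAnchor, maxGrabDistance, maxGrabAngle) are just my own:

```csharp
using UnityEngine;

// Simplified stand-in for "find a grabbable object at a distance":
// pick the closest rigidbody inside a cone in front of the hand.
public class SimpleDistanceGrabTarget : MonoBehaviour
{
    public Transform handAnchor;        // e.g. a hand anchor under the camera rig
    public float maxGrabDistance = 3f;  // how far away objects can be grabbed
    public float maxGrabAngle = 15f;    // half-angle of the selection cone, in degrees

    // Returns the closest candidate inside the cone, or null if none qualifies.
    public Rigidbody FindTarget(Rigidbody[] candidates)
    {
        Rigidbody best = null;
        float bestDist = maxGrabDistance;
        foreach (var rb in candidates)
        {
            Vector3 toObject = rb.position - handAnchor.position;
            float dist = toObject.magnitude;
            if (dist > bestDist) continue;                                        // too far, or farther than current best
            if (Vector3.Angle(handAnchor.forward, toObject) > maxGrabAngle) continue; // outside the cone
            best = rb;
            bestDist = dist;
        }
        return best;
    }
}
```

The Oculus sample does this (plus highlighting and the actual grab) with its DistanceGrabber/DistanceGrabbable components, which is what I'm currently using, and I can see similar functionality in the VRTK and XR samples, hence the question about which to commit to.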
I do see a ton of useful stuff in the VRTK documentation, but it’s not clear how to use most of it. I’ve also seen posts suggesting using Unity’s XR management for some things and the Oculus SDK for others. I’ve tried experimenting with that too, but I run into problems trying to use them together, since they both want to manage things independently.
Can anyone advise me on which SDKs they’d recommend using now? My sense is that I’d like to move to the XR management system eventually, but many people say it’s not quite ready yet. I also want to integrate hand tracking as soon as my scene is fully working — am I forced to use the Oculus SDK to get hand tracking working?
I’d also appreciate any guidance on how to learn whatever is recommended — whether that’s picking apart the samples, following YouTube tutorials, or working from the documentation.
Sorry if this post is a bit rambling — that’s also a bit how I feel after spending hours searching around and going in circles… any help/pointers would be much appreciated!
Thanks!