I finally received my Vive this week (no dev kit this time, unfortunately) and, like the other assets in my signature, I’d like to create a high-level input asset so that no coding will be required for the common input-related things you’d want to do. I’ve played many of the games that are currently available to get some ideas, and this is what I’m aware of being useful at the moment.
Raycasting (both gaze-based and controller-based) so that you can select menu or world items with a laser pointer
Picking objects up (either by parenting objects or by introducing joints)
Teleporting (either instant or with a preview before committing)
Interacting with objects (button pushes, throwing objects, etc.)
Those are the items I know for sure. There are a few other ideas that I may do (Unity GUI input module, alternative movement, etc.) but I wanted to get feedback on what other devs would find useful.
Definitely post anything you’d like to see to let me know.
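For the laser-pointer raycasting item above, a minimal sketch of the idea in Unity might look like this. This is just an illustration, not the asset’s actual code; the `controller` reference is a placeholder for whatever Transform your VR plugin gives you (the controller for pointing, or the HMD for gaze).

```csharp
using UnityEngine;

// Minimal laser-pointer raycast from a controller (or HMD) transform.
// Placeholder sketch: wire up "controller" and "line" in the inspector.
public class LaserPointer : MonoBehaviour
{
    public Transform controller;     // controller transform (or HMD for gaze)
    public float maxDistance = 10f;
    public LineRenderer line;        // visual beam

    void Update()
    {
        Ray ray = new Ray(controller.position, controller.forward);
        Vector3 end = controller.position + controller.forward * maxDistance;

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            end = hit.point;
            // hit.collider is the menu or world item being pointed at
        }

        line.SetPosition(0, controller.position);
        line.SetPosition(1, end);
    }
}
```

The same component covers both the gaze and controller cases, since the only difference is which Transform you feed it.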
That sounds awesome. I won’t be getting a Vive for a while so I don’t have relevant input, other than that your ideas sound really cool. Will you be charging for the package? Just curious. Good luck. Can’t wait to see it in action.
Yeah, my assets usually end up around $35. Usually I’m able to get a dev kit (like I did for the Apple TV) so that the full product is ready by launch day. This time I wasn’t able to, so maybe sometime this summer; I’ll be getting the asset done item by item and will probably release it with a subset of the final features.
Because at first it will be a subset, I’ll probably launch it at $15–$20 and then raise the price when it’s feature complete. That will also be a nice bonus for those that adopt early.
I just watched a video of it; unless I’m missing something, it’s just using gaze-based raycasting, with some objects in the scene marked as valid places to move to. It seems similar to The Lab’s countryside demo, except instead of using the hand controllers you use the HMD.
I definitely will include something like this. Thanks for the suggestion.
I’m a bit confused by this suggestion; can you elaborate? The Vive is room scale, so if you want to swing a sword, you simply swing the sword, but perhaps I’m missing something.
You should check out the existing interaction projects going on and figure out what needs improvement. There’s one set of examples floating around from an early game jam right after the initial dev kits came out, called “vive interactions”, which has a scene with throwable objects with velocity estimation, and a really sick mechanic for scrubbing through an animation by dragging a lever.
Then check out NewtonVR physics interaction…
Then figure out what you can improve on. As is, there are basic control scripts in the SteamVR package for a laser pointer and access to all the buttons.
edit: Also, right now it’s kind of a pain because everyone is extending the example controllers for their own purposes, so if you use the uGUI plugin + NewtonVR + something else, it can turn into a mess of similar scripts that are slightly different and have to be merged; then when each of them releases a new update, you have to re-merge everything. It’s a pain.
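The throwable-objects-with-velocity-estimation mechanic mentioned above usually boils down to sampling the grabbed object’s motion over the last few frames and applying the average as the rigidbody velocity on release. A rough sketch (assuming you call `GrabEnd` from your controller’s release event; the names are illustrative):

```csharp
using UnityEngine;

// Rough velocity estimation for throwing: average the grabbed object's
// motion over the last few physics frames, then apply it on release.
public class ThrowVelocity : MonoBehaviour
{
    Vector3[] samples = new Vector3[5];  // small ring buffer of per-frame velocities
    int index;
    Vector3 lastPos;

    void FixedUpdate()
    {
        samples[index] = (transform.position - lastPos) / Time.fixedDeltaTime;
        index = (index + 1) % samples.Length;
        lastPos = transform.position;
    }

    // Call this when the grab button is released.
    public void GrabEnd(Rigidbody body)
    {
        Vector3 avg = Vector3.zero;
        foreach (Vector3 s in samples) avg += s;
        body.velocity = avg / samples.Length;
    }
}
```

Averaging a few frames instead of using just the last one smooths out tracking jitter, which is why thrown objects feel less erratic with this approach.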
This is exactly why I’m creating this. I know there are tons of learning examples (like the Steam plugin you mention), but nothing polished, and everyone tends to do things a bit differently. For example, the throwing example uses a fixed joint; sometimes this is helpful, and sometimes you just want to parent the object. For things like picking up, sometimes you want to highlight the object to indicate that it can be picked up. Basically, the input asset I’m making will be a set of input scripts with easy-to-use inspectors and the options you would expect from a polished asset (not a learning resource). At the end of the day, I hope that in addition to saving programmers time, I can open up Vive input to artists as well, which would currently be a struggle with the learning resources out there.
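The parenting-versus-fixed-joint trade-off described above can be sketched like this. This is only an illustration of the two styles, not the asset’s actual implementation; `Grab` is a hypothetical entry point you would call from your controller script.

```csharp
using UnityEngine;

// Sketch of the two pickup styles discussed: parenting vs. a FixedJoint.
public class Pickup : MonoBehaviour
{
    public enum GrabMode { Parent, Joint }
    public GrabMode mode = GrabMode.Parent;

    // Call from the controller script when the grab button is pressed.
    public void Grab(Rigidbody item, Rigidbody controller)
    {
        if (mode == GrabMode.Parent)
        {
            item.isKinematic = true;   // let the parent transform drive the object
            item.transform.SetParent(controller.transform);
        }
        else
        {
            // Physics keeps the object attached, so it can still collide
            // and push against the world while held.
            FixedJoint joint = controller.gameObject.AddComponent<FixedJoint>();
            joint.connectedBody = item;
        }
    }
}
```

Parenting gives a rigid, cheap attachment; the joint keeps the object in the physics simulation, which matters when held objects should collide with the world.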
Not entirely input related, but it would be nice to have a simple way to give yourself a basic body that you can see when looking down (no feet, of course), with arms that connect from the body out to the positions of the controllers.
There is generally quite a bit involved with IK, so I don’t think I’ll be tackling that one. That being said, Final IK on the Asset Store is a great product (I have no affiliation with the author) and can be made to do this with just minor integration of the HMD and controller transforms. In the land of IK, it works quite well for just about any problem you throw at it.
I would check out TheStoneFox on YouTube. He’s got SteamVR Toolkit on GitHub as well, which is strong…
It would be nice, though, to have some nicely modelled hands to easily replace the controller models in VR, and maybe a grip-driven FPS-style move function (should be linear and only follow your gaze, to avoid motion sickness) in case you get bored with that teleporting stuff all the time. Even though, I must say, teleporting works much better than I expected before trying it: it feels quite natural in VR after a while, so it doesn’t break immersion the way I suspected beforehand.
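The grip-driven, gaze-following movement described above could be sketched as follows. The names here are placeholders (in particular, `gripHeld` stands in for whatever grip-button state your VR plugin exposes); the key comfort choices are the constant linear speed and the flattened gaze direction.

```csharp
using UnityEngine;

// Grip-driven locomotion sketch: while the grip is held, slide the play
// area at a constant speed along the flattened gaze direction. Constant
// (non-accelerating) motion is the comfort-friendly option described above.
public class GazeMove : MonoBehaviour
{
    public Transform head;        // HMD transform
    public Transform playArea;    // rig root to move
    public float speed = 2f;      // metres per second, constant

    public bool gripHeld;         // set from your controller input each frame

    void Update()
    {
        if (!gripHeld) return;
        Vector3 dir = head.forward;
        dir.y = 0f;               // keep movement horizontal
        playArea.position += dir.normalized * speed * Time.deltaTime;
    }
}
```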