So this forum is full of super smart folks like you who likely know way more about VR than I do! But we also realised that there are a lot of people out there who haven’t dabbled in VR yet and may well want to get started. So we created a set of sample mini-games and a menu that we call VR Samples (does what it says!). The project also has some re-usable sample components and fully commented code throughout.
We have begun by supporting the Oculus DK2 and the consumer GearVR, so if you have a Samsung S6 or Note 5 you can jump on this right now and try it out. Download it from the Asset Store here -
We worked with our good friend Iestyn Lloyd ( @yezzer ) on this project and he also put together a set of articles to help you get started on our Learn site here -
Thank you for that work!
Honestly, I think it can be as helpful to new VR developers as to VR “pro” developers (like me). Actually, I have spent the past months dealing with exactly the same issues you resolve here (like the “hold to confirm” interaction, UI display suitable for VR, or optimised text display). And my solutions were far less beautiful than yours.
Thank you for making it easy for anyone to jump into VR
Awesome work Unity Team, thank you! Is there a way to use this with non-GearVR phones like a Samsung S5, the Google Tango device, or other devices? Possibly we could set VRDeviceType.None, VRDeviceType.Stereo, or VRDeviceType.Split; in that case I would think the Oculus SDK is not needed, so I could avoid the error: “[VRDevice] Initialization of device oculus failed.”
We develop VR and teach VR development, and those without GearVR-capable phones readily available would not be able to jump in on these demos.
Is there a way to use UnityEngine.VR without the Oculus SDK for non-Oculus Development? For example, we want to render VR but don’t need any of the Oculus distortion corrections running.
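To be concrete, something like this is what I had in mind - an untested sketch using the VRDeviceType values mentioned above (the class name is my own invention), which in principle should give plain side-by-side stereo with no Oculus distortion pass:

```csharp
using UnityEngine;
using UnityEngine.VR;

// Hypothetical sketch: select a non-Oculus stereo device at runtime so
// the Oculus SDK is never initialised. VRDeviceType.Split renders a
// simple side-by-side stereo pair with no Oculus distortion correction.
public class NonOculusVRSetup : MonoBehaviour
{
    void Awake()
    {
        // Only switch if we are not already on the device we want.
        if (VRSettings.loadedDevice != VRDeviceType.Split)
            VRSettings.loadedDevice = VRDeviceType.Split;

        VRSettings.enabled = true;
    }
}
```

Dropping this on an object in the first scene would, I think, avoid the “Initialization of device oculus failed” error entirely, but I haven’t been able to verify it on a non-GearVR device yet.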
Just when I was diving into controlling the Gear VR with the touchpad, this comes along. Very thorough and perfectly understandable even to a senile old COBOL programmer. It would be great to also have an example of how best to utilise the touchpad on Gear VR in an FPS game. I have got some swiping on the parent of the camera to function, but not quite as nicely as in the child adventure game Finding VR. Would love to see a script implementing Gear VR touchpad functionality similar to the way it’s done in that title.
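For reference, my current rough attempt at “swiping on the parent of the camera” looks something like this. It assumes the Gear VR touchpad shows up as mouse input in Unity (which it appears to), so treat it as a sketch rather than anything polished:

```csharp
using UnityEngine;

// Rough swipe-to-turn sketch for the Gear VR touchpad. On Gear VR the
// touchpad is reported to Unity as mouse input, so a horizontal drag
// shows up on the "Mouse X" axis. Attach to the parent of the camera.
public class TouchpadTurn : MonoBehaviour
{
    public float turnSpeed = 90f; // degrees per second at full drag speed

    void Update()
    {
        // GetMouseButton(0) is true while a finger rests on the pad.
        if (Input.GetMouseButton(0))
        {
            float drag = Input.GetAxis("Mouse X");
            transform.Rotate(0f, drag * turnSpeed * Time.deltaTime, 0f);
        }
    }
}
```

It works, but the turning feels abrupt compared to what the commercial titles do - presumably they smooth or snap the rotation, which is exactly the kind of thing I’d love to see an official example of.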
Thank you so much guys, this is huge! I just have one small request. Do you think you could also do an additional scene with various UI elements that you select for the Gear VR? What I am trying to describe is a character selection screen that shows various animated models. Perhaps when you select a button a character appears, their animation auto-plays, and a walking sound plays. If you select another button, another character would be selected. If I could do this simple thing, I think I could do anything! I have spent weeks and still can’t make this happen.
PS. Another nice thing would be an intuitive FPS movement/interaction script via only the Gear VR touchpad, like the poster above mentioned. Maybe also a fun project with an object interaction script that works similarly to that DK2 demo “I Expect You to Die” - that was a pretty awesome object interaction idea without motion controllers available. And since Gear VR won’t have input like that for a while, it could be really useful.
I am kind of glad that some systems I was using (like the crosshair solution, or input from view) have now been built directly by you (and in a more polished way). I’m going to update my systems to work from your new basis, and I will use your scripts as a starting point for good practices.
@willgoldstone – thanks for your work on VR support and for the examples. So at this point, is there any reason to use the Oculus Utilities for GearVR development? The built-in support seems complete, but I am wondering if I miss out on anything by not using the Oculus code. Thanks!
@stevenbrent, the built-in support doesn’t appear to have all the FPS movement stuff or the various key mappings that the Oculus Utilities include, so you may still need them for the time being. I hope the samples don’t conflict with the Utilities, but I will find out tonight.
I’m guessing something similar to the Oculus Utilities will get fully integrated by default eventually? Hopefully one day it will come with multiple options for locomotion - standard FPS movement, teleportation, full tank modes, “go where you look”, comfort turning modes, etc. Basically, many of the settings that are preferences for users in VR that many of us are not experts on and would have no idea how to program for.
Thanks, @brentjohnson – Seems like the prefabs provided with the Oculus Utilities (OVRPlayerController and OVRCameraRig) take a lot of the work out of setting up the player and the controller. I do see a bunch of comparable scripts (camera raycaster stuff etc) in the Unity samples - so maybe one can get on fine without the Oculus Utilities. And you can import the VR Samples project when you create a new project now. So for me, the learning curve will be in using / modifying the native scripts… Please let me know how you get on with using the Oculus Utilities in the latest Unity build. Thanks!
UPDATE (a few hours later) - The scenes in VRSampleScenes > Scenes > Examples are great. Each one demonstrates the use of the scripts found under VRStandardAssets > Scripts in a really clear way, without any distractions. The only thing missing is a first-person character controller (e.g. for a Bluetooth gamepad), but I think that can be done pretty easily by using a generic GameObject, mounting the MainCamera to it, and then attaching a simple script to listen for gamepad input. I will test this theory out tomorrow!
UPDATE #2 (Friday night) - So the player controller setup works fine as described above. I just need to get my head around the fact that head orientation does not equal player orientation, and then decide whether I want the player rotation to change with head orientation or use the gamepad’s right stick for that.
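In case it helps anyone else trying the same setup, the gist of what I’m testing is below. The names are my own (not from the samples), and the right-stick axis is a guess that has to be configured in the Input Manager, so consider it a sketch:

```csharp
using UnityEngine;

// Sketch of the player rig described above: an empty GameObject with
// the MainCamera as a child. Movement is driven relative to the head's
// yaw; the right stick optionally turns the body independently.
public class SimpleVRWalker : MonoBehaviour
{
    public Transform head;        // the MainCamera (child of this object)
    public float moveSpeed = 2f;  // metres per second
    public float turnSpeed = 60f; // degrees per second on the right stick

    void Update()
    {
        // Flatten the head's facing direction onto the ground plane so
        // "forward" means "where I'm looking", ignoring pitch.
        Vector3 forward = head.forward;
        forward.y = 0f;
        forward.Normalize();
        Vector3 right = Vector3.Cross(Vector3.up, forward);

        // Default Input Manager axes map to the gamepad's left stick.
        float v = Input.GetAxis("Vertical");
        float h = Input.GetAxis("Horizontal");
        transform.position += (forward * v + right * h) * moveSpeed * Time.deltaTime;

        // Optional body rotation on the right stick. "RightStickX" is a
        // custom axis name - it must be added in the Input Manager.
        float turn = Input.GetAxis("RightStickX");
        transform.Rotate(0f, turn * turnSpeed * Time.deltaTime, 0f);
    }
}
```

With this, the head steers movement direction while the right stick rotates the body - swap the `head.forward` for `transform.forward` if you’d rather decouple movement from gaze entirely.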
It appears the new Unity 5.2.3p3 patch that fixes a long-standing VR performance bug broke the project. Under the MainCamera, all the scripts show “The associated script can not be loaded. Please fix any compile errors and assign a valid script.”
@stevenbrent regarding the Oculus Utilities, it depends what you’re doing, I guess! If you still find a need for them, use them! Regarding first person - we don’t have a first-person controller for VR in the samples, for the reasons we list in the accompanying article regarding Vection.
Did you get a chance to read those yet?
@everyone our CMS for the learning content is currently broken for GIF playback, so I wanted to apologise to those of you trying to follow along without the clear examples the animated GIFs provide. We are hoping to get a fix shipped early next week.
@brentjohnson you appear to be discussing 5.2p3, which suggests you are opening this project in that version? If so - don’t. We shipped it in 5.3 deliberately, so downgrading the project may mean a loss of references, which I think is what you are describing? Sorry, your message was a little confusing!
Thanks, @Will-Goldstone - I’ll check out the info on Vection. (for anyone who’s looking for that: http://unity3d.com/learn/tutorials/topics/virtual-reality/movement-vr?playlist=22946 ) Even just at my novice level of experience, I can see how there is a whole new art & science of player control to learn here, very different in some ways from even traditional 3D FPS controls. It seems like doing as much as possible with head movement and gaze detection is a good idea in general. I tried out Dreadhalls from the Oculus store last night, and was definitely feeling the negative effects of independent head / virtual body movement while my physical body was stuck in the chair!
Thanks for this! As someone just getting their feet wet, this appears to be a toolset that’ll equip me to make whatever I want.
Minor Unity question: inside the editor, I ‘checked for updates’ and it said there was nothing, but the website shows 5.3. Is there a reason for the delay? Do you only put it in the client when you’re sure it isn’t lighting things on fire?
I’ve been having issues parenting the camera rig in the examples to a basic first-person controller. It seems like the reticle becomes offset or just doesn’t work at all once it’s parented. My gut instinct is that the VREyeRaycaster is written for a stationary camera, but I could be wrong, as I’m not a programmer.