I have ordered and received the Samsung Gear VR, have set up my dev environment, and am able to create a project and deploy it to the Gear VR for testing.
One thing I'm stuck on at the moment, and haven't been able to find any simple tutorial or posts about, is how to detect a tap or swipe on the Gear VR touchpad.
I’ve tried using…
if (Input.touchCount > 0) {
    // Do something
}
within the Update method, but nothing at all seems to be detected.
Can anyone tell me how this should be done for the Samsung Gear VR?
I found the solution with help from “MikeF” (big thanks, mate!). So for those coming after me, here it is.
You need to correctly map the input before you can code against it in Unity. Go to Edit->Project Settings->Input. This will open the Input Manager in the Inspector. Make sure you have an entry that looks like this…(I named mine “Tap”)…
Hi Rameshp, I’m a new user too, so my advice might not be perfect, but here goes. You should create a new JS script and paste this code into it. Once that is done, drag and drop that JS file onto your game object. Every time your game object’s Update() function fires, it will check for the tap and tap-up events. Hope that helps.
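A minimal C# sketch of that kind of tap-check script might look roughly like this (the class name is mine, and it assumes the input mapping above was named “Tap”):

using UnityEngine;

// Checks the "Tap" mapping from Edit->Project Settings->Input every frame.
// Attach this to any GameObject in the scene.
public class TapDetector : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Tap"))
        {
            Debug.Log("Tap down");
        }
        if (Input.GetButtonUp("Tap"))
        {
            Debug.Log("Tap up");
        }
    }
}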
OK, so let me see if I got this. I have a simple NGUI panel with a button. Do I drag and drop the above script file onto the panel or onto the button? I want the user to be able to tap the button on the panel using the touchpad.
Hmm… that’s outside my experience at this point. Although my suggestion should detect a tap start and end, it will do so scene-wide. In other words, I believe that if you attach that code to the button, it won’t matter where the user is looking or which button is currently highlighted; as soon as a tap is detected, the button with the code attached will register it, even if the user is looking at another button or another button is highlighted. You’ll need someone with NGUI experience to solve that one, I’m afraid.
I have read a little about raycasting, and I think it MIGHT be possible to determine whether the button is in the centre of the direction the main camera is looking. In other words, when the attached script detects a tap, you could include some code in the same function to check whether the user is looking directly at the button. If they are, you perform whatever the button is supposed to do. Sorry I can’t be much more help, but it might be worth checking out raycasting.
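Something like this rough, untested sketch might be a starting point (it assumes the button has a collider, reuses the “Tap” mapping from earlier, and uses the scene’s main camera for the gaze direction):

using UnityEngine;

// Rough sketch of the gaze idea: on a tap, cast a ray straight out from the
// main camera and only react if it hits this button's collider.
public class GazeTapButton : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Tap"))
        {
            Ray gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
            RaycastHit hit;
            if (Physics.Raycast(gaze, out hit, 100.0f) && hit.collider.gameObject == gameObject)
            {
                Debug.Log("Tapped while looking at " + gameObject.name);
                // TODO: trigger whatever this button should do
            }
        }
    }
}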
I was finally able to use raycasting to select a single button using the user’s gaze. It was there in the Unity Integration SDK examples all along; you just need to refactor it to suit your needs. The snippet looks like the one below.
// Fields assumed on the containing script: cameraController (the camera controller
// from the SDK sample), buttons (an array of VrButton components) and
// activeButton (the VrButton currently being looked at).
Ray ray = new Ray();
if (cameraController.GetCameraRay(ref ray))
{
    // In the editor, use the mouse instead of the head-tracked camera ray:
    /*
    Camera currentCamera = null;
    cameraController.GetCamera(ref currentCamera);
    ray = currentCamera.ScreenPointToRay(Input.mousePosition);
    */

    // Drop the 'VrButton' script onto each button GameObject.
    VrButton lastActiveButton = activeButton;
    activeButton = null;
    RaycastHit hit = new RaycastHit();

    for (int i = 0; i < buttons.Length; i++)
    {
        // On Unity 5+ use buttons[i].GetComponent<Collider>() instead of the .collider shortcut.
        if (buttons[i].collider.Raycast(ray, out hit, 100.0f))
        {
            activeButton = buttons[i];
            if (activeButton != lastActiveButton)
            {
                Debug.Log("You're looking at the button: " + activeButton.commandId);
                PlaySound(vrbuttonSelectSound);
                PlayAnim("Button");
                // activeButton.GetComponent<VrButton>().defaultColor = Color.black;
            }
            break;
        }
    }
}
Reply back if you run into any issues or need more clarification. And thanks again for the great pointer, btw.
The touchpad on the Gear VR seems to be mapped as a mouse device.
This means you can use normal mouse input code for it.
Input.GetMouseButton(0) returns true while you have a finger on the touchpad, and false otherwise.
Input.GetAxis also works for the Gear VR touchpad: it correctly maps the pad’s x and y movement to the Mouse X and Mouse Y axes.
With this input it’s quite easy to write your own implementation of the simple gestures (tap, double tap, swipes in different directions); see the sketch below.
Input.touches does not work by default; I’m not aware of any configuration that fixes this.
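For example, a rough sketch of tap and horizontal-swipe detection on top of the mouse emulation could look like this (the class name and thresholds are just placeholders to tune):

using UnityEngine;

// Rough sketch: tap vs. horizontal swipe on the Gear VR touchpad,
// using the mouse emulation described above.
public class TouchpadGestures : MonoBehaviour
{
    const float swipeThreshold = 2.0f;   // accumulated Mouse X movement that counts as a swipe
    const float tapMaxDuration = 0.25f;  // seconds

    float pressTime;
    float accumulatedX;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))      // finger went down on the touchpad
        {
            pressTime = Time.time;
            accumulatedX = 0f;
        }

        if (Input.GetMouseButton(0))          // finger is on the touchpad
        {
            accumulatedX += Input.GetAxis("Mouse X");
        }

        if (Input.GetMouseButtonUp(0))        // finger lifted
        {
            if (Mathf.Abs(accumulatedX) > swipeThreshold)
            {
                Debug.Log(accumulatedX > 0f ? "Swipe right" : "Swipe left");
            }
            else if (Time.time - pressTime < tapMaxDuration)
            {
                Debug.Log("Tap");
            }
        }
    }
}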
Thanks for your answer. I also found that part of the script in the SDK.
Can you give a full example of how to use the raycasting script on a game object? That would be great. Also, I can’t find the VrButton script; it would be great if you could share that too!
// In Start(), initialise the Oculus mobile SDK touchpad helper and subscribe to its events:
OVRTouchpad.Create();
OVRTouchpad.TouchHandler += HandleTouchHandler;

// Event handler, called whenever the touchpad reports a gesture:
void HandleTouchHandler(object sender, System.EventArgs e)
{
    OVRTouchpad.TouchArgs touchArgs = (OVRTouchpad.TouchArgs)e;
    if (touchArgs.TouchType == OVRTouchpad.TouchEvent.SingleTap)
    {
        // TODO: Insert code here to handle a single tap. Note that there are other
        // TouchTypes you can check for, like directional swipes, but double tap is
        // not currently implemented, I believe.
    }
}
I can’t figure out why nothing happens in the game when I tap on the Gear VR.
I also tried other options, such as those suggested in the posts above, without results…
I know it’s probably a dumb thing; can anyone show me the way?
I pretty much followed the YouTube tutorial by everydayvr and am running Unity 5.1.2f1.
using UnityEngine;
using System.Collections;

public class MouseTap : MonoBehaviour
{
    void Update()
    {
        // Touchpad taps arrive as mouse button 0 / "Fire1"; Space is a handy fallback in the editor.
        if (Input.GetMouseButton(0) || Input.GetButtonDown("Fire1") || Input.GetKeyDown(KeyCode.Space))
        {
            Application.LoadLevel("01");
        }
    }
}
I ended up not following the tutorial, not importing the mobile SDK into my Unity project, and not doing what is mentioned on the forums, and it worked! I still don’t get it, but it works.
I’m now on Unity 5.2.2f1.
Basically, don’t import any mobile SDK; just copy your Oculus signature file into your android/asset folder once you’ve switched the platform to Android in the ‘Build Settings’ window, and build. It should work. Let me know how it goes!
And happy holidays!