Hi there,
I am having a really strange experience. I am trying to build a Gear VR app that lets the user look at a menu, and after a certain amount of time a raycast event tells the app to enable a GameObject. In the editor this works absolutely fine, but in the built app it never seems to work. I don't know whether it is something to do with the plugins or with my code, but I am pulling my hair out a bit here!
void cameraRayCast()
{
    RaycastHit hit;
    Ray ray = new Ray(transform.position, transform.forward);

    // Collider.Raycast only tests against this one collider (firstTrigger).
    if (firstTrigger.Raycast(ray, out hit, Mathf.Infinity))
    {
        gm.bp1 = true;
        // DrawLine wants an end point, not a direction, so extend out along the ray.
        Debug.DrawLine(transform.position, transform.position + transform.forward * 100f, Color.red);
    }
    else
    {
        gm.bp1 = false;
        gm.bp1time = 4.0f;
    }
}
The idea is that the user has to look at the collider for a certain amount of time for it to trigger.
In the gm script that is referenced, this is called in Update():
if (bp1)
{
    bp1time = bp1time - Time.deltaTime;
    timer = bp1time;
    Debug.Log("bp1");
}

if (timer <= dwellTime)
{
    if (bp1)
    {
        bp1 = false;
        barPlane1.startFade = true;
        Debug.Log("bp1 timer change");
        Invoke("enableSphere1", timerSphere);
    }
}
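To make the intent clearer, here is the same gaze-dwell idea boiled down into one self-contained component. Treat it as a rough sketch rather than my actual code: the names GazeDwell, target, dwellSeconds and onDwell are placeholders, not my real fields, and it simplifies the timer to a plain countdown.

using UnityEngine;
using UnityEngine.Events;

// Gaze-dwell sketch: attach to the (head-tracked) camera, point "target"
// at the collider to watch, and onDwell fires once the user has looked
// at it continuously for dwellSeconds.
public class GazeDwell : MonoBehaviour
{
    public Collider target;          // the collider the user has to look at
    public float dwellSeconds = 4f;  // how long they must keep looking
    public UnityEvent onDwell;       // e.g. enable the sphere / start the fade

    private float remaining;
    private bool fired;

    void OnEnable()
    {
        remaining = dwellSeconds;
        fired = false;
    }

    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;

        if (target.Raycast(ray, out hit, Mathf.Infinity))
        {
            remaining -= Time.deltaTime;   // count down while gaze stays on target
            if (remaining <= 0f && !fired)
            {
                fired = true;
                onDwell.Invoke();          // trigger whatever should happen
            }
        }
        else
        {
            remaining = dwellSeconds;      // reset as soon as gaze leaves the collider
        }
    }
}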
Any and all suggestions are welcome. I've tried Unity 5.1 and Unity 4, and I am also using the OVR controller, but I have created successful raycast events with it in the past.
Thanks,
Tom
EDIT: Solved my problem. I had attached the raycast script to the camera rig, and attaching it to one of my cameras instead fixed it.
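For anyone who hits the same issue: presumably the rig's root transform doesn't get the head rotation on device, only the eye anchors/cameras do, so the ray kept firing in one fixed direction. If you'd rather not move the script, raycasting from the eye camera's transform should work too. Rough sketch below, assuming the eye camera is tagged MainCamera (older OVR rigs with separate left/right eye cameras may need a direct reference instead):

void cameraRayCast()
{
    // Use the eye camera's transform as the gaze origin, since it is the
    // transform that actually tracks head rotation on device.
    Transform gaze = Camera.main != null ? Camera.main.transform : transform;

    RaycastHit hit;
    Ray ray = new Ray(gaze.position, gaze.forward);

    if (firstTrigger.Raycast(ray, out hit, Mathf.Infinity))
    {
        gm.bp1 = true;
    }
    else
    {
        gm.bp1 = false;
        gm.bp1time = 4.0f;
    }
}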