Cardboard VR button

How do you interact with the Cardboard button?
How do you change the scene in VR to Cardboard?

Hey

Re. the button, it depends on the Cardboard version. If it’s the physical touch-the-screen type (later Cardboard), it registers as a touch event, so Input treats it as a mouse click. Earlier versions need a magnet-sensing version of the Cardboard library, or you can retrofit one. I can help with links if it’s not easy to Google.
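For the touch-the-screen type, a minimal sketch of what I mean (assuming Unity’s standard Input class; the capacitive lever just taps the screen, so no Cardboard-specific API is needed):

```csharp
using UnityEngine;

public class CardboardTouchButton : MonoBehaviour
{
    void Update()
    {
        // Cardboard 2.0's capacitive lever physically taps the screen, so it
        // arrives as an ordinary touch, which Unity also reports as the
        // primary mouse button.
        if (Input.GetMouseButtonDown(0))
        {
            Debug.Log("Cardboard button pressed");
        }
    }
}
```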

Re. your second question, I’m not sure what you mean. Maybe set Cardboard as your first VR SDK in Player Settings, then turn on VR? I won’t go further in case that’s not your question. Could you expand a bit?
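If that IS what you meant, switching to Cardboard at runtime looks roughly like this (a sketch using UnityEngine.XR.XRSettings; the "cardboard" device name assumes the Google VR SDK is installed and listed in your Virtual Reality SDKs):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

public class EnableCardboard : MonoBehaviour
{
    IEnumerator Start()
    {
        // Load the Cardboard VR device, then wait one frame --
        // the device isn't active until the frame after loading.
        XRSettings.LoadDeviceByName("cardboard");
        yield return null;
        XRSettings.enabled = true;
    }
}
```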

HTH


There are a few points to consider when it comes to supporting Cardboard 1.0 magnetic buttons:

  • Google officially removed support for them after deciding they caused too many problems (more on this in a moment), so if you want to support them, you’re going to have to implement it yourself.
  • The magnetometer is an extremely “noisy” sensor. On my Nexus 6P with a Cardboard 1.0-like headset, pressing the button causes the x and y values in the Vector3 returned by Input.compass.rawVector to increase by ~5-10 units. The catch is, rawVector’s values NORMALLY seem to vary by approximately +/- 0.5 to 2 ANYWAY.
  • At 60fps, reading it in every Update(), you have to sample over a period of several frames. You might be able to detect a “user presses button, then allows it to quickly snap back” within 3 frames. Confidently detecting a slower (but still rapid) press takes ~7-9 frames. Anything less, and the “change due to moving the magnet” will be smaller than the moment-to-moment random variation.
  • At some level, you have to consider whether the user is moving the phone. Entire books have been written about making sense of gyro and accelerometer data. TL;DR: it’s WAY harder than reading a single magic value. Neither is trustworthy alone, and they’ll collectively drift over time without a third reference point. Catch-22: the third reference point is usually the magnetometer, which the magnet constantly throws off.
  • Many/most lower-end Android phones (the ones you and I wouldn’t touch with a dirty pole, but “normal” users buy in blister packs at Walmart for $49.99) lack real gyroscopes… they use the magnetometer as the accelerometer’s second reference point. I think you might see the problem here… on THESE phones, using the button will SERIOUSLY disrupt the VR camera’s ability to track head movements.
  • It doesn’t personally affect MY phone (Nexus 6P) and ROM (LineageOS 15), but I’ve read that there are SOME phones/ROMs with a discrete gyro that aggressively use sensor fusion and are almost as badly affected by magnetic-field disruptions as phones without discrete gyros.
  • From what I’ve read, some phones work poorly/not-at-all with the magnet on the left. Others work poorly/not-at-all with the magnet on the right. I can’t name any specific ones, but I suppose it’s not inconceivable that some/many/most phones produce magnetometer readings that deviate from those observed on my Nexus 6P (either showing up on axes besides Input.compass.rawVector’s X & Y, having different signs, different magnitudes, or different relative magnitudes… i.e., instead of X & Y increasing by approximately the same amount, one might increase by half as much).
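To make the sampling problem concrete, here’s a stripped-down sketch of the windowed approach described above. The thresholds are the Nexus 6P numbers I quoted (~0.5-2 units of idle jitter, ~5-10 per press) and will almost certainly need re-tuning for any other phone/headset combo:

```csharp
using UnityEngine;

public class MagnetPressSketch : MonoBehaviour
{
    const int WindowFrames = 9;        // ~7-9 frames to catch a slower press
    const float PressThreshold = 5f;   // combined X+Y swing that counts as a press
    readonly float[] samples = new float[WindowFrames];
    int head;
    int filled;

    void Start()
    {
        Input.compass.enabled = true;  // the magnetometer is off by default
    }

    void Update()
    {
        // On my phone both axes rise together, so track their sum.
        Vector3 raw = Input.compass.rawVector;
        samples[head] = raw.x + raw.y;
        head = (head + 1) % WindowFrames;

        // Don't evaluate until the window has real data in it.
        if (filled < WindowFrames) { filled++; return; }

        float min = float.MaxValue, max = float.MinValue;
        foreach (float s in samples)
        {
            if (s < min) min = s;
            if (s > max) max = s;
        }

        // If the swing across the window dwarfs the idle jitter, call it a press.
        if (max - min > PressThreshold)
            Debug.Log("Possible magnetic trigger press");
    }
}
```

Note this naive version can’t distinguish a press from the user turning their head near a metal beam; that’s exactly the hard part discussed above.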

Specific problems I encountered while getting it to work for my own project:

  • I tried https://github.com/CaseyB/UnityCardboardTrigger , but couldn’t get it to detect button-presses with my phone & headset. My hunch is that his code probably works perfectly on HIS phone & headset, but MY phone & headset produces values that deviate by too much from the ones HE observed when getting it to work on HIS phone & headset. From what I’ve read, the Nexus 6P was ALWAYS kind of flaky when it came to detecting magnetic button presses on a Cardboard1.0 headset using Google’s own library, so this is likely to be the case.
  • This implies that a general-purpose algorithm that supports random combinations of phones & headsets is likely to be VERY hard to achieve… and damn-near IMPOSSIBLE to achieve in a way that doesn’t require fairly tedious initial setup and calibration by the user, just to learn how that user’s specific phone+headset combo behaves. I spent three days grappling with the problem, attempting to come to terms with extended Kalman filters, and generally going in circles before deciding that I was spending way too much time on a REALLY hard problem whose difficulty (esp. supporting phones/headsets that deviate from my own, which I can observe directly & find values for empirically) largely outweighs its benefits.

Long story short… if you have a VERY solid grasp of linear algebra & digital signal processing, already know how to properly implement Kalman filters, and have a good understanding of how the magnetometer, gyro, and accelerometer on Android devices work… you MIGHT be able to cobble together code that works on YOUR phone in a few hours, works on MANY phones in a day or so, and has an entire user-friendly calibration UI and workflow in about a week. Otherwise, you’re going to beat yourself up for days, then end up settling for something that seems to work “ok” on your phone & you know almost beyond doubt won’t work on anybody else’s gear.

I’ll probably post my code to GitHub eventually once I get it cleaned up a little better, but in the meantime I’ll post what I have & some quick directions for using it in the next post.

Here’s the code I’m currently using to read the magnetic trigger on my RIYU2 headset with Nexus 6P (LineageOS 15).

Known issues:

  • My RIYU2 headset’s magnet is on the left, and moving it down causes the X and Y components of the Vector3 returned by Input.compass.rawVector to increase by approximately 5-8 units, in approximately equal amounts.
  • At the moment, there’s no logic to sanity-check the presence of a gyro or calibrate it for other devices. If you have a Nexus 6P and a Cardboard1.0-type headset with magnetic trigger on the left side & your phone is held in landscape orientation with the portrait-top facing left and portrait-bottom facing right, it might work. Any other combination of hardware will probably require tweaking the threshold values, and might not work at all.
  • Do NOT assume that a call to onMagneticTriggerPress() will eventually be followed by a call to onMagneticTriggerRelease() before the next call to onMagneticTriggerPress(), or vice-versa. Magnetic triggering is temperamental. Sometimes, it just won’t register & will have to be repeated. A better algorithm might help, but that’s beyond my time and abilities at the moment.
  • The logic is REASONABLY resistant to false-triggering by diagonal head motion between the lower-left and upper-right… but it’s not 100%. If you’re using the trigger to fire off some activity with serious consequences, make it a two-step confirmation so a single spurious click can’t trigger an unwanted action with lasting consequences.
  • The triggering or releasing motion has to complete within 10 frames, and must be gentle enough to avoid registering as phone motion. That’s why the example program first turns the sphere cyan… a color never used thereafter. If you launch the sample script, see the cyan sphere, and CAN’T turn it green by pressing the magnet down then releasing it and allowing it to quickly snap back, it means you’re either pressing the magnet down too slowly, or you’re causing too much vibration & the press is being ignored.
  • By extension, if you trigger a reading and want to get the magnet back into the right position to be consistent with it, moving the magnet SLOWLY shouldn’t trigger a press or release event.
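Regarding the two-step confirmation suggestion above, something like this is what I have in mind. `onMagneticTriggerPress()` here just stands in for however your trigger class delivers press events; the 2-second window is an arbitrary value I picked for illustration:

```csharp
using UnityEngine;

public class ConfirmedAction : MonoBehaviour
{
    const float ConfirmWindow = 2f;  // seconds allowed between the two presses
    float firstPressTime = -1f;      // -1 means "not armed"

    public void onMagneticTriggerPress()
    {
        if (firstPressTime >= 0f && Time.time - firstPressTime <= ConfirmWindow)
        {
            // Second press arrived in time -- now it's safe to act.
            firstPressTime = -1f;
            Debug.Log("Confirmed -- performing the irreversible action");
        }
        else
        {
            // First press (or the window expired): arm and ask for confirmation.
            firstPressTime = Time.time;
            Debug.Log("Press again within 2 seconds to confirm");
        }
    }
}
```

This way a single spurious trigger event just arms the confirmation and then harmlessly times out.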

Ideas for improvement:

  • Along with magnetometer rawVector readings, log gyro and accelerometer readings in parallel ring buffers. When a potential down/release event is detected, calculate the present attitude and the attitude during the frame where the button event first began & compare them.
  • Implement a better algorithm, like an unscented Kalman filter. Probably way beyond what I can do right now in any reasonable amount of time, but something I might revisit someday if I still care about solving the problem & have the benefit of new knowledge gained in the meantime.
  • Attempt to detect finger-taps on the edge or top using the accelerometer and gyro alone, without using the magnetic trigger. My hunch is that most Android devices have accelerometers that are too slow to detect the impulse, have kernel drivers that won’t allow you to sample quickly enough, or depend on API calls that mangle the event data too much. Otherwise, I suspect Google would have gone with the “tap the side of the cardboard viewer & detect the shock” approach in the first place, because it seems like such an obvious alternative.
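The first idea (parallel ring buffers) might be sketched like this. The buffer size is a guess (~half a second at 60fps), and the attitude-comparison step itself is deliberately left out, since that’s the hard part:

```csharp
using UnityEngine;

public class SensorRingBuffers : MonoBehaviour
{
    const int Capacity = 32;  // ~half a second of history at 60fps
    readonly Vector3[] mag = new Vector3[Capacity];
    readonly Vector3[] gyro = new Vector3[Capacity];
    readonly Vector3[] accel = new Vector3[Capacity];
    int head;

    void Start()
    {
        Input.compass.enabled = true;
        Input.gyro.enabled = true;  // harmless on phones without a discrete gyro
    }

    void Update()
    {
        // Log all three sensors every frame, in lockstep, so a detector can
        // later compare the attitude "now" against the attitude during the
        // frame where a candidate button event first began.
        mag[head] = Input.compass.rawVector;
        gyro[head] = Input.gyro.rotationRateUnbiased;
        accel[head] = Input.acceleration;
        head = (head + 1) % Capacity;
    }

    // Read a sample n frames back; n == 0 is the most recent frame.
    public Vector3 MagFramesAgo(int n)
    {
        return mag[(head - 1 - n + Capacity) % Capacity];
    }
}
```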

How to use:

  • Attach MagneticClickKludger.cs to a scene object
  • Look at MainScript.cs to see how I used it. MainScript itself is given a reference to a GameObject in the scene (a 3D sphere, in my case) & changes its primary color depending upon whether the button is pressed, released, or appears to be in some anomalous error state that will probably require more work on your part to resolve.

3850411–325675–MagneticClickKludger.cs (2.72 KB)
3850411–325678–MainScript.cs (1.76 KB)

Here’s an updated version of the class formerly known as MagneticClickKludger (now “GcbTrigger”) that adds support for Cardboard 2.0-type capacitive buttons and gamepad buttons (the mouse button in GvrEditorEmulator is a freebie bonus of supporting gamepad buttons; see the Javadoc-like comment for GcbTrigger.analyzeGamepad()).

Note that I haven’t rewritten MainScript.cs to account for the name change, so you’ll have to either rename GcbTrigger back to MagneticClickKludger, or update MainScript.cs to refer to GcbTrigger instead.

Other additions: public booleans that let you disable support for the magnetic button, gamepad buttons, and/or capacitive buttons, plus a string field to set the name you’ve assigned to the button inputs in InputManager (see the Javadoc-like comment in the class for more details).

3852667–326029–GcbTrigger.cs (5.14 KB)