Camera gyro control

Good day everybody.
I’m in the first steps of creating an Android app that uses the device’s physical location and orientation to place the user in the correct spot and alignment in my virtual environment. What I’m currently trying to achieve is a camera inside a spherical panorama, with the camera orientation changing as the user moves his device. The user experience should basically be that the device acts as a “window” showing the virtual environment from the physical spot where the user is standing.

As a reference, I would like to achieve something like the behavior of these apps:
Android - https://play.google.com/store/apps/details?id=com.bezine.panosphere&hl=en
iOS - iPano on the App Store

So what I’m asking is: first of all, do you think it’s even possible in Unity? Do the gyro and compass inputs work well enough? Has anyone created something similar who could offer some assistance (perhaps for a small fee)? And what troubles do you think I should anticipate in the development process?

Any help will be greatly appreciated,
Many thanks and good day

Hello! I know this is kinda late, but I just found a blog post that covers the basics of getting such a camera working: link to post. It seems to work ok on my S3, but is definitely not perfect in this state.

I’m working toward a similar goal, except that I want my users to be able to walk around, with their position in the virtual environment updating accordingly - instead of a series of spherical panoramas that they walk between (which it sounds like you are describing, but I may be wrong?), I just have regular 3D virtual environments in which the user can walk anywhere.

My first project was outdoors, using an MSI WindPad 110W tablet running OpenSim, with a u-blox MAX-6 GPS and an HMC6343 magnetometer+accelerometer. The first problem was the accuracy of the GPS: it wasn’t good enough for free movement, so the user pretty much had to walk between spherical panoramas. GPS accuracy restricts how close any two spherical panoramas can be to each other, and how close they can be to any real-world content they relate to. The second problem was the update frequency of the magnetometer+accelerometer: the HMC6343 only gave me 10Hz, which resulted in either very jerky movement if I didn’t smooth it or very delayed movement if I did smooth it.
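
For what it’s worth, the usual way to do that kind of smoothing is a simple exponential low-pass filter - here’s a minimal Java sketch of the idea (illustrative only, not my exact code; the class name and alpha value are made up for demonstration):

	// Illustrative exponential low-pass filter for smoothing sensor readings.
	// alpha close to 1 follows the sensor quickly but stays jerky;
	// alpha close to 0 is smooth but noticeably delayed - the tradeoff above.
	public class LowPassFilter {
		private final float alpha;
		private float[] state;

		public LowPassFilter(float alpha) {
			this.alpha = alpha;
		}

		public float[] filter(float[] input) {
			if (state == null) {
				state = input.clone(); // seed with the first sample
			} else {
				for (int i = 0; i < input.length; i++) {
					state[i] += alpha * (input[i] - state[i]);
				}
			}
			return state.clone();
		}
	}

Note that filtering raw angles like this misbehaves where an angle wraps from +180° to -180°, so real code has to handle that wrap-around - and at 10Hz no filter setting can hide the ~100ms gap between samples.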

My current project is indoors, using Unity running on a laptop in a bag, IndoorAtlas indoor positioning tech to track position, and an Oculus Rift as the screen and tracker. This works much better, as IndoorAtlas gives much better accuracy indoors than GPS does outdoors, and the Rift’s tracker is extremely fast.

If your plan is to use GPS, then I think my take-home message to you is to appreciate that its accuracy will determine how close any of your panoramas can be to each other, and how close they can be to the real-world content they relate to. Whilst modern tablet/smartphone orientation sensors are nothing compared to the Rift’s tracker, they are a helluva lot better than the 10Hz HMC6343 that I used in my first project, so I think orientation won’t be much of an issue for you.

I wrote a custom plugin to do this for my app Guns AR after messing around with Vuforia and not being too happy with the results. Here’s what I did:

	import android.app.Activity;
	import android.content.Context;
	import android.hardware.Sensor;
	import android.hardware.SensorEvent;
	import android.hardware.SensorEventListener;
	import android.hardware.SensorManager;

	// Sketch of the plugin class (the class name here is illustrative); in a
	// Unity plugin this would typically sit in a subclass of UnityPlayerActivity.
	public class OrientationPlugin extends Activity {

		private SensorManager mSensorManager;
		private Sensor mSensor;
		// Orientation angles in degrees: azimuth, pitch, roll.
		private float[] cameraData = new float[3];
		private float[] mMatrix = new float[16];
		private float[] orientation = new float[3];

		@Override
		public void onDestroy() {
			super.onDestroy();
			mSensorManager.unregisterListener(mySensorEventListener);
		}

		public void startSensor() {
			mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
			mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
			mSensorManager.registerListener(mySensorEventListener, mSensor,
					SensorManager.SENSOR_DELAY_GAME);
		}

		public void killSensor() {
			mSensorManager.unregisterListener(mySensorEventListener);
		}

		private SensorEventListener mySensorEventListener = new SensorEventListener() {

			@Override
			public void onSensorChanged(SensorEvent event) {
				if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
					// event.values can hold 4 or 5 elements on newer devices,
					// so work on a copy instead of rebinding the 3-element
					// cameraData to it.
					float[] rotationVector = event.values.clone();
					SensorManager.getRotationMatrixFromVector(mMatrix, rotationVector);
					SensorManager.getOrientation(mMatrix, orientation);

					// getOrientation returns radians; convert to degrees.
					cameraData[0] = (float) Math.toDegrees(orientation[0]); // azimuth
					cameraData[1] = (float) Math.toDegrees(orientation[1]); // pitch
					cameraData[2] = (float) Math.toDegrees(orientation[2]); // roll
				}
			}

			@Override
			public void onAccuracyChanged(Sensor sensor, int accuracy) {
				// Not needed here.
			}
		};

		public float[] getOrientation() {
			return cameraData;
		}
	}

The cameraData array holds the orientation angles (azimuth, pitch, roll, in degrees) to apply to your Unity camera - just call getOrientation from your Unity script. As far as moving around goes, GPS and the accelerometer just aren’t accurate enough.
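
One caveat with the snippet above: onSensorChanged fires on a sensor thread while getOrientation is polled from Unity’s side, so in theory a caller can read a half-updated set of angles. A quick sketch of a safer accessor (a suggested tweak, not something the plugin strictly needs):

	// Suggested tweak: return a copy under a lock so the caller never sees
	// cameraData mid-update.
	public float[] getOrientation() {
		synchronized (this) {
			return cameraData.clone();
		}
	}

The matching change is to wrap the three cameraData writes in onSensorChanged in a block synchronized on the same object (e.g. synchronized (OrientationPlugin.this)).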

Hi guys, I’m actually coming at this as a client of a software developer, so please excuse any glaring mistakes I make.

We are very close to releasing a major update to our business-to-business AR app that we use for virtual prototyping. Vuforia has been great at providing a solid solution for typical 3D ‘pop-up’ and marker tracking, and we have no issues on that side at all. Part of the new update, however, is a Unity scene that is essentially a camera inside a sphere, looking at a spherical JPG texture. Our problems are in the movement of this camera using the device sensors.

Our developer has been working really hard on trying to make the experience match the one they have achieved on iOS, but it’s currently a bit like you are drunk, and it loses tracking very quickly.

Can you guys help? I have sent the developers a link to this thread and asked them to engage - can you share your learnings with them to help us?

Cheers