Android: Compass Implementation - Calculating the Azimuth

With this post I want to sum up some techniques for a simple compass developed for Android. While implementing an app that uses the azimuth value to determine the directional orientation relative to a target, I tried out multiple implementations. I do not want to present you a complete app, but rather show you the different implementations of the sensors and the data-gathering algorithms. Some of them are enhanced with a low-pass or queue-average filter. You should also note that the azimuth values depend on the device's alignment: if the device is not held flat (more than about 45° of tilt) you have to use remapCoordinateSystem() appropriately to get correct results.

Compass-relevant Android Sensors

Since Android 2.3 (Gingerbread) the platform supports several new sensor types. Using these sensors in combination enables developers to track three-dimensional device motion and orientation changes with high accuracy and precision. The following table is based on the Android developer guide.

Suitable sensors for calculating the orientation:

Sensor API Name         Sensor Type
TYPE_ORIENTATION        Sensor Fusion - Deprecated
TYPE_ACCELEROMETER      Hardware
TYPE_MAGNETIC_FIELD     Hardware
TYPE_GYROSCOPE          Hardware
TYPE_GRAVITY            Sensor Fusion / Hardware
TYPE_ROTATION_VECTOR    Sensor Fusion / Hardware

Gather orientation data using low-level sensors

Calculating the orientation from the low-level hardware sensors is straightforward: register listeners for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD and process the measured data.
private int mAzimuth = 0; // degree

private SensorManager mSensorManager = null;

private Sensor mAccelerometer;
private Sensor mMagnetometer;

boolean haveAccelerometer = false;
boolean haveMagnetometer = false;

private SensorEventListener mSensorEventListener = new SensorEventListener() {
	
	float[] gData = new float[3]; // accelerometer
	float[] mData = new float[3]; // magnetometer
	float[] rMat = new float[9];
	float[] iMat = new float[9];
	float[] orientation = new float[3];
	
	public void onAccuracyChanged( Sensor sensor, int accuracy ) {}

	@Override
	public void onSensorChanged( SensorEvent event ) {
		switch ( event.sensor.getType() ) {
			case Sensor.TYPE_ACCELEROMETER:
				gData = event.values.clone();
				break;
			case Sensor.TYPE_MAGNETIC_FIELD:
				mData = event.values.clone();
				break;
			default: return;
		}

		if ( SensorManager.getRotationMatrix( rMat, iMat, gData, mData ) ) {
			mAzimuth = (int) ( Math.toDegrees( SensorManager.getOrientation( rMat, orientation )[0] ) + 360 ) % 360;
		}
	}
};

@Override
protected void onCreate( Bundle savedInstanceState ) {
	super.onCreate( savedInstanceState );
	this.mSensorManager = (SensorManager) getSystemService( Context.SENSOR_SERVICE );

	this.mAccelerometer = this.mSensorManager.getDefaultSensor( Sensor.TYPE_ACCELEROMETER );
	this.haveAccelerometer = this.mSensorManager.registerListener( mSensorEventListener, this.mAccelerometer, SensorManager.SENSOR_DELAY_GAME );

	this.mMagnetometer = this.mSensorManager.getDefaultSensor( Sensor.TYPE_MAGNETIC_FIELD );
	this.haveMagnetometer = this.mSensorManager.registerListener( mSensorEventListener, this.mMagnetometer, SensorManager.SENSOR_DELAY_GAME );

	if ( haveAccelerometer && haveMagnetometer ) {
		// ready to go
	} else {
		// unregister and stop
	}

}

Enhance the low-level implementation

Because these values come straight from the hardware they include heavy measurement noise. This noise makes our azimuth value unstable and unusable, so we have to improve it with some proper techniques. On the one hand we can (low-pass) filter the accelerometer values; make sure to determine a good alpha value, since a value that is too low causes serious lag. On the other hand we can keep a history, or rather a queue, of rotation matrices and calculate their average. Furthermore, we can make the filter (and queue) adaptive: while the device stays in position, the algorithm uses strong filtering (a small alpha or a long queue) to reduce the noise; as soon as it recognises significant movement in one direction, it weakens the filtering to follow the movement faster. Another starting point for improvement is to use the gyroscope jointly with the accelerometer and the magnetic field sensor. Because the gyroscope has a very good response time, it can dramatically improve the measurement.
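A minimal sketch of such a low-pass filter could look like the following. The class name and the alpha value are purely illustrative (not part of any Android API); the idea is simply to blend each new raw sample into a running estimate.

```java
// Minimal low-pass filter sketch for raw sensor values (illustrative, not an Android API).
// A small ALPHA filters strongly but lags; a large ALPHA reacts fast but stays noisy.
class LowPassFilter {

	private static final float ALPHA = 0.1f; // tune this for your device

	private float[] filtered; // last filtered values, lazily initialised

	// Blend the new raw sample into the running filtered estimate.
	public float[] filter( float[] raw ) {
		if ( filtered == null ) {
			filtered = raw.clone();
			return filtered;
		}
		for ( int i = 0; i < raw.length; i++ ) {
			filtered[i] = filtered[i] + ALPHA * ( raw[i] - filtered[i] );
		}
		return filtered;
	}
}
```

In onSensorChanged() you would then write something like gData = filter.filter( event.values.clone() ); instead of assigning the raw values directly.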

Gather orientation data using high-level sensors (sensor-fusion)

Sensor fusion is the technique of creating software-driven (virtual) sensors that combine the input of several physical sensors. The data measured by such fused sensors is in many ways better than what the individual sources could deliver on their own: better can mean more accurate, more complete or more reliable. Android ships with some sensors that do not build upon a single hardware sensor. These included sensors are fully implemented, stable and well tested, so you are usually better off using them than implementing your own sensor-fusion algorithms.
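To illustrate the fusion idea, here is a hypothetical complementary-filter sketch that combines a gyroscope-integrated azimuth (fast, but drifting) with the accelerometer/magnetometer azimuth (slow, but drift-free). The class, the weight and the update signature are my own assumptions for demonstration, not an Android API; the platform's fused sensors do something considerably more sophisticated.

```java
// Illustrative complementary filter for the azimuth (not an Android API).
// The gyroscope rate is integrated for short-term responsiveness, and the
// result is slowly pulled towards the magnetometer-based azimuth to cancel drift.
class AzimuthFusion {

	private static final float GYRO_WEIGHT = 0.98f; // assumption: trust the gyro short-term

	private float fusedAzimuth = 0f; // degrees, 0..360

	public float update( float gyroRateDegPerSec, float dtSeconds, float magAzimuthDeg ) {
		// integrate the gyroscope rate for the short-term estimate
		float gyroEstimate = ( fusedAzimuth + gyroRateDegPerSec * dtSeconds + 360f ) % 360f;
		// shortest signed angular difference to the magnetometer azimuth (handles 0/360 wraparound)
		float diff = ( ( magAzimuthDeg - gyroEstimate + 540f ) % 360f ) - 180f;
		// pull the gyro estimate gently towards the drift-free reference
		fusedAzimuth = ( gyroEstimate + ( 1f - GYRO_WEIGHT ) * diff + 360f ) % 360f;
		return fusedAzimuth;
	}
}
```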

Enhance the low-level implementation using the TYPE_GRAVITY sensor

In the previous section we used low-level hardware sensors to calculate the azimuth. With the magnetic field sensor and the accelerometer we are able to determine the orientation. Unfortunately, if your phone undergoes any linear acceleration, or if there is any magnetic interference, the measured values get noisy. As shown above, additionally using a gyroscope would dramatically improve the response time. We also (low-pass) filtered the accelerometer values because they are not very exact: what this low-pass filter tries to do is isolate the gravity component of the accelerometer signal. Instead of doing this yourself, you can check whether the TYPE_GRAVITY sensor is available on the device and use it in place of TYPE_ACCELEROMETER.
private int mAzimuth = 0; // degree

private SensorManager mSensorManager = null;

private Sensor mGravity;
private Sensor mAccelerometer;
private Sensor mMagnetometer;

boolean haveGravity = false;
boolean haveAccelerometer = false;
boolean haveMagnetometer = false;

private SensorEventListener mSensorEventListener = new SensorEventListener() {
	
	float[] gData = new float[3]; // gravity or accelerometer
	float[] mData = new float[3]; // magnetometer
	float[] rMat = new float[9];
	float[] iMat = new float[9];
	float[] orientation = new float[3];
	
	public void onAccuracyChanged( Sensor sensor, int accuracy ) {}

	@Override
	public void onSensorChanged( SensorEvent event ) {
		switch ( event.sensor.getType() ) {
			case Sensor.TYPE_GRAVITY:
			case Sensor.TYPE_ACCELEROMETER:
				gData = event.values.clone();
				break;
			case Sensor.TYPE_MAGNETIC_FIELD:
				mData = event.values.clone();
				break;
			default: return;
		}

		if ( SensorManager.getRotationMatrix( rMat, iMat, gData, mData ) ) {
			mAzimuth = (int) ( Math.toDegrees( SensorManager.getOrientation( rMat, orientation )[0] ) + 360 ) % 360;
		}
	}
};

@Override
protected void onCreate( Bundle savedInstanceState ) {
	super.onCreate( savedInstanceState );
	this.mSensorManager = (SensorManager) getSystemService( Context.SENSOR_SERVICE );

	this.mGravity = this.mSensorManager.getDefaultSensor( Sensor.TYPE_GRAVITY );
	this.haveGravity = this.mSensorManager.registerListener( mSensorEventListener, this.mGravity, SensorManager.SENSOR_DELAY_GAME );

	this.mAccelerometer = this.mSensorManager.getDefaultSensor( Sensor.TYPE_ACCELEROMETER );
	this.haveAccelerometer = this.mSensorManager.registerListener( mSensorEventListener, this.mAccelerometer, SensorManager.SENSOR_DELAY_GAME );

	this.mMagnetometer = this.mSensorManager.getDefaultSensor( Sensor.TYPE_MAGNETIC_FIELD );
	this.haveMagnetometer = this.mSensorManager.registerListener( mSensorEventListener, this.mMagnetometer, SensorManager.SENSOR_DELAY_GAME );
	
	// if there is a gravity sensor we do not need the accelerometer
	if( this.haveGravity ) 
		this.mSensorManager.unregisterListener( this.mSensorEventListener, this.mAccelerometer );
	
	if ( ( haveGravity || haveAccelerometer ) && haveMagnetometer ) {
		// ready to go
	} else {
		// unregister and stop
	}

}

Using the high-level sensor-fusion TYPE_ROTATION_VECTOR

The TYPE_ROTATION_VECTOR sensor is the jack of all trades for measuring the device's orientation. Its data is stable and it has a great response time. It uses the accelerometer, the gyroscope and the magnetometer where available; the fusion needs to orient itself initially and then eliminates the drift that the gyroscope accumulates over time. More information on the rotation vector sensor can be found in the Android developer guide.
private int mAzimuth = 0; // degree

private SensorManager mSensorManager = null;

private SensorEventListener mSensorEventListener = new SensorEventListener() {
	
	float[] orientation = new float[3];
	float[] rMat = new float[9];

	public void onAccuracyChanged( Sensor sensor, int accuracy ) {}

	@Override
	public void onSensorChanged( SensorEvent event ) {
		if( event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR ){
			// calculate the rotation matrix
			SensorManager.getRotationMatrixFromVector( rMat, event.values );
			// get the azimuth value (orientation[0]) in degree
			mAzimuth = (int) ( Math.toDegrees( SensorManager.getOrientation( rMat, orientation )[0] ) + 360 ) % 360;
		}
	}
};

@Override
protected void onCreate( Bundle savedInstanceState ) {
	super.onCreate( savedInstanceState );
	mSensorManager = (SensorManager) getSystemService( Context.SENSOR_SERVICE );

	Sensor rotationVector = mSensorManager.getDefaultSensor( Sensor.TYPE_ROTATION_VECTOR );
	mSensorManager.registerListener( mSensorEventListener, rotationVector, SensorManager.SENSOR_DELAY_GAME );
}

Backwards compatibility with TYPE_ORIENTATION

The TYPE_ORIENTATION sensor has been deprecated since Android 2.2 Froyo (API level 8) but is still usable on Android 2.3 Gingerbread (API level 9) and below. It is an easy-to-use way of supporting devices with an Android version below Gingerbread. Internally, however, it does nothing other than combine TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD with some filtering, as shown above. Be aware that it returns the azimuth directly in degrees.
private int mAzimuth = 0; // degree

private SensorManager mSensorManager = null;
private Sensor mOrientation;
private boolean mHaveOrientation = false;

private SensorEventListener mSensorEventListener = new SensorEventListener() {
	
	public void onAccuracyChanged( Sensor sensor, int accuracy ) {}

	@Override
	public void onSensorChanged( SensorEvent event ) {
		if( event.sensor.getType() == Sensor.TYPE_ORIENTATION ){

			mAzimuth = (int) ( event.values[0] + 360 ) % 360;
		}
	}
};

@Override
protected void onCreate( Bundle savedInstanceState ) {
	super.onCreate( savedInstanceState );
	mSensorManager = (SensorManager) getSystemService( Context.SENSOR_SERVICE );

	// check the Android version
	if ( Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD ) {
		// enhanced compass implementation for Android 2.3+ here
	} else {
		mOrientation = mSensorManager.getDefaultSensor( Sensor.TYPE_ORIENTATION);
		mHaveOrientation = mSensorManager.registerListener( mSensorEventListener, mOrientation, SensorManager.SENSOR_DELAY_NORMAL );
	}
}

Video Resources

There is a video available from Google Tech Talks 2010. It introduces the mechanics of several sensors and is very helpful for gaining a deeper understanding of how some of them work.