Recently, we were tasked with upgrading the interface of a touch-based mobile application.  The users’ main complaint was that they had to wear heavy gloves and could not touch the screen to interact with it.  The client asked us to add motion gesture recognition to the user interface — specifically, recognition for shaking the phone and for tilting it left, right, up, and down.  We would soon learn that accelerometer-based gesture recognition has no simple solution.

Android smartphones have a variety of possible sensors on board including accelerometers, pressure sensors, temperature gauges, gyroscopes, and magnetometers.  The ideal phone for gesture recognition would have gyroscope and accelerometer sensors, but unlike the iPhone, few Android phones have the gyroscope sensor.  However, most Android phones do have a 3D accelerometer, making it the best choice for broad device support.

Using the accelerometer to detect phone orientation is commonplace in existing apps.  Many apps will change from portrait to landscape mode when the phone is rotated 90 degrees.  Since gravity applies a constant acceleration toward the Earth, the Android platform can determine which way is down based on the accelerometer.  Several apps and games use this information to determine how the phone is tilted.  One of the more popular games that uses tilt detection is My Paper Airplane.

In addition to mobile devices, the Wii and PS3 use accelerometer sensors in their controllers.  The PS3 has several racing and flying games that use the tilt of the controller to perform motions such as steering left or right.  The Wii uses a combination of an accelerometer and an IR camera to control most actions.  The Harry Potter games on the Wii, for example, use accelerometer-based gesture recognition to cast spells.

Armed with the knowledge that accelerometer-based gesture recognition is possible, the first step in gesture recognition on mobile devices is gathering the data from the sensor.  Android Cookbook has an excellent introductory tutorial on gathering accelerometer data to detect shaking of the phone.  The process basically involves setting up an event listener for the accelerometer, analyzing the changes during events, and performing an action based on the analysis.  The shake example can be extended to measure the time between shakes and discard the gesture if too much time has elapsed.
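The core of that approach can be sketched as a plain Java class, independent of the Android `SensorEventListener` plumbing that would feed it samples.  The threshold, gravity constant, and shake count below are illustrative assumptions, not values from the tutorial:

```java
// Sketch of shake detection logic (thresholds and counts are illustrative assumptions).
// An Android SensorEventListener would feed accelerometer samples into onSample().
public class ShakeDetector {
    private static final float SHAKE_THRESHOLD = 12.0f; // m/s^2 beyond gravity, assumed
    private static final long MAX_SHAKE_GAP_MS = 500;   // discard gesture if shakes are too far apart
    private static final double GRAVITY = 9.81;

    private long lastShakeTime = 0;
    private int shakeCount = 0;

    /** Feed each accelerometer sample; returns true when a shake gesture completes. */
    public boolean onSample(float x, float y, float z, long timestampMs) {
        // Acceleration magnitude with gravity's contribution subtracted out
        double magnitude = Math.sqrt(x * x + y * y + z * z) - GRAVITY;
        if (magnitude > SHAKE_THRESHOLD) {
            if (timestampMs - lastShakeTime > MAX_SHAKE_GAP_MS) {
                shakeCount = 0; // too much time elapsed; start the gesture over
            }
            lastShakeTime = timestampMs;
            shakeCount++;
        }
        return shakeCount >= 3; // require a few rapid spikes before reporting a shake
    }
}
```

Keeping the detection logic out of the listener itself makes it easy to unit-test with synthetic samples.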

With shake detection in place, we just needed to determine how to detect tilting the phone left, right, up, or down.  As I mentioned before, several games use tilt detection.  Tilt detection is as simple as detecting which accelerometer axis is experiencing the most force from gravity.  However, a game has the advantage of only using tilt detection after the user has started a level, while a user interface must detect tilt constantly.  Constant detection becomes a problem when the user puts the phone at their side to walk or run, which leads to unintended interface actions.  Also, the human wrist does not rotate the phone to tilt perfectly along the phone's x and y axes.  Since simple tilt detection does not work well for a user interface, a more powerful solution is required.
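The naive dominant-axis approach can be sketched as a pure function.  The axis and sign conventions below assume the Android sensor coordinate system (a phone lying flat and face-up reads roughly (0, 0, +9.81)); the mapping of signs to tilt directions is an assumption for illustration:

```java
// Naive tilt detection: report which axis gravity dominates.
// Assumes Android's sensor coordinate system: +x toward the right edge,
// +y toward the top edge, +z out of the screen. An axis pointing away
// from the ground reads a positive value.
public class TiltDetector {
    public enum Tilt { LEFT, RIGHT, UP, DOWN, FLAT }

    public static Tilt detect(float x, float y, float z) {
        float ax = Math.abs(x), ay = Math.abs(y), az = Math.abs(z);
        if (az >= ax && az >= ay) return Tilt.FLAT;      // gravity mostly along z: lying flat
        if (ax >= ay) return x > 0 ? Tilt.LEFT : Tilt.RIGHT; // gravity along x: left/right edge dipped
        return y > 0 ? Tilt.UP : Tilt.DOWN;              // gravity along y: held upright or inverted
    }
}
```

This illustrates the problem described above: the function always returns *some* answer, even for the messy readings produced by a phone swinging at a walking user's side, which is exactly why it misfires in a constantly-listening user interface.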

Gesture recognition can be as simple as detecting change exceeding a threshold, as in the shake example, but a more robust solution involves pattern recognition algorithms.  Pattern recognition has been applied across computer science, for example in artificial intelligence, speech recognition, and network traffic analysis.  We determined that the best algorithms for our purpose would involve either FastDTW or Hidden Markov Models.  Developing a robust gesture recognition platform from scratch using these algorithms would have exceeded our budget.  Luckily, we found WiiGee, an open source Java gesture recognition library for Wiimotes that uses Hidden Markov Models.

WiiGee has an Android plugin that helps integrate the gesture recognition library with Android devices.  Integration of the WiiGee library was not easy, as there is limited documentation, but it was worth it.  After the library was integrated, we had to train the tilting gestures for WiiGee to recognize.  We found that gestures with longer durations were more easily recognized than quick gestures.  When WiiGee recognizes a gesture, it also reports the probability that the motion matches the trained gesture data.  By only accepting gestures recognized with over 99% probability, we filtered out the false positives and ended up with a gesture recognition system that works well for user interfaces.
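The probability-filtering step can be sketched as a thin wrapper around the recognizer's callback.  Note that the `GestureHandler` interface and `onRecognized` signature below are simplified stand-ins for illustration, not WiiGee's actual API; only the 99% threshold reflects what we did:

```java
// Hypothetical sketch of filtering recognized gestures by match probability.
// WiiGee reports a probability with each recognized gesture; the listener
// interface here is a simplified stand-in, not WiiGee's actual API.
public class GestureFilter {
    private static final double MIN_PROBABILITY = 0.99; // accept near-certain matches only

    /** Hypothetical callback invoked for accepted gestures. */
    public interface GestureHandler {
        void onGesture(int gestureId);
    }

    private final GestureHandler handler;

    public GestureFilter(GestureHandler handler) {
        this.handler = handler;
    }

    /** Called with each recognition result; forwards only high-confidence matches. */
    public boolean onRecognized(int gestureId, double probability) {
        if (probability > MIN_PROBABILITY) {
            handler.onGesture(gestureId);
            return true;
        }
        return false; // likely a false positive; ignore it
    }
}
```

The trade-off is deliberate: a high threshold occasionally rejects a genuine gesture, but for a constantly-listening user interface, missed gestures are far less disruptive than spurious ones.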

Our 3D accelerometer-based gesture recognition system for Android devices evolved from detecting simple changes in acceleration to using the WiiGee library and Hidden Markov Models to match patterns in those changes.  As more Android devices ship with a gyroscope sensor, adding gyroscope data to the gesture models would make the recognition even more accurate.

Like this post? Please share it! Then follow us on Twitter – @thorntech – and sign up for our email list below for future updates.
