In my previous article I tried to explain simple gesture detection in Android using GestureDetector. In this article I will explain complex gesture detection using GestureOverlayView.
Android 1.6 and later include a new package, android.gesture, which is used for complex gesture recognition. This package includes APIs to store, load, draw and recognize gestures. We can define our own pre-defined patterns in our application, store these gestures in a file, and later use this file to recognize them.
Gestures Builder application
There is a handy sample application, Gestures Builder, which ships with Android 1.6 and higher and comes pre-installed on 1.6 and higher emulators. Here is a screenshot of the application:
Using this application we can create our gesture library and save it to the SD card. Once the file is created, we can include it in our application in the /res/raw folder.
Loading a gesture library
To load the gesture file, we use the GestureLibraries class. This class has methods to load a library from a raw resource, an SD card file or a private file:
static GestureLibrary fromFile(String path)
static GestureLibrary fromFile(File path)
static GestureLibrary fromPrivateFile(Context context, String name)
static GestureLibrary fromRawResource(Context context, int resourceId)
All these methods return a GestureLibrary object. This class is used to read gesture entries from a file, save gesture entries to a file, recognize gestures, etc. Once GestureLibraries returns the GestureLibrary corresponding to the specified file, we read all the gesture entries using the GestureLibrary.load method.
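Loading a bundled library might look like the sketch below. The resource name gestures and the activity layout are assumptions for illustration; this requires the Android framework and is not runnable as plain Java.

```java
import android.app.Activity;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.os.Bundle;
import android.util.Log;

public class GestureDemoActivity extends Activity {
    private GestureLibrary gestureLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main); // assumed layout resource

        // Load the gesture file saved by Gestures Builder from /res/raw/gestures
        gestureLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!gestureLibrary.load()) {
            // load() returns false when the entries could not be read
            Log.e("GestureDemo", "Could not load gesture library");
            finish();
        }
    }
}
```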
Drawing and recognizing a gesture
To draw and recognize gestures, we use the GestureOverlayView class. This view extends FrameLayout, i.e. we can use it inside any other layout or use it as a parent layout containing other child views. It acts as an overlay on which the user can draw gestures. The view uses three callback interfaces to report the actions performed:
interface GestureOverlayView.OnGestureListener
interface GestureOverlayView.OnGesturePerformedListener
interface GestureOverlayView.OnGesturingListener
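These listeners are attached to the overlay with the corresponding add* methods, for example inside Activity.onCreate. The view id gestureOverlay is an assumption about the layout; in most applications only the OnGesturePerformedListener is needed.

```java
// Assumes this code runs inside an Activity whose layout contains a
// GestureOverlayView with the (hypothetical) id gestureOverlay.
GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestureOverlay);

overlay.addOnGesturePerformedListener(new GestureOverlayView.OnGesturePerformedListener() {
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Called once the drawn gesture has been processed by the overlay
    }
});

overlay.addOnGesturingListener(new GestureOverlayView.OnGesturingListener() {
    public void onGesturingStarted(GestureOverlayView overlay) {
        // The user has started drawing a gesture
    }
    public void onGesturingEnded(GestureOverlayView overlay) {
        // The user has lifted the finger
    }
});
```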
The GestureOverlayView.OnGestureListener callback interface handles gesture operations at a low level. This interface has the following methods:
void onGestureStarted(GestureOverlayView overlay, MotionEvent event)
void onGesture(GestureOverlayView overlay, MotionEvent event)
void onGestureEnded(GestureOverlayView overlay, MotionEvent event)
void onGestureCancelled(GestureOverlayView overlay, MotionEvent event)
All these methods take two parameters, the GestureOverlayView overlay and the MotionEvent that occurred.
The GestureOverlayView.OnGesturingListener callback interface is used to find out when gesturing starts and ends. The interface has the following methods:
void onGesturingStarted(GestureOverlayView overlay)
void onGesturingEnded(GestureOverlayView overlay)
The onGesturingStarted method is called when the gesture action starts and onGesturingEnded is called when it ends. Both methods receive the GestureOverlayView in use.
The most important one is the GestureOverlayView.OnGesturePerformedListener interface. It has only one method:
void onGesturePerformed(GestureOverlayView overlay, Gesture gesture)
This method is called after the user has performed a gesture and the GestureOverlayView has processed it. The first parameter is the overlay view in use and the second is a Gesture object representing the gesture the user drew. The Gesture class represents a hand-drawn shape made up of one or more strokes; each stroke is a series of points. The GestureLibrary class uses this class to recognize gestures.
To recognize a gesture, we use the GestureLibrary.recognize method. This method accepts a Gesture, runs it through the internal recognizers and returns a list of predictions. Each prediction is represented by the Prediction class and contains two member variables, name and score. The name variable holds the name of the gesture and the score variable holds the score given by the gesture recognizer. The score member is used to choose the best matching prediction from the list. One common method is to choose the first prediction whose score is greater than 1.0. Another is to choose the value that falls inside minimum and maximum threshold limits. Choosing these limits depends entirely on the implementation, ranging from simple limits found by trial and error to more complex methods that may learn from the user's input and improve recognition based on it.
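The selection logic itself is plain Java. The sketch below uses a hypothetical stand-in for android.gesture.Prediction (which likewise exposes public name and score members) so the thresholding can be shown in isolation; in a real application the list would come from GestureLibrary.recognize(gesture), which returns predictions ordered by descending score.

```java
import java.util.Arrays;
import java.util.List;

public class PredictionFilter {
    // Hypothetical stand-in for android.gesture.Prediction
    static class Prediction {
        final String name;
        final double score;
        Prediction(String name, double score) {
            this.name = name;
            this.score = score;
        }
    }

    // Returns the name of the first prediction whose score clears the
    // threshold, or null if nothing matches well enough.
    static String bestMatch(List<Prediction> predictions, double minScore) {
        for (Prediction p : predictions) {
            if (p.score > minScore) {
                return p.name;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Prediction> predictions = Arrays.asList(
                new Prediction("A", 3.7),
                new Prediction("B", 0.4));
        System.out.println(bestMatch(predictions, 1.0)); // prints "A"
        System.out.println(bestMatch(predictions, 5.0)); // prints "null"
    }
}
```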
The sample application that accompanies this article includes 5 pre-defined gestures: A, B, C, D and E. When the user draws one of these patterns, the application lists the names and scores of all the predictions.
I hope this article helps you understand complex gesture recognition in Android.