Creating AppWidget in Android, part 1

Download the source code of this article.

AppWidgets are small views or widgets that can be embedded in another application. The containing application is called the App Widget host. One example of a host application is the Home Screen application: when you long-press the home screen and select the Widgets option, a list of the available widgets is presented. This is a two-part series on how to create App Widgets. In this part I will stick to the basics of creating a widget. In the second part I will cover more advanced topics.

We can create our own App Widgets by extending the AppWidgetProvider class, which is actually a BroadcastReceiver with the action android.appwidget.action.APPWIDGET_UPDATE. To create an App Widget we extend this class and override the onUpdate method. Then we have to declare the App Widget in the AndroidManifest.xml file using the <receiver> tag. Finally we have to describe the App Widget with an AppWidgetProviderInfo object in an XML file.

Creating AppWidget

To create an App Widget, we create a class extending AppWidgetProvider. The AppWidgetProvider class has the following methods:

void onEnabled(Context context)
void onDisabled(Context context)
void onDeleted(Context context, int[] appWidgetIds)
void onUpdate(Context context,
	AppWidgetManager appWidgetManager, int[] appWidgetIds)
void onReceive(Context context, Intent intent)

The method onEnabled is called when an instance is created for the first time, i.e. when the user adds the widget for the first time. It will not be called for subsequent additions. In this method we can perform any startup tasks.

The method onDisabled is called when the last instance of the widget is deleted. This method is the place to perform any cleanup tasks.

The method onDeleted is called each time an instance of the widget is deleted.

The onUpdate method is first called when the widget is added to the host. After that, this method will be called at the specified time interval. This is the most important method of the App Widget; in it we update the view with the data we want to display. If the data is readily available, we can display it in this method itself. If the data is not available and needs to be fetched, it is better to create a Service which actually fetches the data and updates the widget. This avoids the ANR (Application Not Responding) error. We can update the widget using AppWidgetManager's updateAppWidget() method.
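
A minimal onUpdate() sketch might look like the following. It assumes the widget layout @layout/gpswidget contains a TextView with the id txt_status; both names are illustrative:

@Override
public void onUpdate(Context context,
	AppWidgetManager appWidgetManager, int[] appWidgetIds) {
	for (int appWidgetId : appWidgetIds) {
		// Build the widget view from its layout.
		RemoteViews views = new RemoteViews(context.getPackageName(),
			R.layout.gpswidget);

		// txt_status is an assumed TextView id in the layout.
		views.setTextViewText(R.id.txt_status, "Waiting for location...");

		// Push the updated view to this widget instance.
		appWidgetManager.updateAppWidget(appWidgetId, views);
	}
}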

Declaring the App Widget

We have to declare the App Widget in the AndroidManifest.xml file using the <receiver> tag. As I told you before, the AppWidgetProvider is actually a BroadcastReceiver; anything that can be done using this class can also be done using a BroadcastReceiver. The following snippet declares the GPS App Widget:

<receiver android:name=".GPSWidgetProvider">
	<intent-filter>
		<action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
	</intent-filter>
	<meta-data
		android:name="android.appwidget.provider"
		android:resource="@xml/gpswidgetinfo" />
</receiver>

Everything is the same as declaring a BroadcastReceiver except the meta-data section, where we point to a separate XML file that describes our App Widget.

Describing the App Widget

The App Widget description is specified in a separate XML file. In this XML file we specify the minimum width and height of the widget, the update period, the widget layout, the configuration screen layout, etc. The following snippet describes the GPS App Widget:

<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="294dp"
    android:minHeight="72dp"
    android:updatePeriodMillis="900000"
    android:initialLayout="@layout/gpswidget">
</appwidget-provider>

The minWidth attribute specifies the minimum width of the widget and minHeight specifies its minimum height. The updatePeriodMillis attribute specifies the update period in milliseconds. The initialLayout attribute specifies the widget's layout.

Sample application

The sample application provided is a widget that displays the GPS coordinates of the device. The GPS coordinates are reverse geocoded to get the name of the location, which is displayed if it is available. To reverse geocode the location I used the Geocoder class. The widget is updated every 15 minutes.

Happy App Widget coding! 🙂

A bare minimum web server for the Android platform

Download the source code of this article.

Since today's mobile phones have more computing power, a web server running on a mobile phone can do a lot more. There are lots of Android web server applications out there that you can try, but I decided to create one of my own. We can create an HTTP server using the org.apache.http packages; the only limitation is that we cannot run the server on the default port, i.e. port 80. I think this is a Linux restriction and has nothing to do with the Android platform.

To create a typical web server, first you create a server socket and listen on the desired port. Then you accept the connection and finally process the HTTP request and send the response to the client.

To create a server socket we use the class java.net.ServerSocket. The constructor of this class accepts a port number on which to listen for incoming connections. Once the ServerSocket object is created, we accept incoming connections using the ServerSocket.accept() method. This method blocks, so we create a separate thread to accept and process the incoming connections. The accept() method returns a java.net.Socket object which represents the accepted connection. Once the connection is established, we next want to process the HTTP request. This can be done using the org.apache.http.protocol.HttpService class, which provides a minimal server-side HTTP processor implementation. Following is the HttpService constructor:

HttpService(
	HttpProcessor proc,
	ConnectionReuseStrategy connStrategy,
	HttpResponseFactory responseFactory)

This class is from the Apache HTTP implementation bundled with Android. Explanation of the Apache implementation is out of the scope of this article; for more information please follow this link.

The first parameter, org.apache.http.protocol.HttpProcessor, is an interface that is used to process requests and responses. The class org.apache.http.protocol.BasicHttpProcessor is a default implementation of this interface.

The second parameter, org.apache.http.ConnectionReuseStrategy, determines the connection reuse strategy. There are two different implementations of this interface: org.apache.http.impl.DefaultConnectionReuseStrategy and org.apache.http.impl.NoConnectionReuseStrategy. The first one reuses the connection and the second one never reuses it. In normal cases we use the first one.

The HttpService class relies on the last parameter to create the HTTP response to send back. org.apache.http.HttpResponseFactory is a factory interface for creating HTTP response objects. The class org.apache.http.impl.DefaultHttpResponseFactory is a default implementation of this interface.

To handle different HTTP requests, we use a handler map based on URI patterns. For example, all URIs which start with /message will be handled by one class and URIs which start with /dir will be handled by another. For this purpose we use the org.apache.http.protocol.HttpRequestHandlerRegistry class to create a map based on URI patterns. This class implements org.apache.http.protocol.HttpRequestHandlerResolver, which resolves the handler for a given request. Using HttpRequestHandlerRegistry we can register different URI patterns and the corresponding org.apache.http.protocol.HttpRequestHandler to handle the requests. HttpRequestHandler is an interface whose implementations handle HTTP requests; we can create different HttpRequestHandler classes to handle requests that match particular patterns. To register a URI pattern we use the HttpRequestHandlerRegistry.register() method (an example follows the pattern list below). The syntax of this method is:

public void register (String pattern, HttpRequestHandler handler)

The first parameter is the URI pattern. This can be one of the following:

  • * – handles all requests
  • *<uri> – handles all requests that end with <uri>
  • <uri>* – handles all requests that start with <uri>
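
As an illustration, a minimal handler might look like this sketch (the /message pattern and the response text are assumptions, not the sample application's actual handler):

class MessageHandler implements HttpRequestHandler {
	public void handle(HttpRequest request, HttpResponse response,
		HttpContext context) throws HttpException, IOException {
		// Send back a fixed text body; a real handler would inspect the URI.
		response.setStatusCode(HttpStatus.SC_OK);
		response.setEntity(new StringEntity("Hello from the device!"));
	}
}

// Register the handler for all URIs starting with /message:
HttpRequestHandlerRegistry registry = new HttpRequestHandlerRegistry();
registry.register("/message*", new MessageHandler());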

The HttpService class will determine which handler should be used based on the URI of the received request. To handle the HTTP requests we use the HttpService.handleRequest() method. The syntax of the method is:

public void handleRequest (HttpServerConnection conn, HttpContext context)

The first parameter is org.apache.http.HttpServerConnection, a connection interface for use on the server side. We can use the org.apache.http.impl.DefaultHttpServerConnection class for this purpose, which can be bound to the Socket we received. The second parameter is an org.apache.http.protocol.HttpContext interface; again, the org.apache.http.protocol.DefaultedHttpContext implementation can be used.
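
Putting the pieces together, the accept-and-handle loop might look like the following sketch. It assumes port 8080 and the registry created above, and uses BasicHttpContext as a simple HttpContext implementation; run it on a worker thread because accept() blocks:

BasicHttpProcessor processor = new BasicHttpProcessor();
processor.addInterceptor(new ResponseDate());
processor.addInterceptor(new ResponseContent());
processor.addInterceptor(new ResponseConnControl());

HttpService httpService = new HttpService(processor,
	new DefaultConnectionReuseStrategy(),
	new DefaultHttpResponseFactory());
httpService.setHandlerResolver(registry);

HttpParams params = new BasicHttpParams();
httpService.setParams(params);

try {
	ServerSocket serverSocket = new ServerSocket(8080); // 8080 is an assumed port

	while (true) {
		// accept() blocks until a client connects.
		Socket socket = serverSocket.accept();

		DefaultHttpServerConnection connection =
			new DefaultHttpServerConnection();
		connection.bind(socket, params);

		// Process one request and send the response back.
		httpService.handleRequest(connection, new BasicHttpContext());

		connection.shutdown();
	}
} catch (IOException e) {
	e.printStackTrace();
} catch (HttpException e) {
	e.printStackTrace();
}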

Sample Application

The sample application is a bare minimum web server. The server provides two functions: sending a message to the device and folder/file listing. To access the server we should know the IP address of the device; then we can use the URL http://<deviceip>:<port>, which will present the home page. This provides very minimal functionality; I will be adding more in the future and will post the changes here.

You know, it's very hard to explain a complicated subject in a short article like this, but I hope this gives an introduction to web server implementation on Android.

Android audio recording, part 2

Download source code of this article.

My previous article was about recording audio using the MediaRecorder class. That class is very easy to use and is suitable for quick and simple audio recording, but it saves the audio data in either MPEG-4 or 3GPP compressed format. If you need raw data for audio processing, you cannot use the MediaRecorder class; you have to use the AudioRecord class instead. The AudioRecord class provides the raw data in uncompressed format. You can use this data to write to a file, display a waveform, etc.

To record audio using the AudioRecord class:

  1. Create an instance of AudioRecord
  2. Start recording using startRecording() method
  3. Read uncompressed data using AudioRecord.read() method
  4. Stop recording using stop() method.

Creating AudioRecord instance

The constructor of the AudioRecord class is:

AudioRecord(
	int audioSource, int sampleRateInHz,
	int channelConfig, int audioFormat, int bufferSizeInBytes)

The first parameter is the audio source; this can be one of the AudioSource values. The second parameter is the sample rate in Hz; this can be 44100, 22050, 11025, 8000, etc. The third parameter is the channel configuration; this can be one of the AudioFormat values (normally CHANNEL_IN_MONO or CHANNEL_IN_STEREO). The fourth parameter is the audio encoding format; this can be one of the following values:

  1. ENCODING_PCM_16BIT – 16 bits per sample
  2. ENCODING_PCM_8BIT – 8 bits per sample
  3. ENCODING_DEFAULT – default encoding

The fifth parameter is the buffer size. This should be calculated using the AudioRecord.getMinBufferSize() static method. The syntax of getMinBufferSize() is:

public static int getMinBufferSize (
	int sampleRateInHz, int channelConfig, int audioFormat)

The parameters are the same as those of the constructor. The method calculates the minimum buffer size needed to store one chunk of audio. Anything less than this number will result in a failure while creating the AudioRecord object.

Start Recording

Once we have created an instance of the AudioRecord object, the startRecording() method is used to start recording audio.

Reading uncompressed data

Once recording has started, it is the responsibility of the application to read the audio data and store it for further processing. We can read the audio data using one of the read() methods of the AudioRecord class. The audio data will be in raw PCM_16BIT or PCM_8BIT format depending on how the object was initialized. The application can use this data for further processing of any sort. In our sample application we just save it to a temporary file, and when the recording is stopped we read this data back and write it to a WAV file. The application has to read the data continuously, otherwise the previous chunk of data will be overwritten by the new one.
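
A minimal read loop might look like this sketch; isRecording, the buffer size handling and the output path are assumptions. Note that DataOutputStream writes big-endian values, while WAV files expect little-endian PCM samples, so a real implementation must convert the byte order when producing the WAV file:

short[] buffer = new short[bufferSize / 2]; // 16-bit samples, two bytes each

try {
	DataOutputStream output = new DataOutputStream(
		new FileOutputStream("/sdcard/AudioRecorder/record.tmp"));

	while (isRecording) {
		// read() fills the buffer with the recorded samples.
		int read = recorder.read(buffer, 0, buffer.length);

		for (int i = 0; i < read; i++) {
			output.writeShort(buffer[i]);
		}
	}

	output.close();
} catch (IOException e) {
	e.printStackTrace();
}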

Stop recording

Recording can be stopped using the stop() method. Remember to call the release() method to release the AudioRecord resources when you are done.

The following code snippet shows the typical usage:

int bufferSize = AudioRecord.getMinBufferSize(44100,
	AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);

AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
	44100, AudioFormat.CHANNEL_IN_STEREO,
	AudioFormat.ENCODING_PCM_16BIT, bufferSize);

recorder.startRecording();
.
.
.
recorder.stop();
recorder.release();

Sample application

The sample application of this article records audio and saves it to a WAV file. The WAV file is placed in the “/SDCard/AudioRecorder” folder with the current time in milliseconds as the file name.

Reference

Writing WAV file – RingDroid sample application, RehearsalAssist application

Android audio recording, part 1

Download source code of this article.

In Android we can record audio in two different ways: using the MediaRecorder class or using the AudioRecord class. Using the MediaRecorder class is very easy but gives you less flexibility. The AudioRecord class gives you more flexibility but is a little more complex. This article is about recording audio using MediaRecorder; I will explain the AudioRecord class in another article.

Using the MediaRecorder class we can record audio in two different formats, MediaRecorder.OutputFormat.THREE_GPP or MediaRecorder.OutputFormat.MPEG_4. In both cases the encoder should be MediaRecorder.AudioEncoder.AMR_NB; up to version 2.2, Android does not support other encoders. This link gives the details of the media formats supported in Android.

To record audio using MediaRecorder:

  1. Create an instance of MediaRecorder
  2. Set the audio source using the setAudioSource() method
  3. Set the output format using the setOutputFormat() method
  4. Set the audio encoder using the setAudioEncoder() method
  5. Set the output file path using the setOutputFile() method
  6. Prepare the recorder using the prepare() method
  7. Start recording using the start() method

The following code snippet starts audio recording:

recorder = new MediaRecorder();

recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile("/sdcard/sample.3gp");

recorder.setOnErrorListener(errorListener);
recorder.setOnInfoListener(infoListener);

try {
	recorder.prepare();
	recorder.start();
} catch (IllegalStateException e) {
	e.printStackTrace();
} catch (IOException e) {
	e.printStackTrace();
}

And following code snippet stops recording:

recorder.stop();
recorder.reset();
recorder.release();

recorder = null;

Note that we have to reset and release the MediaRecorder object because it is a limited resource in Android; if it is not released, other applications that use MediaRecorder may not get the resource.

The MediaRecorder class uses event listener interfaces to report any errors or warnings that occur during a recording session. There are two event listeners, MediaRecorder.OnErrorListener and MediaRecorder.OnInfoListener. MediaRecorder.OnErrorListener is used to report errors during recording. It has the following method:

public void onError(MediaRecorder mr, int what, int extra)

This method is called when an error occurs. The first parameter is the MediaRecorder object, the second one is the error code and the third one is extra information about the error.

MediaRecorder.OnInfoListener is used to report warnings. It has the following method:

public void onInfo(MediaRecorder mr, int what, int extra)

This method is called when a warning occurs during recording. The first parameter is the MediaRecorder object, the second one is the warning code and the third one is extra information about the warning.
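
The errorListener and infoListener objects used in the snippet above could be defined like this minimal sketch, which just logs the codes (the log tag is an assumption):

private MediaRecorder.OnErrorListener errorListener = new MediaRecorder.OnErrorListener() {
	public void onError(MediaRecorder mr, int what, int extra) {
		// Log the error code and the extra information.
		Log.e("AudioRecorder", "Error: " + what + ", " + extra);
	}
};

private MediaRecorder.OnInfoListener infoListener = new MediaRecorder.OnInfoListener() {
	public void onInfo(MediaRecorder mr, int what, int extra) {
		// Log the warning code and the extra information.
		Log.i("AudioRecorder", "Warning: " + what + ", " + extra);
	}
};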

The Audio Source can be one of the following values:

  1. MediaRecorder.AudioSource.DEFAULT – default source usually MIC
  2. MediaRecorder.AudioSource.MIC – Microphone
  3. MediaRecorder.AudioSource.VOICE_CALL – Voice call uplink and downlink source
  4. MediaRecorder.AudioSource.VOICE_DOWNLINK – Voice call downlink source
  5. MediaRecorder.AudioSource.VOICE_UPLINK – Voice call uplink source
  6. MediaRecorder.AudioSource.VOICE_RECOGNITION – Usually DEFAULT source

Even though we can set any one of the above audio sources, from my experience only MediaRecorder.AudioSource.MIC works on my Nexus One with Android 2.2; all the others do not work. If anyone can record using other sources, please comment here.

Sample Application

The sample application is a simple one that records audio from the MIC and stores the file in the “/SDCard/AudioRecorder” folder with the current time in milliseconds as the file name. Sample screenshots are shown below:

[Screenshots: Audio Recorder main screen and choose-format screen]

Hope this piece of information is useful to you all.

Creating Android live wallpaper

Download source code of this article.

As you all know, live wallpapers were introduced in Android version 2.1. Live wallpapers are animated backgrounds for your home screen. You can set a live wallpaper by long-pressing the home screen and selecting the Wallpapers->Live Wallpaper menu item. A list of the live wallpapers installed on the system will be shown and you can select one of them to make it active. A live wallpaper is like a normal Android application and can use any feature that a normal application uses, such as MapView, the accelerometer, GPS, etc.

A live wallpaper is implemented as a normal service application in Android. It has two components, WallpaperService and WallpaperService.Engine. The service component is like a normal service but with an extra method, onCreateEngine(). The purpose of this method is to return an instance of the WallpaperService.Engine class, which actually does all the work, including drawing the wallpaper and managing its lifecycle. The WallpaperService calls the onCreateEngine() method when a wallpaper becomes active. A wallpaper can provide a settings screen if needed; the settings screen can be used to configure the wallpaper (in our sample application we provide a settings screen to select the pattern to draw on the wallpaper, see the sample application section).

To create our own wallpaper we create the WallpaperService and override the onCreateEngine method, then declare the wallpaper in the AndroidManifest.xml file. Following is a sample manifest entry:

<application android:label="@string/app_name" android:icon="@drawable/icon">
	<service android:label="@string/wallpaper_pattern"
		android:permission="android.permission.BIND_WALLPAPER"
		android:name="com.varma.samples.patternwallpaper.PatternWallpaper">
		<intent-filter>
			<action android:name="android.service.wallpaper.WallpaperService" />
		</intent-filter>
		<meta-data android:name="android.service.wallpaper"
			android:resource="@xml/patternwallpaper" />
	</service>

	<activity android:label="@string/wallpaper_settings"
		android:name="com.varma.samples.patternwallpaper.PatternWallpaperSettings"
		android:exported="true"/>
</application>

The <service> tag specifies our wallpaper service. The <intent-filter> action should be android.service.wallpaper.WallpaperService. We should create an XML resource that describes our wallpaper service and reference it in the <meta-data> tag. Finally we have to declare the settings activity, if there is one. In addition to this, we have to specify the <uses-sdk> and <uses-feature> tags; the <uses-sdk> tag should specify API level 7 or above, since this feature is only available from level 7 onwards. A sample entry is:

<uses-sdk android:minSdkVersion="7" />
<uses-feature android:name="android.software.live_wallpaper" />

Once we have these entries in place, we create a service extending the WallpaperService class and override the onCreateEngine method. From this method we return our own WallpaperService.Engine implementation. The system provides a surface (via a SurfaceHolder) for drawing purposes. WallpaperService.Engine has the following important methods which we have to override to implement the wallpaper engine:

public void onCreate(SurfaceHolder surfaceHolder) – Called while creating the WallpaperService.Engine object. Note that the system provides the SurfaceHolder implementation.

public void onDestroy() – Called while destroying the object.

public void onOffsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) – Called when the wallpaper offsets change, for example when the user swipes between home screens.

public void onSurfaceChanged(SurfaceHolder holder, int format, int width, int height) – Called when the SurfaceHolder’s surface has changed.

public void onSurfaceCreated(SurfaceHolder holder) – Called when the SurfaceHolder’s surface is created.

public void onSurfaceDestroyed(SurfaceHolder holder) – Called when the SurfaceHolder is destroyed.

public void onVisibilityChanged(boolean visible) – Called when the visibility of the wallpaper changes. The parameter visible is true when the wallpaper is visible, otherwise false. Here we have to start and stop drawing depending on the visibility.

public void onTouchEvent(MotionEvent event) – Called when touch events occur. The home screen also sends commands such as android.wallpaper.tap (when the user taps the home screen) and android.home.drop (when the user drops an icon on the home screen); these are delivered to the engine’s onCommand() method. Our sample application does not use onTouchEvent, but a live wallpaper can use it to implement different effects.
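
Putting this together, a minimal service skeleton might look like the following sketch (the class names follow the sample application; the engine body is left empty):

public class PatternWallpaper extends WallpaperService {
	@Override
	public Engine onCreateEngine() {
		// Return our engine; the system manages its lifecycle.
		return new PatternEngine();
	}

	private class PatternEngine extends Engine {
		@Override
		public void onVisibilityChanged(boolean visible) {
			super.onVisibilityChanged(visible);
			// Start drawing when visible, stop when hidden.
		}
	}
}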

Normally, while creating a wallpaper, we override these methods and start and stop the drawing in onVisibilityChanged. For drawing the wallpaper we create a separate thread or a runnable routine. The following code snippet creates a runnable routine:

private final Runnable drawrunnable = new Runnable() {
	public void run() {
		draw();
	}
};

The draw method actually draws the wallpaper. In this method we re-post the drawrunnable routine every 5 seconds using a Handler. The draw method implementation is (from the sample application):

private void draw(){
	final SurfaceHolder holder = getSurfaceHolder();
	Canvas canvas = null;

	// Remove any pending draw request before drawing again.
	handler.removeCallbacks(drawrunnable);

	try
	{
		// Lock the canvas of the wallpaper surface for drawing.
		canvas = holder.lockCanvas();

		drawPattern(canvas);
	}
	finally
	{
		// Always unlock and post the canvas, even if drawing fails.
		if (canvas != null)
			holder.unlockCanvasAndPost(canvas);
	}

	// Schedule the next draw only while the wallpaper is visible.
	if(isVisible)
	{
		handler.postDelayed(drawrunnable,DRAW_DELAY);
	}
}

Sample Application

The sample application of this article is a simple live wallpaper that draws three different patterns: a spirograph, a circle pattern and a Sierpinski triangle. Users can choose the pattern to draw. The pattern is drawn with random values every 5 seconds.

Following are some of the screenshots of the application:

[Screenshots: spirograph, circle pattern and Sierpinski triangle]

Hope this article is helpful.

Happy Live Wallpaper coding! 🙂

Face Detection in Android

Download the source code of this article.

Recently a friend of mine emailed me and asked about how to do face detection in Android. I created a sample application for him, and I think it is better to share it with you all.

Android provides a convenient class, FaceDetector, for detecting faces in a bitmap. This class gives you an array of Face objects representing the faces found in the given bitmap. Each Face object contains information about the detected face, such as the distance between the eyes, the mid-point between the eyes and the pose (the rotation around the X, Y and Z axes) of the face.

First we initialize the FaceDetector class by passing the width and height of the bitmap and the maximum number of faces to detect. Once we have created the FaceDetector object, we can use the findFaces method to start face detection. The findFaces method accepts a bitmap and an array of Face objects; the length of the array should be equal to the number we passed to the FaceDetector constructor. Once the system has finished detecting faces, the method returns the number of faces detected and fills the array with information about them.

One limitation of this class is that it only supports bitmaps in RGB_565 format, so if you have a bitmap in another format you have to convert it to RGB_565 first.
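
A minimal sketch of the whole flow, assuming source is an existing Bitmap and a limit of five faces (both assumptions):

// FaceDetector requires the bitmap to be in RGB_565 format.
Bitmap rgb565 = source.copy(Bitmap.Config.RGB_565, false);

int maxFaces = 5;
FaceDetector detector = new FaceDetector(rgb565.getWidth(),
	rgb565.getHeight(), maxFaces);

FaceDetector.Face[] faces = new FaceDetector.Face[maxFaces];
int found = detector.findFaces(rgb565, faces);

PointF midPoint = new PointF();
for (int i = 0; i < found; i++) {
	FaceDetector.Face face = faces[i];
	face.getMidPoint(midPoint);       // mid-point between the eyes
	float eyes = face.eyesDistance(); // distance between the eyes

	// Draw a rectangle around midPoint, sized relative to eyes, here.
}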

The sample application is a simple one, just to show the use of the FaceDetector class. It allows you to take a picture using the camera. Once the picture is taken, the bitmap is converted to RGB_565 format and passed to the FaceDetector, and a red rectangle is drawn around each detected face. Hope this is helpful to you.

Happy Face Detection! 🙂

Reference:
http://android-er.blogspot.com/2010/05/android-facedetector.html

Service applications in Android, part 1

Download the source code of this article.

A service in Android is an application component that runs in the background and has no user interface. A service performs long-running tasks such as playing an audio file or downloading files from the internet. A service can also be used to extend the functionality of a normal application; for example, a download manager may accept download requests from the user and send them to a service which actually performs the download. An application can use Context.startService() and Context.stopService() to start and stop a service.

Service lifecycle

When an application calls Context.startService(), the system first calls the Service.onCreate() method and then the Service.onStartCommand() method. At this point the service is running. When the application calls the Context.stopService() method, the system calls the Service.onDestroy() method. This is true for Android 2.0 or above; in Android versions prior to 2.0 there is no onStartCommand() method, and the system calls Service.onStart() instead. If the service is already running, the system will not call the Service.onCreate() method in either case. In all versions, when the service is destroyed the system calls the Service.onDestroy() method.

A service can also be started by using the Context.bindService() method. Binding is used by an application to communicate with the service. In this case the system will first call the Service.onCreate() method and then the Service.onBind() method. Note that if binding is used, the system will not call the Service.onStartCommand() or Service.onStart() method. For this part of the article let's forget about binding and concentrate on Context.startService(); I will explain binding in the next part of the article.

Creating a service

A service application can be created by extending the Service class. Following code snippet creates a service:

public class GPSLoggerService extends Service
{
	public GPSLoggerService() {
		AppLog.logString("GPSLoggerService.GPSLoggerService().");
	}

	@Override
	public IBinder onBind(Intent intent) {
		AppLog.logString("GPSLoggerService.onBind().");

		return null;
	}

	@Override
	public void onCreate() {
		AppLog.logString("GPSLoggerService.onCreate().");

		super.onCreate();
	}

	@Override
	public void onStart(Intent intent, int startId) {
		AppLog.logString("GPSLoggerService.onStart().");

		super.onStart(intent, startId);
	}

	public int onStartCommand(Intent intent, int flags, int startId) {
		AppLog.logString("GPSLoggerService.onStartCommand().");

		return Service.START_STICKY;
	}
}

This creates a service called GPSLoggerService (from the sample application). Note that from the onBind() method we return null because we are not using binding in this example.

Service.onStart() is called on pre-2.0 versions. There are two arguments to this method: the first one is the intent supplied to start the service and the second one is a unique identifier representing this specific start request. In most cases we need only the Intent parameter. The Intent parameter can be null when the system restarts the service; this normally happens after the service has crashed or after the system has killed the service to free memory (this happens very rarely because the system gives services higher priority than normal applications). If the intent parameter is not null, we can use it to get any data that may have been passed when starting the service. The onStart() method does not return a value.

Service.onStartCommand() is called on 2.0 or later. There is one additional parameter, flags, which can be 0, START_FLAG_REDELIVERY or START_FLAG_RETRY. This parameter is rarely used; for more information about these flags please follow this link. The onStartCommand() method returns an integer value. This can be Service.START_STICKY, Service.START_NOT_STICKY, Service.START_STICKY_COMPATIBILITY, etc. Normally a service returns Service.START_STICKY, which means that if the service's process is killed while the service is started, the system will restart the service. For a more detailed explanation of the return values please follow this link.

You may have noticed that for the onStartCommand() method I haven't used @Override. This is because I want to keep this service compatible with Android versions prior to 2.0, which have no Service.onStartCommand() method; if you use @Override there and build against a pre-2.0 SDK, it will result in an error. So please note this point while you are creating a service targeted at all versions, including those below 2.0. If you are targeting only 2.0 or above, you can safely use @Override.

Declaring service in AndroidManifest.xml file

To use the service you have to declare it in your AndroidManifest.xml file, otherwise the system will not recognize the service. The following snippet declares the GPSLoggerService in the manifest file:

<service android:name="com.varma.samples.gpslogger.service.GPSLoggerService"/>

Starting a service

To start a service we can use Context.startService(), passing an Intent for the service. The following code snippet starts the GPSLoggerService:

Intent intent = new Intent(context, GPSLoggerService.class);

context.startService(intent);

If you want to pass some data to the service, you can put it in the intent extras, and in the service we can retrieve this data from the intent's Bundle, as sketched below.
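
For example, a hypothetical "interval" extra could be passed and read like this (the extra name and default value are assumptions):

Intent intent = new Intent(context, GPSLoggerService.class);
intent.putExtra("interval", 300000L); // "interval" is an assumed extra name

context.startService(intent);

// Inside the service:
public int onStartCommand(Intent intent, int flags, int startId) {
	long interval = 300000L; // default: 5 minutes

	// The intent can be null when the system restarts the service.
	if (intent != null && intent.getExtras() != null) {
		Bundle extras = intent.getExtras();
		interval = extras.getLong("interval", 300000L);
	}

	return Service.START_STICKY;
}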

Stopping a service

To stop a service from an application, we can use the Context.stopService() method, passing the service intent. The following code snippet stops the GPSLoggerService:

Intent intent = new Intent(context, GPSLoggerService.class);

context.stopService(intent);

To stop the service from within the service itself, we can use the Service.stopSelf() method. This stops the calling service.

Sample application

The sample application of this article is a GPS logger application; it logs the GPS coordinates of the device at a specified time interval. There are two components in this application, the service component and the normal application. Some screenshots of the application are given below:

[Screenshots: application screen and logging interval setting]

When the ‘Start Logging’ button is clicked, an alarm is set using AlarmManager. The AlarmManager.setRepeating() method is used to set a repeating alarm with the specified interval, as sketched below. By default the application uses 5 minutes as the logging interval.
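
Setting such a repeating alarm that starts the service might look like this sketch (the request code and wakeup policy are assumptions):

AlarmManager alarmManager =
	(AlarmManager) context.getSystemService(Context.ALARM_SERVICE);

Intent intent = new Intent(context, GPSLoggerService.class);
PendingIntent pending = PendingIntent.getService(context, 0, intent, 0);

long interval = 5 * 60 * 1000; // default logging interval: 5 minutes
alarmManager.setRepeating(AlarmManager.RTC_WAKEUP,
	System.currentTimeMillis() + interval, interval, pending);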

When the service component starts, a LocationManager is initialized and location updates are requested. Once a location is received, the coordinates are written to a KML file in the “/sdcard/gpslogger” folder. Before writing to the file, reverse geocoding is performed to retrieve the name of the location. For more information about using GPS in Android refer to my article about using GPS in Android, and for geocoding and reverse geocoding refer to my article about geocoding.

Happy GPS logging! 🙂

Reading EXIF information from a picture file in Android

Download the source code of this article.

EXIF (Exchangeable Image File Format) is a specification that lets image file formats store metadata. EXIF is supported in the JPEG, TIFF and RIFF WAV file formats, but not in PNG or GIF files. Modern digital cameras use it heavily to store metadata. EXIF metadata is stored in tag-value format; there are various well-known tags and we can also define our own. Most modern image manipulation software can interpret EXIF information. For more information about the EXIF format please follow this link.

In Android we can read the EXIF information of a file using the ExifInterface class, which is available from Android version 2.0 onwards. To read EXIF information, first we create an instance of the ExifInterface class by passing the full path of the image file. Then we can read the information using the ExifInterface.getAttribute(String), ExifInterface.getAttributeDouble(String, double) or ExifInterface.getAttributeInt(String, int) methods (the second parameter is a default value), passing the desired tag. There are many different tags; some of them are listed below, followed by a short usage sketch:

  • ExifInterface.TAG_DATETIME
  • ExifInterface.TAG_FLASH
  • ExifInterface.TAG_FOCAL_LENGTH
  • ExifInterface.TAG_GPS_DATESTAMP
  • ExifInterface.TAG_GPS_LATITUDE
  • ExifInterface.TAG_GPS_LATITUDE_REF
  • ExifInterface.TAG_GPS_LONGITUDE
  • ExifInterface.TAG_GPS_LONGITUDE_REF
  • ExifInterface.TAG_GPS_PROCESSING_METHOD
  • ExifInterface.TAG_GPS_TIMESTAMP
  • ExifInterface.TAG_IMAGE_LENGTH
  • ExifInterface.TAG_IMAGE_WIDTH
  • ExifInterface.TAG_MAKE
  • ExifInterface.TAG_MODEL
  • ExifInterface.TAG_ORIENTATION
  • ExifInterface.TAG_WHITE_BALANCE
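
A short usage sketch, assuming a picture exists at /sdcard/DCIM/photo.jpg (the path is an assumption):

try {
	ExifInterface exif = new ExifInterface("/sdcard/DCIM/photo.jpg");

	String make = exif.getAttribute(ExifInterface.TAG_MAKE);
	String model = exif.getAttribute(ExifInterface.TAG_MODEL);
	int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, 1);

	// Log the camera make, model and picture orientation.
	Log.d("ExifDemo", "Make: " + make + ", Model: " + model
		+ ", Orientation: " + orientation);
} catch (IOException e) {
	e.printStackTrace();
}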

The sample application lists all the picture files on the external storage media; when an item is clicked, its EXIF information is displayed in another activity.

Playing audio in Android

Download source code of this article.

There is more than one way to play audio and video content in Android, but the easiest way to play media content is by using MediaPlayer. The MediaPlayer class encapsulates all the functionality needed to play, seek, stream, etc. This article is about playing audio using the MediaPlayer class; video playback will be explained in another article (in the near future :)).

The MediaPlayer class can play any supported audio content. It can play audio from an application resource, a file or a streaming source. Playback of audio or video content is managed as a state machine; for more about the state management please refer to this link.

Following are the steps to play audio content from an application resource:

1. Create a media player using MediaPlayer.create(context, resourceId); this convenience method also prepares the player, so a separate prepare() call is not needed
2. Call the MediaPlayer.start() method.

Following code snippet plays an audio resource:

MediaPlayer player = MediaPlayer.create(this, R.raw.soundfile);

player.start();

.
.
.

player.stop();
player.reset();
player.release();

player = null;

To play audio from a file or a stream:

1. Create an instance of MediaPlayer using new
2. Set the data source using MediaPlayer.setDataSource()
3. Call MediaPlayer.prepare()
4. Call MediaPlayer.start()

Following code snippet plays an audio file:

MediaPlayer player = new MediaPlayer();

player.setDataSource("/mnt/sdcard/media/audiofile.mp3");
player.prepare();
player.start();

.
.
.

player.stop();
player.reset();
player.release();

player = null;

MediaPlayer is a very limited resource, so it is important to stop and release the player object once you have finished playing the media; otherwise other applications that use the media player may not get the resource.

The MediaPlayer class has different methods to manage media playback; some of them are:

getDuration() returns the duration of the media in milliseconds.
getCurrentPosition() returns the current playback position.
isPlaying() returns true if playback is in progress, otherwise false.
isLooping() returns true if looping is set, otherwise false.
pause() pauses the playback.
start() starts or resumes the playback.
seekTo() seeks to the specified position.
setLooping() sets looping enabled or disabled.
setVolume() sets the playback volume.

MediaPlayer uses different callback interfaces to report events; they are:

MediaPlayer.OnBufferingUpdateListener This interface is used to report the buffering progress when media is streamed from a source.

void onBufferingUpdate(MediaPlayer mp, int percent) method is called when the buffer is updated.

The first parameter mp is the MediaPlayer object and the second parameter percent is the percentage of buffering completed.

MediaPlayer.OnCompletionListener This interface is used to report the completion of the playback.

void onCompletion(MediaPlayer mp) method is called when the playback is completed.

The only parameter mp is the MediaPlayer object.

MediaPlayer.OnErrorListener This interface is used to report any error that occurs while playback is in progress.

boolean onError(MediaPlayer mp, int what, int extra) method will be called when an error occurs.

The first parameter mp is the MediaPlayer object, the second parameter what is the type of error that occurred and the last parameter extra is extra information about the error.

If you return true, the error is considered handled; if you return false, the OnCompletionListener will be called.

MediaPlayer.OnInfoListener This interface is used to report warnings or other informational events.

boolean onInfo(MediaPlayer mp, int what, int extra) method is called when there is a warning or info event.

The first parameter mp is the MediaPlayer object, the second parameter what is the type of warning/info and the last parameter extra is extra information about it.

A return value of true means the information/warning was handled; false means it was discarded. In both cases playback will continue.

MediaPlayer.OnPreparedListener This interface is used to report that preparation is complete when the asynchronous prepareAsync() method is used to prepare the MediaPlayer.

void onPrepared(MediaPlayer mp) method is called when preparation is completed.

The only parameter mp is the MediaPlayer object.

MediaPlayer.OnSeekCompleteListener This interface is used to report the completion of the seek operation.

void onSeekComplete(MediaPlayer mp) method will be called when the seek operation is completed.

The only parameter mp is the MediaPlayer object.

MediaPlayer.OnVideoSizeChangedListener This interface is used to report a change in the size of the video. It is not called for audio playback.

void onVideoSizeChanged(MediaPlayer mp, int width, int height) method is called once the size is changed.

The first parameter mp is the MediaPlayer object, second parameter width is the new width and the third parameter height is the new height.

The sample application of this article lists all the audio files on the external storage media. To enumerate the audio files I used the MediaStore.Audio.Media.EXTERNAL_CONTENT_URI content URI, which returns a cursor containing all the supported audio files on the external storage media. When an item is clicked, the application starts playing that audio file. To show the current playback position I used a Handler and a Runnable that fires every second to update the position. We could use the default MediaController for this purpose, but in practice I faced a problem with the seek bar not updating continuously, and I haven't figured out why. The update logic is sketched below, followed by a screenshot of the application:
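
A minimal sketch of that update logic follows; player and seekbar are assumed fields referring to the MediaPlayer and a SeekBar in the layout:

private final Handler handler = new Handler();

private final Runnable updateProgress = new Runnable() {
	public void run() {
		if (player != null && player.isPlaying()) {
			// Reflect the current playback position on the seek bar.
			seekbar.setMax(player.getDuration());
			seekbar.setProgress(player.getCurrentPosition());
		}

		// Fire again after one second.
		handler.postDelayed(this, 1000);
	}
};

// Start the updates when playback starts, e.g.:
// handler.post(updateProgress);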

[Screenshot: audio player screen]

Hope this article is helpful for you.

Happy coding! 🙂

Camera effects in Android 2.0 and above

Download source code of this article.

This is a continuation of my previous article, Programming camera in Android, which explains how to program the Android camera hardware. This article concentrates on how to set camera effects. From Android 2.0 onwards the camera API supports anti-banding modes, color effects, scene modes, white balance, zoom, etc.

These effects can be set programmatically using camera parameters. The following code snippet sets the color effect to EFFECT_AQUA.

Camera.Parameters parameters = camera.getParameters();

parameters.setColorEffect(Camera.Parameters.EFFECT_AQUA);

camera.setParameters(parameters);

The available effects depend on the camera hardware of the device, and a device may not support all of them. Following are some of the methods for these effects:

Camera.Parameters.setColorEffect()
Camera.Parameters.getColorEffect()
Get or set the color effect; the defined color effects are:

Camera.Parameters.EFFECT_AQUA
Camera.Parameters.EFFECT_BLACKBOARD
Camera.Parameters.EFFECT_MONO
Camera.Parameters.EFFECT_NEGATIVE
Camera.Parameters.EFFECT_NONE
Camera.Parameters.EFFECT_POSTERIZE
Camera.Parameters.EFFECT_SEPIA
Camera.Parameters.EFFECT_SOLARIZE
Camera.Parameters.EFFECT_WHITEBOARD

Camera.Parameters.setFlashMode()
Camera.Parameters.getFlashMode()
Get or set the flash mode; the defined flash modes are:

Camera.Parameters.FLASH_MODE_AUTO
Camera.Parameters.FLASH_MODE_OFF
Camera.Parameters.FLASH_MODE_ON
Camera.Parameters.FLASH_MODE_RED_EYE
Camera.Parameters.FLASH_MODE_TORCH

Camera.Parameters.setSceneMode()
Camera.Parameters.getSceneMode()
Get or set the scene mode; the defined scene modes are:

Camera.Parameters.SCENE_MODE_ACTION
Camera.Parameters.SCENE_MODE_AUTO
Camera.Parameters.SCENE_MODE_BARCODE
Camera.Parameters.SCENE_MODE_BEACH
Camera.Parameters.SCENE_MODE_CANDLELIGHT
Camera.Parameters.SCENE_MODE_FIREWORKS
Camera.Parameters.SCENE_MODE_LANDSCAPE
Camera.Parameters.SCENE_MODE_NIGHT
Camera.Parameters.SCENE_MODE_NIGHT_PORTRAIT
Camera.Parameters.SCENE_MODE_PARTY
Camera.Parameters.SCENE_MODE_PORTRAIT
Camera.Parameters.SCENE_MODE_SNOW
Camera.Parameters.SCENE_MODE_SPORTS
Camera.Parameters.SCENE_MODE_STEADYPHOTO
Camera.Parameters.SCENE_MODE_SUNSET
Camera.Parameters.SCENE_MODE_THEATRE

Note that setting the scene mode may affect other camera settings.

Camera.Parameters.setWhiteBalance()
Camera.Parameters.getWhiteBalance()
Get or set the white balance; the defined white balance modes are:

Camera.Parameters.WHITE_BALANCE_AUTO
Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT
Camera.Parameters.WHITE_BALANCE_DAYLIGHT
Camera.Parameters.WHITE_BALANCE_FLUORESCENT
Camera.Parameters.WHITE_BALANCE_INCANDESCENT
Camera.Parameters.WHITE_BALANCE_SHADE
Camera.Parameters.WHITE_BALANCE_TWILIGHT
Camera.Parameters.WHITE_BALANCE_WARM_FLUORESCENT

Camera.Parameters.setFocusMode()
Camera.Parameters.getFocusMode()
Get or set the focus mode; the defined focus modes are:

Camera.Parameters.FOCUS_MODE_AUTO
Camera.Parameters.FOCUS_MODE_EDOF
Camera.Parameters.FOCUS_MODE_FIXED
Camera.Parameters.FOCUS_MODE_INFINITY
Camera.Parameters.FOCUS_MODE_MACRO

Not all camera hardware supports all of these effects. To get the supported values, Camera.Parameters provides different methods, each of which returns a list of strings. Some of them are listed below; a usage sketch follows the list:

Camera.Parameters.getSupportedAntibanding() Returns a list of anti-banding modes supported.
Camera.Parameters.getSupportedColorEffects() Returns a list of color effects supported.
Camera.Parameters.getSupportedFlashModes() Returns a list of flash modes supported.
Camera.Parameters.getSupportedFocusModes() Returns a list of focus modes supported.
Camera.Parameters.getSupportedSceneModes() Returns a list of scene modes supported.
Camera.Parameters.getSupportedWhiteBalance() Returns a list of white balance modes supported.
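
For example, a color effect could be set defensively like this sketch (sepia is an arbitrary choice):

Camera.Parameters parameters = camera.getParameters();

List<String> effects = parameters.getSupportedColorEffects();

// getSupportedColorEffects() may return null if the hardware
// does not support color effects at all.
if (effects != null && effects.contains(Camera.Parameters.EFFECT_SEPIA)) {
	parameters.setColorEffect(Camera.Parameters.EFFECT_SEPIA);
	camera.setParameters(parameters);
}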

Setting some of the effects may change others; for example, setting the scene mode may affect the flash mode, color effect, white balance, etc.

The sample application of this article is the same as in my previous article, but with color effect and white balance options added. When you click on the color effect or white balance button, the supported values are listed and the selected one is applied.

Hope this piece of information is helpful to you.

Happy camera coding! 🙂