
Using Spark Core and XBee for RF Communication, Part #1

https://github.com/krvarma/XBee_SparkCore

Here is one of the projects I have been working on recently; this is part one of a series that uses Spark Core and XBee for RF communication. My intention is to use XBees and a Spark Core to create a very simple home automation project. In this part, I just turn a remote LED on/off, controlled from a web application.

The project is to control a remote light, i.e. turn the light on/off. In this part I am using a Spark Core and two XBees (either Series 1 or Series 2). If you are using Series 1, one should be configured to accept API commands and the other can be in either API or AT mode. If you are using Series 2, one should be configured as a Coordinator in API mode and the other can be either a Router (AT/API) or an End Device.

Introduction

In this project the Coordinator API XBee is connected to the Spark Core and communicates over serial. The Spark Core acts as an Internet gateway to control the XBee. The Core application uses the XBee library ported by @pkourany. For this project I am using XBee Remote AT commands to control the remote XBee(s). For more about API mode and Remote AT commands please refer to this link.

The Router XBee’s D0, D1, D2 and D3 pins are connected to four LEDs which can be remotely turned on/off. Any device can be connected to any of these pins and controlled by sending Remote AT commands to the remote XBee. It could be a relay connected to an electrical light, or anything else.

To send Remote AT Command, you can either use a Broadcast address or a destination 64-bit (MAC/EUI64) address of the XBee module. Using the 64-bit destination address allows us to send commands to a particular XBee. There is also a 16-bit configurable address which can be used to group multiple XBees and send commands to that group. Since this is a different topic and is not in our scope, for more information refer to the Digi documentation.
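Under the hood, each Remote AT command is wrapped in an XBee API frame before it goes over serial. The XBee library handles this for you; the following standalone Java sketch (illustration only, not code from this project) shows roughly what a Remote AT Command Request frame (type 0x17) that drives pin D0 high looks like. The 64-bit destination address here is a made-up example.

import java.io.ByteArrayOutputStream;

// Illustrative sketch: builds a raw XBee Remote AT Command frame (API type 0x17)
// that sets pin D0 high on a remote module.
public class RemoteAtFrame {

	// Hypothetical 64-bit destination address of the remote XBee.
	static final byte[] DEST64 = {0x00, 0x13, (byte) 0xA2, 0x00, 0x40, 0x01, 0x02, 0x03};

	public static byte[] buildSetD0High() {
		ByteArrayOutputStream payload = new ByteArrayOutputStream();
		payload.write(0x17);            // frame type: Remote AT Command Request
		payload.write(0x01);            // frame ID (non-zero to request a response)
		payload.write(DEST64, 0, 8);    // 64-bit destination address
		payload.write(0xFF);            // 16-bit destination address 0xFFFE (unknown)
		payload.write(0xFE);
		payload.write(0x02);            // remote command options: apply changes
		payload.write('D');             // AT command "D0"
		payload.write('0');
		payload.write(0x05);            // parameter: 5 = digital output high (4 = low)

		byte[] body = payload.toByteArray();
		int sum = 0;
		for (byte b : body) sum += b & 0xFF;
		byte checksum = (byte) (0xFF - (sum & 0xFF));

		ByteArrayOutputStream frame = new ByteArrayOutputStream();
		frame.write(0x7E);                      // start delimiter
		frame.write((body.length >> 8) & 0xFF); // length MSB
		frame.write(body.length & 0xFF);        // length LSB
		frame.write(body, 0, body.length);
		frame.write(checksum);
		return frame.toByteArray();
	}
}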

The sample JavaScript web application is used to control the remote light. To use this application you should replace the deviceid and accesstoken with actual values. When the application is opened you will be asked to enter a name (any name, just for display purposes) and the 64-bit address of the remote XBee that is connected to the relay module. Once these are entered you can turn the D0, D1, D2 and D3 pins on/off.
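Behind the scenes the web page simply calls the Spark Cloud REST API with the device id and access token. As a rough sketch of the same call from Java (the function name xbeecmd and the argument format are assumptions, not taken from the repository):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SparkFunctionCall {

	// Sketch of the HTTP call the web page makes to the Spark Cloud.
	public static int call(String deviceId, String accessToken, String args) throws Exception {
		URL url = new URL("https://api.spark.io/v1/devices/" + deviceId + "/xbeecmd");
		HttpURLConnection conn = (HttpURLConnection) url.openConnection();
		conn.setRequestMethod("POST");
		conn.setDoOutput(true);
		conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

		String body = "access_token=" + accessToken + "&args=" + URLEncoder.encode(args, "UTF-8");
		try (OutputStream out = conn.getOutputStream()) {
			out.write(body.getBytes("UTF-8"));
		}

		return conn.getResponseCode();   // 200 means the function was invoked
	}

	public static void main(String[] argv) throws Exception {
		// e.g. ask the Core to drive D0 high on the XBee with this (made-up) 64-bit address
		call("your-device-id", "your-access-token", "0013A20040010203,D0,HIGH");
	}
}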

Here you can find a very useful API Frame Generator from Digi International. It is extremely useful for development and debugging; I used it a lot for this project.

Wiring

Coordinator XBee and Spark Core

  • XBee Vcc to Spark Core 3V3
  • XBee GND to Spark Core GND
  • XBee DOUT to Spark Core Rx
  • XBee DIN to Spark Core Tx

Router XBee and Relay Module

  • XBee Vcc to 3V3
  • XBee GND to GND
  • XBee DIO0 to LED1
  • XBee DIO1 to LED2
  • XBee DIO2 to LED3
  • XBee DIO3 to LED4

Screenshots


Demo Video

https://www.youtube.com/watch?v=n_4AgTekWqs

Spark Core and SmartThings

https://github.com/krvarma/SmartThings_SparkCore_Sensor

This is a sample application using Spark Core and SmartThings.

The Spark Core application uses a DHT22 sensor for temperature and humidity values and defines two Spark.function handlers to read these sensor values.

The SmartThings application defines a new Device Type and uses the Spark Core APIs to read the sensor values. The device type specifies the “Polling” capability, but polling only works intermittently (I have raised the issue and am awaiting a reply from SmartThings).

Wiring

  1. DHT 3.3v to 3.3v
  2. DHT GND to GND
  3. DHT SIG to D4

Installation

  1. Create a new device type (https://graph.api.smartthings.com/ide/devices) with Name: Spark Core Temperature Sensor, Author: Krishnaraj Varma, Capabilities: Polling, Relative Humidity Measurement, Sensor, Temperature Measurement
  2. Create a new device (https://graph.api.smartthings.com/device/list) with Name: Your Choice, Device Network Id: Your Choice, Type: Spark Core Temperature Sensor (should be the last option), Location: Choose the correct location, Hub/Group: Leave blank
  3. Update the device preferences: click on the new device to see the details, click the edit button next to Preferences and enter the Device ID and Access Token
  4. Open the Mobile Application and add the newly created device, click refresh to see the Temperature and Humidity values

Screenshots


Reference and Inspiration

https://gist.github.com/Dianoga/6055918

LIFX WiFi Bulb and SmartThings

https://github.com/krvarma/LIFX_SmartThings

This is a sample application to control a LIFX WiFi bulb using SmartThings. LIFX can currently only be controlled from within the local network; as of now we cannot control it from outside. One solution is to use an intermediate local server which listens for commands and controls the bulb locally. There is an official LIFX Ruby SDK, and there are also some unofficial node.js libraries that can control the LIFX bulb; one such library is Lifxjs.

To communicate with the local server from outside the network, we can use MQTT, Server-Sent Events, etc. For this demo I am using MQTT. The Eclipse Foundation maintains a sandbox MQTT server for development purposes. We can use MQTT clients to publish and subscribe to messages, and there is also an HTTP bridge which provides RESTful APIs to send and receive messages. This is a very useful service: we just call the REST services and can easily manage the messages. For this project I am using the REST APIs.

In this demo, I created a SmartThings custom device type, LIFX, with the ‘Switch’ capability. Whenever the switch turns on, the application calls the MQTT HTTP bridge service and publishes a message with topic “lifxcmd” and payload command=’1’; when it turns off, it publishes the payload command=’0’. The local node.js server subscribes to this topic and turns the bulb on/off whenever a message arrives.
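The project itself publishes through the HTTP bridge’s REST API; purely as an alternative illustration (not the project’s code), the same lifxcmd message could be published directly to the sandbox broker with the Eclipse Paho Java client. The broker host below is an assumption; use whatever broker your node.js server subscribes to.

import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.eclipse.paho.client.mqttv3.persistence.MemoryPersistence;

// Publishes the same "lifxcmd" message the SmartThings device type sends.
public class LifxCommandPublisher {
	public static void main(String[] args) throws Exception {
		MqttClient client = new MqttClient("tcp://iot.eclipse.org:1883",
				MqttClient.generateClientId(), new MemoryPersistence());
		client.connect();

		MqttMessage on = new MqttMessage("command='1'".getBytes());   // '0' to turn off
		on.setQos(1);                  // at-least-once delivery
		client.publish("lifxcmd", on);

		client.disconnect();
	}
}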

This is a pretty straightforward way to control the bulb. Currently I have only implemented on/off functionality, but I am planning to incorporate more in the near future.

Here is a demo video of the application in action (sorry, no audio). http://www.youtube.com/watch?v=5D6cQRWK3Yc

Happy LIFX and SmartThings coding 🙂

Spark Core and TinyGPS Library

https://github.com/krvarma/TinyGPS_SparkCore

Here comes a sample application using Spark Core and the TinyGPS library. TinyGPS is a very powerful and fast NMEA GPS parser for Arduino and compatible boards. In this sample I am using some of its basic features to read the geo-coordinates from a MediaTek MT3329 GPS receiver.

The sample publishes the GPS location every 15 minutes. There is also an HTML page that uses the Google Maps API to display the Spark Core location on a map. To use it you should replace the deviceid and accesstoken tags with actual values.

Wiring

  1. GPS Vcc to 3.3V
  2. GPS GND to GND
  3. GPS Tx to Rx
  4. GPS Rx to Tx

Screenshot


Happy GPS coding 🙂

Spark Core and Plotly

https://github.com/krvarma/Plotly_SparkCore

Here is a sample application to stream data to Plotly using Spark Core. The application streams the temperature value from a DHT22 sensor to Plotly. To use this application, create a Plotly account, download the source, open it in the Spark editor and replace the tokens streamtoken, username, apikey and filename with the corresponding values from your Plotly settings.

This application uses the Plotly SDK from GitHub. I took the plotly_streaming_wifi source and ported it to Spark Core. It is only a simple and straightforward port; only two modifications were made: 1. copied the dtostrf method from this location and included it in the Plotly class, and 2. commented out the method void print_(const __FlashStringHelper* d) because it was causing some compiler errors (I didn’t dig deep into it, just commented out the method).

Screenshots

spark core wiring


Creating AppWidget in Android, part 1

Download the source code of this article.

AppWidgets are small views or widgets that can be embedded in another application. The containing application is called the App Widget host; one example of a host application is the Home Screen application. When you long-press the home screen and select the Widgets option, you will be presented with a list of the available widgets. This is a two-part series on how to create App Widgets. In this part I will stick to the basics of creating a widget; in the second part I will cover more advanced topics.

We can create our own App Widgets by extending the AppWidgetProvider class, which is actually a BroadcastReceiver with the action android.appwidget.action.APPWIDGET_UPDATE. To create an App Widget we extend this class and override the onUpdate method. Then we have to declare the App Widget in the AndroidManifest.xml file using the <receiver> tag. Finally we have to describe the App Widget with an AppWidgetProviderInfo object defined in an XML file.

Creating AppWidget

To create an App Widget, we create a class extending AppWidgetProvider. The AppWidgetProvider class has the following methods:

void onEnabled(Context context)
void onDisabled(Context context)
void onDeleted(Context context, int[] appWidgetIds)
void onUpdate(Context context,
	AppWidgetManager appWidgetManager, int[] appWidgetIds)
void onReceive(Context context, Intent intent)

The method onEnabled is called when an instance is created for the first time, i.e. when the user adds the widget for the first time. It will not be called for subsequent additions. In this method we can perform any startup tasks.

The method onDisabled is called when the last instance of the widget is deleted. This is the place to do any cleanup tasks.

The method onDeleted is called whenever an instance of the widget is deleted.

The onUpdate method is first called when the widget is added to the host. After that it is called at the specified time interval. This is the most important method of the App Widget: in it we update the view with the data we want to display. If the data is readily available, we can display it in this method itself; if the data is not available and needs to be fetched, it is better to create a Service which fetches the data and updates the widget. This avoids the ANR (Application Not Responding) error. We update the widget using AppWidgetManager’s updateAppWidget() method.
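As a minimal sketch (the layout and view ids here are placeholders, not the sample’s actual resources), an onUpdate implementation that pushes text into the widget looks like this:

public class GPSWidgetProvider extends AppWidgetProvider {
	@Override
	public void onUpdate(Context context, AppWidgetManager appWidgetManager, int[] appWidgetIds) {
		for (int appWidgetId : appWidgetIds) {
			// R.layout.gpswidget and R.id.txtlocation are placeholder resources
			RemoteViews views = new RemoteViews(context.getPackageName(), R.layout.gpswidget);
			views.setTextViewText(R.id.txtlocation, "Waiting for location...");
			appWidgetManager.updateAppWidget(appWidgetId, views);
		}
		// For slow work (GPS fix, reverse geocoding) start a Service here and let it
		// call AppWidgetManager.updateAppWidget() again once the data is available.
	}
}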

Declaring the App Widget

We have to declare the App Widget in the AndroidManifest.xml file using the <receiver> tag. As mentioned before, AppWidgetProvider is actually a BroadcastReceiver; anything that can be done with this class can also be done with a BroadcastReceiver. The following snippet declares the GPS App Widget:

<receiver android:name=".GPSWidgetProvider">
	<intent-filter>
		<action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
	</intent-filter>
	<meta-data
		android:name="android.appwidget.provider"
		android:resource="@xml/gpswidgetinfo" />
</receiver>

Everything is the same as declaring a BroadcastReceiver except the meta-data section; it points to a separate XML file that describes our App Widget.

Describing the App Widget

The App Widget description is specified in a separate XML file. In this XML file we specify the minimum width and height of the widget, the update period, the widget layout, the configuration screen layout, etc. The following snippet describes the GPS App Widget:

<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="294dp"
    android:minHeight="72dp"
    android:updatePeriodMillis="900000"
    android:initialLayout="@layout/gpswidget">
</appwidget-provider>

The minWidth attribute specifies the minimum width of the widget and minHeight specifies the minimum height. The updatePeriodMillis attribute specifies the update period in milliseconds, and initialLayout specifies the widget’s layout.

Sample application

The sample application provided is a widget that displays the GPS coordinates of the device. The GPS coordinates are reverse geo-coded to get the name of the location, which is displayed if it is available. To reverse geo-code the location I used the Geocoder class. The widget is updated every 15 minutes.
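For reference, a minimal reverse-geocoding helper using android.location.Geocoder (a sketch; error handling kept to the bare minimum) looks like this:

// Reverse-geocodes a coordinate to a place name; returns null when nothing is found.
String reverseGeocode(Context context, double latitude, double longitude) {
	Geocoder geocoder = new Geocoder(context, Locale.getDefault());
	try {
		List<Address> addresses = geocoder.getFromLocation(latitude, longitude, 1);
		if (addresses != null && !addresses.isEmpty()) {
			return addresses.get(0).getLocality();   // e.g. the city name
		}
	} catch (IOException e) {
		// network/service failure; fall back to showing the raw coordinates
	}
	return null;
}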

Happy App Widget coding 🙂

A bare minimum web server for android platform

Download the source code of this article.

Since today’s mobile phones have plenty of computing power, running a web server on a phone can do a lot. There are lots of Android web server applications out there that you can try, but I decided to create my own. We can create an HTTP server using the org.apache.http packages; the only restriction is that we cannot run the server on the default port, i.e. port 80. I think this is a Linux restriction (binding to ports below 1024 requires root) and has nothing to do with the Android platform.

To create a typical web server, first you create a server socket and listen on the desired port. Then you accept connections and finally process each HTTP request and send the response to the client.

To create a server socket we use the java.net.ServerSocket class. The constructor of this class accepts a port number to listen on for incoming connections. Once the ServerSocket object is created, we accept incoming connections using the ServerSocket.accept() method. This method is blocking, so we create a separate thread to accept and process the incoming connections. The accept() method returns a java.net.Socket object which represents the accepted connection. Once the connection is established, we next want to process the HTTP request. This can be done using the org.apache.http.protocol.HttpService class, which provides a minimal server-side HTTP processor. Following is the HttpService constructor:

HttpService(
	HttpProcessor proc,
	ConnectionReuseStrategy connStrategy,
	HttpResponseFactory responseFactory)

This class is from the Apache implementation for Android. Explanation of Apache implementation is out of scope of this article. For more information about this please follow this link.

The first parameter, org.apache.http.protocol.HttpProcessor, is an interface that is used to process the requests and responses. The class org.apache.http.protocol.BasicHttpProcessor is the default implementation of this interface.

The second parameter, org.apache.http.ConnectionReuseStrategy, determines the connection reuse strategy. There are two implementations of this interface: org.apache.http.impl.DefaultConnectionReuseStrategy and org.apache.http.impl.NoConnectionReuseStrategy. The first one re-uses the connection and the second one never re-uses it. In normal cases we use the first one.

The HttpService class relies on the last parameter to create the HTTP responses. org.apache.http.HttpResponseFactory is a factory interface for creating the response objects that are sent back to the client. The class org.apache.http.impl.DefaultHttpResponseFactory is the default implementation of this interface.

To handle different HTTP requests, we use a handler map based on URI patterns. For example, all URIs starting with /message can be handled by one class and URIs starting with /dir by another. For this purpose we use the org.apache.http.protocol.HttpRequestHandlerRegistry class to create a map based on URI patterns. This class implements org.apache.http.protocol.HttpRequestHandlerResolver, which resolves the handler for a given request. Using HttpRequestHandlerRegistry we can register different URI patterns and the corresponding org.apache.http.protocol.HttpRequestHandler to handle the requests. HttpRequestHandler handles an HTTP request, and we can create different HttpRequestHandler classes to handle requests that match particular patterns. To register a URI pattern we use the HttpRequestHandlerRegistry.register() method. The syntax of this method is:

public void register (String pattern, HttpRequestHandler handler)

The first parameter is the URI pattern. This can be one of the following:

  • * – handles all requests
  • *<uri> – handles all requests that end with <uri>
  • <uri>* – handles all requests that start with <uri>

The HttpService class will determine which handler to use based on the URI of the received request. To handle HTTP requests we use the HttpService.handleRequest() method. The syntax of the method is:

public void handleRequest (HttpServerConnection conn, HttpContext context)

The first parameter is org.apache.http.HttpServerConnection, an interface for use on the server side. We can use the org.apache.http.impl.DefaultHttpServerConnection class for this purpose, binding it to the Socket we accepted. The second parameter is an org.apache.http.protocol.HttpContext interface; again, a default implementation such as org.apache.http.protocol.BasicHttpContext (or DefaultedHttpContext) can be used.
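Putting these pieces together, a condensed sketch of the server loop might look like the following. This is an illustration under the assumptions above, not the sample application’s code; a real server handles each connection on its own thread, registers real handlers instead of the catch-all one, and has proper error handling and shutdown.

import java.net.ServerSocket;
import java.net.Socket;

import org.apache.http.HttpRequest;
import org.apache.http.HttpResponse;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.DefaultConnectionReuseStrategy;
import org.apache.http.impl.DefaultHttpResponseFactory;
import org.apache.http.impl.DefaultHttpServerConnection;
import org.apache.http.params.BasicHttpParams;
import org.apache.http.protocol.BasicHttpContext;
import org.apache.http.protocol.BasicHttpProcessor;
import org.apache.http.protocol.HttpContext;
import org.apache.http.protocol.HttpRequestHandler;
import org.apache.http.protocol.HttpRequestHandlerRegistry;
import org.apache.http.protocol.HttpService;
import org.apache.http.protocol.ResponseConnControl;
import org.apache.http.protocol.ResponseContent;

public class TinyHttpServer implements Runnable {
	private final int port;

	public TinyHttpServer(int port) { this.port = port; }

	@Override
	public void run() {
		try {
			BasicHttpProcessor processor = new BasicHttpProcessor();
			processor.addInterceptor(new ResponseContent());     // sets entity headers
			processor.addInterceptor(new ResponseConnControl()); // sets Connection header

			HttpRequestHandlerRegistry registry = new HttpRequestHandlerRegistry();
			registry.register("*", new HttpRequestHandler() {    // catch-all handler
				@Override
				public void handle(HttpRequest request, HttpResponse response, HttpContext context)
						throws java.io.IOException {
					response.setStatusCode(200);
					response.setEntity(new StringEntity("<html><body>Hello</body></html>"));
				}
			});

			HttpService service = new HttpService(processor,
					new DefaultConnectionReuseStrategy(), new DefaultHttpResponseFactory());
			service.setHandlerResolver(registry);
			service.setParams(new BasicHttpParams());

			ServerSocket serverSocket = new ServerSocket(port);   // e.g. 8080, not 80
			while (true) {
				Socket socket = serverSocket.accept();            // blocks until a client connects
				DefaultHttpServerConnection connection = new DefaultHttpServerConnection();
				connection.bind(socket, new BasicHttpParams());
				service.handleRequest(connection, new BasicHttpContext());
				connection.shutdown();
			}
		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}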

Sample Application

The sample application is a bare-minimum web server. The server provides two pieces of functionality: sending a message to the device, and folder/file listing. To access the server you should know the IP address of the device; then you can use the URL http://<deviceip>:<port>, which will present the home page. This provides only very minimal functionality; I will be adding more in the future and will post the changes here.

You know, it’s very hard to explain a complicated subject in a short article like this. But I hope this gives an introduction to the web server implementation in Android.

Android audio recording, part 2

Download source code of this article.

My previous article was about recording audio using the MediaRecorder class. That class is very easy to use and is fine for quick and simple audio recording, but it saves the audio data in either MPEG-4 or 3GPP compressed format. If you need raw data for processing the audio, you cannot use MediaRecorder; you have to use the AudioRecord class, which provides the raw data in uncompressed form. You can use this data to write to a file, display a waveform, etc.

To record audio using AudioRecord class:

  1. Create an instance of AudioRecord
  2. Start recording using startRecording() method
  3. Read uncompressed data using AudioRecord.read() method
  4. Stop recording using stop() method.

Creating AudioRecord instance

The constructor of AudioRecord class is:

AudioRecord(
	int audioSource, int sampleRateInHz,
	int channelConfig, int audioFormat, int bufferSizeInBytes)

The first parameter is the audio source; this can be one of the AudioSource values. The second parameter is the sample rate in Hz, for example 44100, 22050, 11025 or 8000. The third parameter is the channel configuration, one of the AudioFormat values (normally CHANNEL_IN_MONO or CHANNEL_IN_STEREO). The fourth parameter is the audio encoding format, which can be one of the following values:

  1. ENCODING_PCM_16BIT – 16 bits per sample
  2. ENCODING_PCM_8BIT – 8 bits per sample
  3. ENCODING_DEFAULT – default encoding

The fifth parameter is the buffer size in bytes. This should be calculated using the AudioRecord.getMinBufferSize() static method. The syntax of getMinBufferSize() is:

public static int getMinBufferSize (
	int sampleRateInHz, int channelConfig, int audioFormat)

The parameters are the same as those of the constructor. The method calculates the minimum buffer size needed to store one chunk of audio. Passing anything less than this value will cause the AudioRecord creation to fail.
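For example, for 44.1 kHz stereo 16-bit PCM the buffer size would be computed like this:

int bufferSize = AudioRecord.getMinBufferSize(44100,
	AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);

if (bufferSize == AudioRecord.ERROR_BAD_VALUE || bufferSize == AudioRecord.ERROR) {
	// the requested parameters are not supported by the hardware
}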

Start Recording

Once we have created the AudioRecord instance, the startRecording() method is used to start recording audio.

Reading uncompressed data

Once recording has started, it is the responsibility of the application to read the audio data and store it for further processing. We can read the audio data using one of the read() methods of the AudioRecord class. The audio data will be in raw PCM_16BIT or PCM_8BIT format depending on how the object was initialized, and the application can use this data for processing of any sort. In our sample application we just save this data to a temporary file; when recording is stopped we read the data back and write it to a WAV file. The application has to read the data continuously, otherwise the previous chunk of data will be overwritten by the new one.
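A typical read loop (a sketch; isRecording and output are assumed to be a flag and an output stream managed elsewhere) looks like this:

byte[] buffer = new byte[bufferSize];

while (isRecording) {
	int read = recorder.read(buffer, 0, buffer.length);

	if (read > 0) {
		output.write(buffer, 0, read);   // save the chunk before the next read overwrites it
	}
}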

Stop recording

Recording can be stopped using the stop() method. Remember to call the release() method afterwards to release the resources held by the AudioRecord object.

The following code snippet shows the typical usage:

AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
	44100, AudioFormat.CHANNEL_IN_STEREO,
	AudioFormat.ENCODING_PCM_16BIT, bufferSize);

recorder.startRecording();
.
.
.
recorder.stop();
recorder.release();

Sample application

The sample application of this article records audio and saves it to a WAV file. The WAV file is placed in the “/SDCard/AudioRecorder” folder with the current time in milliseconds as the file name.

Reference

Writing WAV file – RingDroid sample application, RehearsalAssist application

Android audio recording, part 1

Download source code of this article.

In Android we can record audio in two different ways: using the MediaRecorder class or using the AudioRecord class. Using MediaRecorder is very easy but gives you less flexibility; AudioRecord provides more flexibility but is a little more complex. This article is about recording audio using MediaRecorder; I will explain the AudioRecord class in another article.

Using the MediaRecorder class we can record audio in two different formats, MediaRecorder.OutputFormat.THREE_GPP or MediaRecorder.OutputFormat.MPEG_4. In both cases the encoder should be MediaRecorder.AudioEncoder.AMR_NB; as of version 2.2, Android does not support other encoders. This link gives the details of the media formats supported in Android.

To record an audio using MediaRecorder:

  1. Create an instance of MediaRecorder
  2. Set the audio source using setAudioSource() method
  3. Set output format using setOutputFormat() method
  4. Set Audio encoder using setAudioEncoder() method
  5. Set the output file path using the setOutputFile() method
  6. Prepare using the prepare() method
  7. Start recording using the start() method

The following code snippet starts audio recording:

recorder = new MediaRecorder();

recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile("/sdcard/sample.3gp");

recorder.setOnErrorListener(errorListener);
recorder.setOnInfoListener(infoListener);

try {
	recorder.prepare();
	recorder.start();
} catch (IllegalStateException e) {
	e.printStackTrace();
} catch (IOException e) {
	e.printStackTrace();
}

And following code snippet stops recording:

recorder.stop();
recorder.reset();
recorder.release();

recorder = null;

Note that we have to reset and release the MediaRecorder object because it is a limited resource in Android; if it is not released, other applications that use MediaRecorder may not be able to acquire it.

The MediaRecorder class uses event listener interfaces to report any errors or warnings that occur during a recording session. There are two event listeners, MediaRecorder.OnErrorListener and MediaRecorder.OnInfoListener. MediaRecorder.OnErrorListener is used to report errors during recording. It has the following method:

public void onError(MediaRecorder mr, int what, int extra)

This method is called when an error occurs; the first parameter is the MediaRecorder object, the second is the error code and the third is extra information about the error.

MediaRecorder.OnInfoListener is used to report warnings. It has the following method:

public void onInfo(MediaRecorder mr, int what, int extra)

This method is called when a warning occurs during recording; the first parameter is the MediaRecorder object, the second is the warning code and the third is extra information about the warning.
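The errorListener and infoListener referenced in the earlier snippet can be implemented as simple anonymous classes that, for example, just log the codes:

MediaRecorder.OnErrorListener errorListener = new MediaRecorder.OnErrorListener() {
	@Override
	public void onError(MediaRecorder mr, int what, int extra) {
		Log.e("AudioRecorder", "Error: " + what + ", " + extra);
	}
};

MediaRecorder.OnInfoListener infoListener = new MediaRecorder.OnInfoListener() {
	@Override
	public void onInfo(MediaRecorder mr, int what, int extra) {
		Log.i("AudioRecorder", "Warning: " + what + ", " + extra);
	}
};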

The Audio Source can be one of the following values:

  1. MediaRecorder.AudioSource.DEFAULT – default source usually MIC
  2. MediaRecorder.AudioSource.MIC – Microphone
  3. MediaRecorder.AudioSource.VOICE_CALL – Voice call uplink and downlink source
  4. MediaRecorder.AudioSource.VOICE_DOWNLINK – Voice call downlink source
  5. MediaRecorder.AudioSource.VOICE_UPLINK – Voice call uplink source
  6. MediaRecorder.AudioSource.VOICE_RECOGNITION – Usually DEFAULT source

Even though we can set any of the above audio sources, in my experience only MediaRecorder.AudioSource.MIC works on my Nexus One with Android 2.2; the others do not. If anyone can record using the other sources, please comment here.

Sample Application

The sample application is a simple one that records audio from the microphone and stores the file in the “/SDCard/AudioRecorder” folder with the current time in milliseconds as the filename. Sample screenshots are displayed below:

Audio Recorder main screen and choose-format screen

Hope this piece of information is useful to you all.

Creating Android live wallpaper

Download source code of this article.

As you all know, live wallpapers were introduced in Android version 2.1. Live wallpapers are animated backgrounds for your home screen. You can set a live wallpaper by long-pressing the home screen and selecting the Wallpapers -> Live Wallpapers menu item. A list of the live wallpapers installed on the system will be shown and you can select one of them to make it active. A live wallpaper is like a normal Android application and can use any feature a normal application uses, such as MapView, the accelerometer, GPS, etc.

A live wallpaper is essentially a service application in Android. It has two components, WallpaperService and WallpaperService.Engine. The service component is like a normal service but with an extra method, onCreateEngine(). The purpose of this method is to return an instance of the WallpaperService.Engine class, which does all the actual work, including drawing the wallpaper and managing its lifetime. The WallpaperService calls onCreateEngine() when the wallpaper becomes active. A wallpaper can also provide a settings screen if needed, which can be used to configure the wallpaper (in our sample application the settings screen is used to select the pattern to draw; see the sample application section).
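Skipping ahead for a moment, a skeletal wallpaper service (a sketch with the drawing left out; the engine details are covered below) looks like this:

public class PatternWallpaper extends WallpaperService {
	@Override
	public Engine onCreateEngine() {
		return new PatternEngine();
	}

	private class PatternEngine extends Engine {
		@Override
		public void onVisibilityChanged(boolean visible) {
			// start drawing when visible, stop when hidden (see later sections)
		}
	}
}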

To create our own wallpaper we subclass WallpaperService and override the onCreateEngine method, then declare the wallpaper in the AndroidManifest.xml file. Following is a sample manifest entry:

<application android:label="@string/app_name" android:icon="@drawable/icon">
	<service android:label="@string/wallpaper_pattern" android:permission="android.permission.BIND_WALLPAPER" android:name="com.varma.samples.patternwallpaper.PatternWallpaper">
        	<intent-filter>
                	<action android:name="android.service.wallpaper.WallpaperService" />
		</intent-filter>
            	<meta-data android:name="android.service.wallpaper" android:resource="@xml/patternwallpaper" />
        </service>

	<activity android:label="@string/wallpaper_settings" android:name="com.varma.samples.patternwallpaper.PatternWallpaperSettings" android:exported="true"/>

</application>

The <service> element specifies our wallpaper service. The <intent-filter> action should be android.service.wallpaper.WallpaperService. We create an XML resource that describes our wallpaper and reference it in the <meta-data> tag. Finally we have to declare the settings activity, if there is one. In addition to this, we have to specify the <uses-sdk> and <uses-feature> tags; the <uses-sdk> tag should specify API level 7 or above, since live wallpapers are only available from that level. A sample entry is:

<uses-sdk android:minSdkVersion="7" />
<uses-feature android:name="android.software.live_wallpaper" />

Once these entries are specified, we create a service class extending WallpaperService and override the onCreateEngine method, returning an instance of our own WallpaperService.Engine implementation. The system provides a surface (through a SurfaceHolder) for drawing. WallpaperService.Engine has the following important methods which we have to override to implement the wallpaper engine:

public void onCreate(SurfaceHolder surfaceHolder) – Called while creating the WallpaperService.Engine object. Note that the system provides the SurfaceHolder implementation.

public void onDestroy() – Called while destroying the object.

public void onOffsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) – Called when the offsets change.

public void onSurfaceChanged(SurfaceHolder holder, int format, int width, int height) – Called when the SurfaceHolder’s surface is changed.

public void onSurfaceCreated(SurfaceHolder holder) – Called when the SurfaceHolder’s surface is created.

public void onSurfaceDestroyed(SurfaceHolder holder) – Called when the SurfaceHolder’s surface is destroyed.

public void onVisibilityChanged(boolean visible) – Called when the visibility of the wallpaper changes. The parameter visible is true when the wallpaper is visible, false otherwise. Here we have to start and stop drawing depending on the visibility.

public void onTouchEvent(MotionEvent event) – Called when touch events occur. There are different events; some of them are android.wallpaper.tap (when the user taps the home screen) and android.home.drop (when the user drops an icon on the home screen).

Our sample application does not use onTouchEvent, but a live wallpaper can use it to implement different effects.

Normally, while creating a wallpaper, we override these methods and start and stop the drawing in onVisibilityChanged. For drawing the wallpaper we create a separate thread or a runnable routine. The following code snippet creates a runnable routine:

private final Runnable drawrunnable = new Runnable() {
	public void run() {
		draw();
	}
};
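Starting and stopping the drawing from onVisibilityChanged then looks roughly like this (a sketch, not the complete sample code; isVisible and handler are fields of the engine):

@Override
public void onVisibilityChanged(boolean visible) {
	isVisible = visible;

	if (visible) {
		handler.post(drawrunnable);              // start (or resume) drawing
	} else {
		handler.removeCallbacks(drawrunnable);   // cancel scheduled redraws
	}
}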

The draw method actually draws the wallpaper. In this method we re-post the drawrunnable routine every 5 seconds using a Handler. The draw method implementation (from the sample application) is:

private void draw(){
	final SurfaceHolder holder = getSurfaceHolder();
	Canvas canvas = null;

	handler.removeCallbacks(drawrunnable);

	try
	{
		canvas = holder.lockCanvas();

		drawPattern(canvas);
	}
	finally
	{
		if (canvas != null)
			holder.unlockCanvasAndPost(canvas);
	}

	if(isVisible)
	{
		handler.postDelayed(drawrunnable,DRAW_DELAY);
	}
}

Sample Application

The sample application of this article is a simple live wallpaper that draws three different patterns: a spirograph, a circle pattern and a Sierpinski triangle. The user can choose the pattern to draw. The pattern is drawn with random values every 5 seconds.

Following are some of the screenshots of the application:

Spirograph, circle pattern and Sierpinski triangle screenshots

Hope this article is helpful.

Happy Live Wallpaper coding! 🙂