Creating a Great Mashup using Windows Azure

By Brian Hitney

This post details features used by Earthquake Explorer, a Windows 8 app that displays earthquake information on Bing Maps. Earthquake Explorer is based on the Earthquakes mashup starter kit on GitHub.

Creating a mashup is pretty easy in Windows 8 – creating a great one requires a bit more work. In this case, I’m working with the USGS earthquake GeoJSON feeds: easy-to-use feeds that contain earthquake information for the past hour, day, week, or month. Overall, it’s a pretty ideal feed. But if you search for and install virtually any earthquake app in the store, you’ll see the limitations quickly. While some apps allow lightweight filtering of the data, there really isn’t a concept of search, pagination, or server-side filtering.
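To make the feed shape concrete, here’s a rough sketch of a single GeoJSON feature and how we might flatten it into the handful of fields an earthquake app cares about. The sample object below is hand-written for illustration (the real feed carries many more properties), and `flattenQuakes` is just a helper name I made up:

```javascript
// A hand-written sample in the shape of a USGS GeoJSON summary feed entry.
var sampleFeed = {
    type: "FeatureCollection",
    features: [{
        type: "Feature",
        properties: { mag: 5.3, place: "10km SSW of Los Angeles, CA", time: 1357034400000 },
        geometry: { type: "Point", coordinates: [-118.25, 34.05, 10.0] }
    }]
};

// Flatten each feature into a simple record. Note that GeoJSON orders
// coordinates as [longitude, latitude, depth].
function flattenQuakes(feed) {
    return feed.features.map(function (f) {
        return {
            place: f.properties.place,
            mag: f.properties.mag,
            timestamp: new Date(f.properties.time),
            longitude: f.geometry.coordinates[0],
            latitude: f.geometry.coordinates[1],
            depthKm: f.geometry.coordinates[2]
        };
    });
}
```

Records in that flat shape are exactly what we’ll want to push into a table behind a search/filter API.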

To solve these limitations, we’d typically need to build out a fairly robust back-end server; fortunately, Windows Azure Mobile Services makes this really easy, so let’s get started.

The first thing you need is a Windows Azure account (here’s a link to get a free trial). To get the pricing out of the way first: Windows Azure Mobile Services currently offers a free tier (even beyond your trial); however, you’ll quite likely want a database behind the scenes, and that starts at $4.95/mo. EDIT: Good news! SQL Databases now offers a free tier for databases up to 20 MB. Multiple Mobile Services can share the same database, as they are all schematized.



After a few moments your service is set up and ready to go. You will need to create a new SQL Database if you don’t already have one. The default page of your WAMS instance should look like this, and it’s full of great info for creating new apps or for leveraging WAMS in existing ones:


Let’s click on the data link highlighted above to create a table to store our data. We’ll call it ‘earthquake’, and we’ll configure it so that only scripts and admins can modify the table while all users can read it, like so:


By default, WAMS is configured with dynamic schema, which means we can start inserting right away and the columns will be added dynamically to fit the objects we pass in.  This is perfect for development.  The next thing we’ll want to do is go over to the scheduled jobs page:
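To illustrate what dynamic schema means in practice, here’s a small sketch in plain JavaScript (an illustration, not an actual WAMS call): the columns created are simply the keys of the objects you insert, and an insert carrying a new property grows the table to fit it.

```javascript
// With dynamic schema, columns come from the keys of inserted objects.
var firstInsert  = { place: "Los Angeles", mag: 5.3 };
var secondInsert = { place: "Tokyo", mag: 6.1, depth: 35.2 }; // 'depth' is new

// Columns after the first insert are just that object's keys.
var columnsAfterFirst = Object.keys(firstInsert);

// A later insert with an unseen key adds a column; existing ones are reused.
var columnsAfterSecond = columnsAfterFirst.slice();
Object.keys(secondInsert).forEach(function (key) {
    if (columnsAfterSecond.indexOf(key) === -1) {
        columnsAfterSecond.push(key);
    }
});
```

This is why dynamic schema is so handy during development: you can evolve your objects freely and the table keeps up.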


After clicking create a scheduled job, give it a name like ‘fetchquakes’ and set it to run on demand only. Once the job is created, click on it to view its details; you’ll see a fairly empty screen. We don’t care about the configuration and schedule right now; we simply want to add a script that runs as part of our job. Click the script button:


Let’s start with a simple script.   Ideally we’ll go out to the service and use live data, but this will get us started:

function fetchquakes() {
    // Insert a hard-coded earthquake so we can verify the table is wired up.
    var eq = {place: "Los Angeles", mag: 5.3, timestamp: new Date()};
    var quakesTable = tables.getTable('earthquake');
    quakesTable.insert(eq);
}

If we run this script (by clicking the Run Once button near the bottom/middle of the screen) and then navigate over to our table in the Data tab, we’ll see the row in the table:


The nice part, too, is that WAMS did a good job of inferring the column types:
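As a rough mental model of that type inference, here’s a sketch of how JavaScript values might map to SQL column types. The exact SQL type names below are my assumption for illustration, not something pulled from the portal or the official docs:

```javascript
// Rough sketch of JavaScript-to-SQL type inference on first insert.
// The SQL type names here are assumptions, shown for illustration only.
function inferSqlType(value) {
    if (value instanceof Date) return "datetimeoffset";
    switch (typeof value) {
        case "string":  return "nvarchar(max)";
        case "number":  return "float";
        case "boolean": return "bit";
        default:        return "nvarchar(max)"; // other values stored as text
    }
}
```

So for our sample record, ‘place’ lands in a string column, ‘mag’ in a floating-point column, and ‘timestamp’ in a date/time column.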


However, we could go into the database itself and change the column definitions if we wanted to. We’ll wrap up here to keep this post from becoming a book; in the next post, we’ll dive a bit deeper into consuming the live feed and storing it in the database.
