An ADO.NET Data Services Tutorial

At some point in the past few years you will have heard of REST services. You will have seen various articles talking about the merits of REST and how it's going to save the planet from global warming, in addition to being an architecture designed specifically for the web, the way it was meant to be. This is great, but there are generally very few tools and frameworks that help you create REST services and work with them. ADO.NET Data Services is therefore the messiah - it lets you expose data in a truly RESTful pattern, and work with it as well, without the headaches and hair loss.

This tutorial will give you an introduction to ADO.NET Data Services and touch upon various operations and features in it. Some familiarity with REST, Linq and the ADO.NET Entity Framework would be helpful to fully understand this tutorial, but it's not entirely necessary.


Contents

  • Page 1: Introduction and setting up your first Data Service
  • Page 2: Playing with querystring parameters, expressions, operations
  • Page 3: Service Operations (custom functions)
  • Page 4: Using GET, POST, MERGE and DELETE for CRUD operations
  • Page 5: Using Linq to Data Services
  • Page 6: Intercepting Data Service requests (QueryInterceptor and ChangeInterceptor)
  • Page 7: Security, exceptions and tracing

What are ADO.NET Data Services?

ADO.NET Data Services generally sits on top of the ADO.NET Entity Framework and exposes entities through a web service. It exposes these entities in a RESTful way (you can view them in a browser!), which means that the entities being exposed need to be serialized to a certain format. At present, ADO.NET Data Services can expose data as JSON or as an ATOM feed. In addition, it also maps HTTP verbs to CRUD operations in the Entity Framework or your database layer.

GET    -> SELECT (Read)
POST   -> INSERT (Create)
MERGE  -> UPDATE (Update)
DELETE -> DELETE (Delete)

When exposing your data, it also allows various querystring parameters to help sort and filter the data being viewed. Further, you can introduce custom methods for business logic to be exposed via the same URL scheme, and intercept requests and changes before they reach the data source. We'll touch upon all of these topics in this tutorial.

Creating an ADO.NET Data Service

To actually get your first ADO.NET Data Service running, you will have a bit of work to do. For a variety of reasons, this process is also a little awkward, but it gets smoother later. The steps involve creating an Entity Data Model, then creating a Data Service to sit on top of it, and then configuring your browser to display the data properly.

  1. Download this database script file and run it against your database.
  2. Once the database is ready, create a "WCF Service Application" in Visual Studio. Right-click on the project and add a new item. Add an "ADO.NET Entity Data Model" and call it PublisherModel.edmx.
  3. Choose "Generate from Database" and click Next.
  4. Supply it with the right server name and authentication credentials, and the database name PublishingCompany. "Save entity connection settings in App.Config as" should be PublishingCompanyEntities.
  5. In the next dialog box, choose all of the options (tables, views, and stored procedures) and the model will be generated for you.
  6. You have now created an Entity Data Model, a representation of your database. You can now create an ADO.NET Data Service. Right click on your project, add a new item, and this time choose "ADO.NET Data Service" from the dialog.
  7. Name it PublishingDataService.svc. Visual Studio will create a service and you will be presented with a somewhat sparse page. Have a look at the TODO and the comments on the page.
  8. Change the class declaration to specify the name of the entities:

     public class PublishingDataService : DataService<PublishingCompanyEntities>

  9. Delete the comments from the InitializeService method and paste this in:

     config.SetEntitySetAccessRule("*", EntitySetRights.All);
     config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);


What you're essentially doing here is telling ADO.NET Data Services that you want to work with entities from the PublishingCompany database, and that you want to give full permissions on everything - this is temporary; we will touch upon it again later. But you're almost ready now! Press F5 and browse to PublishingDataService.svc, where you are presented with some XML.

If you observe the XML, you'll notice that these are the entities, or tables, that the service has made available to you. Because this is RESTful, all you have to do is append an entity name to the current URL.
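For reference, the service document is a plain AtomPub workspace listing. Trimmed down, it looks something along these lines (the exact XML on your machine may differ slightly):

```xml
<service xmlns:atom="http://www.w3.org/2005/Atom"
         xmlns="http://www.w3.org/2007/app"
         xml:base="http://localhost:49958/PublishingDataService.svc/">
  <workspace>
    <atom:title>Default</atom:title>
    <collection href="Article">
      <atom:title>Article</atom:title>
    </collection>
    <collection href="Author">
      <atom:title>Author</atom:title>
    </collection>
    <collection href="Payroll">
      <atom:title>Payroll</atom:title>
    </collection>
  </workspace>
</service>
```

Each collection's href is the segment you append to the service URL.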

Change the URL to

http://localhost:49958/PublishingDataService.svc/Article

The port number assigned to this application was 49958 on my PC, which I will use in my URLs. You will need to replace it with the port number assigned to you by Visual Studio.

As soon as you go to the Article URL, instead of seeing data, you will most likely be presented with a page with almost no information on it. This is your browser's feed-reading view trying to "helpfully" render the page for you - Firefox does this by default, and Internet Explorer shows an equivalent screen.

Do a view source on the page, and you will see that it does contain all the XML data. The reason it's not being displayed is that the data relevant to us is inside the m:properties and d:* nodes in the XML - although the source of the page indicates that this is an actual ATOM feed, browsers don't know how to display this 'custom' data, so they simply hide it. To work with ADO.NET Data Services, you'll need to be able to display the XML from the feeds directly. Unfortunately, in Firefox there isn't a straightforward way to do this, so I suggest that you use Internet Explorer for the duration of this tutorial.

In IE, go to Tools > Internet Options > Content Tab > Feeds Section > Settings button > Uncheck "Turn on feed reading view".

Now, if you reload the URL (in IE), you'll be able to see the XML from the feed and all the article data in it.

That's the end of the awkward bit, things get a lot better now. Next, you'll play with the querystring to filter and manipulate the data.


Playing with the querystring parameters

In your browser, go back to

http://localhost:49958/PublishingDataService.svc/Article

And stare at the XML for a while. You'll notice several link nodes.

<link rel="edit" title="Author" href="Author(1)" />

<link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Article" 
   type="application/atom+xml;type=feed" title="Article" href="Author(1)/Article" />

<link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Payroll" 
   type="application/atom+xml;type=feed" title="Payroll" href="Author(1)/Payroll" />

The href attributes here indicate that you can navigate along the properties of this Author object. In the example given here, you can navigate straight to AuthorID=1 and view data only for that author. You can also go straight to the Article data for AuthorID=1, or the Payroll data for that author. Try it now - browse to these URLs:

http://localhost:49958/PublishingDataService.svc/Author(1)
http://localhost:49958/PublishingDataService.svc/Author(2)/Article
http://localhost:49958/PublishingDataService.svc/Author(3)/Payroll

You can also view specific properties of individual entities via navigation. To look at just the AuthorID:

http://localhost:49958/PublishingDataService.svc/Author(1)/AuthorID

To look at Author 1's salary only:

http://localhost:49958/PublishingDataService.svc/Author(1)/Payroll(8)/Salary

Note that the Payroll ID needs to be specified as well. Appending $value to the end of that will give you an unformatted value.

http://localhost:49958/PublishingDataService.svc/Author(1)/Payroll(8)/Salary/$value

You can tell from the URLs that you're now performing GET operations in ADO.NET Data Services, which correspond to SELECT statements against the Entity Framework. It's also allowing you to navigate across the relationships between the entities in an intuitive way. At the URLs you tried above, you were able to view all the articles in a single feed document by doing an HTTP GET request. When there are thousands or millions of rows in the database, such queries become large and unwieldy, and finding the specific information you want becomes difficult.

For this purpose, there are a large number of querystring expressions available that allow you to sort and filter the data you want. All of the querystring parameters specific to ADO.NET Data Services start with a $ symbol. For example, to get the top 2 articles, use $top.

http://localhost:49958/PublishingDataService.svc/Article?$top=2

You can skip a certain number of rows using $skip. Together, these are quite useful - $top and $skip can effectively be used for paging data.

http://localhost:49958/PublishingDataService.svc/Author?$top=2&$skip=1

To order by the ArticleIDs, descending, use $orderby.

http://localhost:49958/PublishingDataService.svc/Article?$orderby=ArticleID desc

To filter the Authors where the first name is Joe, use $filter.

http://localhost:49958/PublishingDataService.svc/Author?$filter=FirstName eq 'Joe'

Note that I used eq within the $filter parameter. Similarly, a numeric comparison calls for gt or lt:

http://localhost:49958/PublishingDataService.svc/Author?$filter=AuthorID gt 6

Other logical operators include ne (!=), ge (>=), le (<=), and (&&), or (||), not (!). You can also perform arithmetic in the querystring if needed, using add, sub, mul, div and mod. Remember that because the data is serialized, the field names that you put in the querystring are case sensitive. This is by no means the entire list of operators; there are more for you to play with, especially in the area of string parsing. You can have a quick glance through a full list of query functions (string, date, maths, type) and a full list of query operators (logical operators, maths). Without a doubt, $filter is a very important parameter because it supports so many operators and functions, giving you flexibility with what you query.
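To get a feel for combining these, here are a couple of illustrative URLs (sketches against this tutorial's database - adjust the field names if yours differ):

```
http://localhost:49958/PublishingDataService.svc/Author?$filter=FirstName eq 'Joe' and AuthorID gt 2
http://localhost:49958/PublishingDataService.svc/Article?$filter=ArticleID mod 2 eq 0&$orderby=Title
```

The first combines a string comparison with a numeric one using and; the second filters on an arithmetic expression and orders the result.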

There's also $expand, which loads related information that wouldn't normally be loaded with the entity. For example, when you view the Author data, you don't see Payroll data, only a reference to it. You can expand it:

http://localhost:49958/PublishingDataService.svc/Author?$expand=Payroll

If you now view the XML, it will contain the Author's data as well as the Payroll data.
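$expand also accepts multiple navigation properties separated by commas, and nested paths with a slash. For example (illustrative sketches using this database's entities):

```
http://localhost:49958/PublishingDataService.svc/Author?$expand=Payroll,Article
http://localhost:49958/PublishingDataService.svc/Article?$expand=Author/Payroll
```

The first pulls in both the Payroll and Article data for each author in one request; the second expands each article's author and, within that, the author's payroll.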

Playing with the querystring is fun and can provide hours of entertainment (especially on Friday nights), so go ahead and try various combinations of query string expressions right now to get a good feel of what's possible with it.

There's more you can do with the querystring, so we'll look at calling your very own custom methods in the querystring next.


Service Operations (custom functions in the querystring)

While querystring expressions offer you lots of ways of filtering the data, not all querying logic can be represented in the querystring. If your logic is quite complicated or you need to call another method for whatever reason, you'll need to create what is known as a service operation. You can create your own functions in your code and call them using the querystring, which is very convenient. Let's do this now. In this example, we'll simply get all the authors. Open the codebehind file for PublishingDataService.svc and add this method to the class:

[WebGet]
public IQueryable<Author> GetAuthors()
{
   IQueryable<Author> query = from au in CurrentDataSource.Author
                      select au;
   return query;
}

It's not a very complicated example, but it's enough to give you the idea that you could do anything in there. A few things to keep in mind about service operations:

  • You need to attribute your function with [WebGet] so that it is available via a URL (HTTP GET).
  • The return type needs to be IQueryable<T> - IQueryable<T> is much like IEnumerable<T> and is what gets used when a URL based query is performed.
  • Any arguments should be limited to the basic types such as string or int, because they need to be represented in a URL.
  • To access the Entity Framework data context, use CurrentDataSource.

Once you've added the function, build the solution and browse to it. The querystring representation for this is easy to guess:

http://localhost:49958/PublishingDataService.svc/GetAuthors

Now, add another function to the codebehind class, and this time, give it an argument.

[SingleResult]
[WebGet]
public IQueryable<Author> GetAuthorByFirstName(string firstName)
{
   IQueryable<Author> query = from au in CurrentDataSource.Author
                        where au.FirstName == firstName
                        select au;

   return query.Take(1);
}

Again, we're using WebGet and IQueryable<T> of the type we want to return. Here, we've added [SingleResult] to indicate that this function will only return one Author object. If you're not returning a single result, don't add that attribute. To call this new function, browse to the function name, and add the argument as a querystring parameter.

http://localhost:49958/PublishingDataService.svc/GetAuthorByFirstName?firstName='Joe'

Note how the arguments become the querystring parameters. Also note that string values must be enclosed in single quotes. Service operations therefore let you incorporate much more complicated logic but call it in a simple, straightforward way.

Next, the best bit about ADO.NET Data Services - updating, deleting and adding data in a truly RESTful way.


Using GET, POST, MERGE and DELETE for CRUD operations

One of the sad things about REST is that of the various websites cited as being RESTful, most don't actually do it properly. In many cases, these sites simply expose an update method as a GET URL (updatearticle?articleid=29&title='blah'). Further, not many web servers have verbs like MERGE and DELETE enabled by default! Lucky for us, IIS isn't that crippled and we can perform these operations straight out of the box. Also worthy of note: if you've mapped the INSERT, DELETE and UPDATE operations for an entity to stored procedures, then performing updates via HTTP verbs will also invoke those stored procedures. This will keep your DBAs happy.

To do this, though, we'll need to create a 'client' application that makes the HTTP verb calls. For this tutorial, we will create an ASP.NET web application as a client, although you could create a Windows Forms app and follow along - the code stays pretty much the same.

Using HTTP GET to read data

Right click on your solution and add a new project of type ASP.NET Web Application; you can call it ADODSClient. Once Default.aspx is ready, add a huge textbox and two buttons to it.

   <asp:TextBox ID="textBoxOutput" runat="server" TextMode="MultiLine" Width="100%" Height="500"></asp:TextBox>
   <br />
   <br />
   <asp:Button ID="buttonGet" runat="server" OnClick="ButtonGet_Click" Text="Get (ATOM)" />
   <asp:Button ID="buttonGetJson" runat="server" OnClick="ButtonGetJson_Click" Text="Get (JSON)" />

I mentioned earlier that ADO.NET Data Services can expose data as an ATOM feed or as JSON. By default, it serves ATOM. To get JSON, you need to set the HTTP Accept header to application/json; to get ATOM, either leave it blank or explicitly specify application/atom+xml. The web service will return data in the format specified in the Accept header, and will respond with an error status code if you ask for a format that isn't supported.

We're going to get Article data in both formats, which is why there are two buttons; the textbox is to view the resulting data. To actually get the data out, you'll need to use an HttpWebRequest. Place this code in the codebehind file for Default.aspx

protected void ButtonGet_Click(object sender, EventArgs e)
{
   HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Article");

   WebResponse response = wr.GetResponse();
   if (response.ContentLength > 0)
   {
      System.IO.StreamReader sr = new System.IO.StreamReader(response.GetResponseStream());
      textBoxOutput.Text = sr.ReadToEnd();
   }
}

Build, and run the web application. You will also need the data service project running so that HttpWebRequest can access it and request data. In the code above, the .Method property of HttpWebRequest hasn't been specified - it is GET by default. The Accept header has not been specified either - the service will default to ATOM. When you click the ATOM button, the textbox gets populated with the XML returned. In a real world application, you would either load the string into an XmlDocument and look for the nodes you want, or you would request the URL of the property or entity you want directly.

Now let's have a look at what JSON looks like. Handle the other button's click event; the code is exactly the same as above, except for the Accept property now being specified.

protected void ButtonGetJson_Click(object sender, EventArgs e)
{
   HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Article");
   wr.Accept = "application/json";

   WebResponse response = wr.GetResponse();
   if (response.ContentLength > 0)
   {
      System.IO.StreamReader sr = new System.IO.StreamReader(response.GetResponseStream());
      textBoxOutput.Text = sr.ReadToEnd();
   }
}

If you build and run it now, the second button produces a JSON output in the textbox.

Using HTTP POST to INSERT data

When it comes to modifying or inserting data, the actual body of the request becomes important. When doing a POST, the body must be in either JSON or ATOM format and must represent an entry as you'd see it in a GET request. To see an example, browse to the following URL and view its source:

http://localhost:49958/PublishingDataService.svc/Author

The <m:properties> node contains all the data specific to the Author being displayed. The actual data is in nodes prefixed with a 'd' and the whole thing is contained inside a <content> node inside an <entry> node. This is what we will need to send in the POST body.

Add a new page to the web application, call it AddAuthor.aspx. Give it a textbox for FirstName, a textbox for LastName and another large textbox to write the result out to. Also, add two buttons to the page - we will perform a POST in both the ATOM and JSON formats.

First Name: <asp:TextBox ID="textBoxFirstName" runat="server"></asp:TextBox>
<br />
Last Name: <asp:TextBox ID="textBoxLastName" runat="server"></asp:TextBox>
<br />
<asp:Button ID="buttonCreateAtom" runat="server" OnClick="ButtonCreateAtom_Click" Text="Create (ATOM)" />
<asp:Button ID="buttonCreateJson" runat="server" OnClick="ButtonCreateJson_Click" Text="Create (JSON)" />
<br /><br />
<asp:TextBox ID="textBoxResult" runat="server" TextMode = "MultiLine" Width="100%" Height="500" />

In the click event for the ATOM button, start by creating the payload by concatenating values from the FirstName and LastName textboxes.

string payload = String.Concat("<entry xmlns:d=\"http://schemas.microsoft.com/ado/2007/08/dataservices\" ",
                                "xmlns:m=\"http://schemas.microsoft.com/ado/2007/08/dataservices/metadata\" ",
                                "xmlns=\"http://www.w3.org/2005/Atom\">",
                                   "<content type=\"application/xml\">",
                                      "<m:properties>",
                                         "<d:FirstName>", textBoxFirstName.Text, "</d:FirstName>",
                                         "<d:LastName>", textBoxLastName.Text, "</d:LastName>",
                                      "</m:properties>",
                                   "</content>",
                                "</entry>");

The entry node also includes the namespaces, but the properties node does not contain a <d:AuthorID> node, since we are performing an INSERT - the new ID will be part of the result. Next, create an HttpWebRequest, but set the .Method property to POST. The Accept header needs to be set, as does the Content-Type, so that we can specify the format we are submitting in and the format we expect in the result.

HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Author");
wr.Method = "POST";
wr.Accept = "application/atom+xml";
wr.ContentType = "application/atom+xml";

Because this is a POST, the payload data needs to be written to the request stream as binary, so a UTF8Encoding object is used to get the binary data out. Write it out, then get a response and write results to the textbox.

UTF8Encoding encoding = new UTF8Encoding();
wr.ContentLength = encoding.GetByteCount(payload);

using (Stream requestStream = wr.GetRequestStream())
{
   requestStream.Write(encoding.GetBytes(payload), 0, encoding.GetByteCount(payload));
}

HttpWebResponse response = wr.GetResponse() as HttpWebResponse;
using (Stream responseStream = response.GetResponseStream())
{
   using (StreamReader reader = new StreamReader(responseStream))
   {
      textBoxResult.Text = reader.ReadToEnd();
   }
}

Build and load the page, fill in some values for the new author's first and last name, and click the Create (ATOM) button. The result textbox should fill up with XML in ATOM format, of which one of the nodes is <d:AuthorID>, containing the AuthorID of our new author.

To do this with the JSON format, handle the other button's click event, and use this code:

string payload = String.Concat("{\"__metadata\": {\"Uri\": \"/Author/\", ",
        "\"Type\": \"PublishingModel.Author\"}, ",
        "\"FirstName\": \"", textBoxFirstName.Text,
        "\", \"LastName\": \"", textBoxLastName.Text, "\"",
        "}");

HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Author");
wr.Method = "POST";
UTF8Encoding encoding = new UTF8Encoding();
wr.ContentLength = encoding.GetByteCount(payload);
wr.Accept = "application/json";
wr.ContentType = "application/json";

using (Stream requestStream = wr.GetRequestStream())
{
    requestStream.Write(encoding.GetBytes(payload), 0, encoding.GetByteCount(payload));
}

HttpWebResponse response = wr.GetResponse() as HttpWebResponse;
using (Stream responseStream = response.GetResponseStream())
{
    using (StreamReader reader = new StreamReader(responseStream))
    {
        textBoxResult.Text = reader.ReadToEnd();
    }
}

Again, load the page and click the JSON button - the output is in JSON and contains the AuthorID of the newly created author. The JSON format is not exactly readable, and working with it is actually quite painful - you will spend most of your time figuring out where to put the quotes and braces and still get it wrong - but it's included here to demonstrate how it can be done. You would typically use the JSON format if you're working extensively with third-party Javascript libraries or dealing heavily with AJAX in your application. For the rest of this tutorial, however, I will only use ATOM. If you wish to use JSON, that's left as an exercise for you.

Using HTTP MERGE to UPDATE data

Once you've got your head around POST, MERGE is easy enough. The same principles apply - only send what needs to be sent. In the case of POST, that was everything except the ID field. In the case of MERGE, you send only the fields that need updating. The ID field will not change, so it isn't sent. Instead, as before, you perform the operation against a specific entity, which means that the URL requested represents the Article you will update in this example, such as /Article(23).

To make things a bit more interesting here, let's also add some requirements to the page. The page should first let the user choose from a set of articles. Next, let the user load the article, modify it and then click Update. Don't groan, you're doing this to combine bits of the knowledge gained so far in this tutorial.

Add a new page to the web project, call it UpdateArticle.aspx. It needs a DropDownList for the articles, a load button, two textboxes (title, body) and an update button.

<asp:DropDownList ID="ddlTitles" runat="server">
</asp:DropDownList><asp:Button ID="buttonLoadArticle" runat="server" OnClick="ButtonLoadArticle_Click" Text="Load" />
<br />
<br />
Title: <asp:TextBox ID="textBoxTitle" runat="server" Width="500"></asp:TextBox><br />
Body: <asp:TextBox ID="textBoxBody" runat="server" TextMode="MultiLine" Width="100%" height="500"></asp:TextBox><br />
<br /><br />
<asp:Button ID="buttonUpdate" runat="server" Text="Update" OnClick="ButtonUpdate_Click" />

First step - populate the DropDownList with article titles. This is a simple GET request, followed by some XML parsing:

protected void Page_Load(object sender, EventArgs e)
{
    if (!Page.IsPostBack)
    {
        //1. Load the DDL with values.
        //Get against the Article entities 
        string articleXml = String.Empty;
        HttpWebRequest wr =
            (HttpWebRequest) HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Article");

        WebResponse response = wr.GetResponse();
        if (response.ContentLength > 0)
        {
            System.IO.StreamReader sr = new System.IO.StreamReader(response.GetResponseStream());
            articleXml = sr.ReadToEnd();
        }

        XmlDocument doc = new XmlDocument();
        doc.InnerXml = articleXml;

        //To handle the XML namespaces
        XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
        nsmgr.AddNamespace("base", "http://localhost:49958/PublishingDataService.svc/");
        nsmgr.AddNamespace("d", "http://schemas.microsoft.com/ado/2007/08/dataservices");
        nsmgr.AddNamespace("m", "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata");
        nsmgr.AddNamespace("def", "http://www.w3.org/2005/Atom");

        foreach (XmlNode idNode in doc.SelectNodes("/def:feed/def:entry", nsmgr))
        {
            string id = idNode.SelectSingleNode("./def:content/m:properties/d:ArticleID", nsmgr).InnerText;
            string title = idNode.SelectSingleNode("./def:content/m:properties/d:Title", nsmgr).InnerText;
            ddlTitles.Items.Add(new ListItem() {Text = title, Value = id});
        }
    }
}

What's not nice about this is that all of the Article data is requested. All we really want here are the IDs and Titles of the Articles, but we're getting the Body nodes as well. There isn't an easy way to request only specific properties in this version of ADO.NET Data Services, so we'll have to live with this for now.

Load and run the page, and the DropDownList should now be populated with the article titles. Moving on to the load button next: it's the same process as before - request an article by its ID, and parse the XML that comes back.

string articleId = ddlTitles.SelectedValue;
string title = String.Empty;
string body = String.Empty;
            
HttpWebRequest wrTitle = (HttpWebRequest)HttpWebRequest.Create
   ("http://localhost:49958/PublishingDataService.svc/Article(" + articleId + ")/Title/$value");

WebResponse titleResponse = wrTitle.GetResponse();
if (titleResponse.ContentLength > 0)
{
   System.IO.StreamReader sr = new System.IO.StreamReader(titleResponse.GetResponseStream());
   title = sr.ReadToEnd();
}

HttpWebRequest wrBody = (HttpWebRequest)HttpWebRequest.Create
   ("http://localhost:49958/PublishingDataService.svc/Article(" + articleId + ")/Body/$value");
wrBody.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);

WebResponse bodyResponse = wrBody.GetResponse();
if (bodyResponse.ContentLength > 0)
{
   System.IO.StreamReader sr = new System.IO.StreamReader(bodyResponse.GetResponseStream());
   body = sr.ReadToEnd();
}

textBoxTitle.Text = title;
textBoxBody.Text = body;

Instead of getting all the data back this time, we made two HttpWebRequests to get each value out individually using $value. Again, run the page, choose a title, and press load. The textboxes should get populated with the article title and body.

To update, the process is just like POST, only that the .Method property is different, as are the payload contents. Oh, and we're going to be lazy and write the output to the body textbox.

protected void ButtonUpdate_Click(object sender, EventArgs e)
{
   string payload = String.Concat("<entry xmlns:d=\"http://schemas.microsoft.com/ado/2007/08/dataservices\" ",
                                  "xmlns:m=\"http://schemas.microsoft.com/ado/2007/08/dataservices/metadata\" ",
                                  "xmlns=\"http://www.w3.org/2005/Atom\">",
                                     "<content type=\"application/xml\">",
                                        "<m:properties>",
                                           "<d:Title>", textBoxTitle.Text, "</d:Title>",
                                           "<d:Body>", textBoxBody.Text, "</d:Body>",
                                        "</m:properties>",
                                     "</content>",
                                  "</entry>");


   HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create("http://localhost:49958/PublishingDataService.svc/Article(" + ddlTitles.SelectedValue + ")");
   wr.Method = "MERGE";
   UTF8Encoding encoding = new UTF8Encoding();
   wr.ContentLength = encoding.GetByteCount(payload);
   wr.Accept = "application/atom+xml";
   wr.ContentType = "application/atom+xml";

   using (Stream requestStream = wr.GetRequestStream())
   {
      requestStream.Write(encoding.GetBytes(payload), 0, encoding.GetByteCount(payload));
   }

   HttpWebResponse response = wr.GetResponse() as HttpWebResponse;
   textBoxBody.Text = response.StatusCode + "," + response.StatusDescription;
}

That's it, run the code and try updating the article. You'll see the textbox populated with "NoContent" - this indicates that the request was successful.

Using HTTP DELETE to DELETE data

DELETE is a simple operation. You don't need to send a payload across, just perform an HTTP DELETE request against the entity's URL. Because we're now being lazy, add a button next to the DropDownList on UpdateArticle.aspx.

<asp:DropDownList ID="ddlTitles" runat="server"></asp:DropDownList>
<asp:Button ID="buttonLoadArticle" runat="server" OnClick="ButtonLoadArticle_Click" Text="Load" />
<asp:Button ID="buttonDelete" runat="server" OnClick="ButtonDelete_Click" Text="DELETE!" />
...

Handle its click event and again, write the output to the textbox.

HttpWebRequest wr = (HttpWebRequest)HttpWebRequest.Create
   ("http://localhost:49958/PublishingDataService.svc/Article(" + ddlTitles.SelectedValue + ")");
wr.Method = "DELETE";

HttpWebResponse response = (HttpWebResponse)wr.GetResponse();
textBoxBody.Text = response.StatusCode.ToString();

And there you go - you've implemented CRUD operations using HTTP verbs and have officially earned bragging rights. Next, we'll talk about Linq and how it can make working with ADO.NET Data Services much, much easier.


Using Linq to Data Services

Now I give you the bit of news that's going to make you want to hit me. Everything we did in the previous section could have been done with a few simple lines of code involving Linq and entity expressions. The reason I made you go through all of that was so you'd understand how interacting with ADO.NET Data Services works at the most fundamental level: raw requests across the wire.

In the previous section, we didn't set any service references to the Publishing service, all we did was RESTfully reference the URL for the entities we wanted. In order to use Linq to Data Services, however, you need to set a service reference. Because ADO.NET Data Services are actually WCF services, a service reference from your solution is possible because the metadata is available. Go ahead and do that now.

Right-click on your project and choose Add Service Reference. Give it the URL to your Publishing Data Service. If you expand the entities node, you can see the entities available to you.

[wcf_reference.jpg]

This is what you will be using in your client application. We'll now revisit the pages we made and do the same things, this time using the service reference.

Using Linq to read data

Go to Default.aspx and add a new button with a click event to it.

<asp:Button ID="buttonGetLinq" runat="server" OnClick="ButtonGetLinq_Click" Text="Get LINQ" />

The generated entity container class, PublishingCompanyEntities, serves as our service proxy here. Its constructor takes the URI of the data service that we're using.

protected void ButtonGetLinq_Click(object sender, EventArgs e)
{
   PublishingCompanyEntities service = new PublishingCompanyEntities
             (new Uri("http://localhost:49958/PublishingDataService.svc"));

   var restQuery = from ar in service.Article
                   select ar;

   Response.Write(restQuery.ToString());

   foreach (Article ar in restQuery)
   {
      textBoxOutput.Text = textBoxOutput.Text + ar.Title;
   }
}

There's a reason for doing the restQuery.ToString() as you'll see in a moment - build and run the page, then click on the new Linq button to see what happens. The textbox is populated with titles from the articles, and you also see a URL written to the page.

What's happening here? Although you've constructed a Linq query to get the data, the query gets converted into a URL with query parameters behind the scenes to facilitate your request. This means that any valid Linq query you write in there will get converted to a URL and an HttpWebRequest will be used behind the scenes to get your data back. You can think of it as a Linq-To-Rest feature, of sorts, but remember that the operations you perform are specifically limited to what's available in the querystring operations that we talked about earlier.
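To make that translation concrete, here's a sketch (the entity and property names come from this tutorial's model; the exact URL your client library emits may differ slightly):

```csharp
var restQuery = from ar in service.Article
                where ar.Title.Contains("a")
                orderby ar.Title
                select ar;

// Writing the query out reveals the URL it maps to, roughly:
// http://localhost:49958/PublishingDataService.svc/Article()?$filter=substringof('a',Title)&$orderby=Title
Response.Write(restQuery.ToString());
```

Notice that Contains() maps to the substringof() querystring function; every operator in your Linq query has to have such a mapping.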

Go ahead and try a few more Linq queries in place of the existing one, try filtering and ordering the data you get back, and look at the URL that is written.

var restQuery = (from ar in service.Article
                 where ar.Title.Contains("a")
                 select ar).Skip(1).Take(1);
var restQuery = from ar in service.Article
                where ar.Author.FirstName == "Joe"
                select ar;
var restQuery = from au in service.Author
                where au.Article.Count > 2
                select au;   // Should throw an exception

Not all of these queries will work. The last query above breaks because it cannot be translated into a valid querystring expression for that author's related data, so an exception is returned from the web service. You have to be careful when using this, and ensure you aren't writing queries that the querystring operations cannot map to.
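One defensive pattern is to catch the exception the client library throws when a query fails. This is a sketch; DataServiceQueryException is the type thrown by the System.Data.Services.Client library for failed queries:

```csharp
try
{
   var authors = (from au in service.Author
                  where au.Article.Count > 2
                  select au).ToList();
}
catch (DataServiceQueryException ex)
{
   // The response carries the HTTP status code returned by the service
   textBoxOutput.Text = "Query failed (" + ex.Response.StatusCode + "): " + ex.Message;
}
```

Calling ToList() forces the request to execute inside the try block, rather than later when the results are enumerated.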

Using Linq to insert data

Yep, there's an easy way to insert data too. This method uses Entity Expressions and is quite simple. Add a new button to NewAuthor.aspx for our Linq operation.

<asp:Button ID="buttonCreateLinq" runat="server" OnClick="ButtonCreateLinq_Click" Text="Create (LINQ)" />

Now, we simply create a new Author object, set its properties and call the Entities' AddToAuthor() method.

protected void ButtonCreateLinq_Click(object sender, EventArgs e)
{
   PublishingCompanyEntities service = new PublishingCompanyEntities
       (new Uri("http://localhost:49958/PublishingDataService.svc"));
   int newAuthorId = -1;
   Author newGuy = Author.CreateAuthor(newAuthorId);
   service.AddToAuthor(newGuy);
   service.SaveChanges();

   textBoxResult.Text = "New author is: " + newGuy.AuthorID.ToString();
}

Just like operations in the Entity Framework, the new Author object's ID property is populated after a successful call to SaveChanges().

Using Linq to update data

On the UpdateArticle.aspx page, add a new button for our Linq operation.

<asp:Button ID="buttonUpdateLinq" runat="server"  Text="Update (LINQ)" OnClick="ButtonUpdateLinq_Click" />

To update an Article, you must first get it from the entities collection, update its properties and then pass it to the UpdateObject() method so that it is marked for an update. Don't forget to call SaveChanges().

protected void ButtonUpdateLinq_Click(object sender, EventArgs e)
{
   PublishingCompanyEntities service = new PublishingCompanyEntities
       (new Uri("http://localhost:49958/PublishingDataService.svc"));
   Article currentArticle = (from ar in service.Article
                             where ar.ArticleID == Convert.ToInt32(ddlTitles.SelectedValue)
                             select ar).First();

   currentArticle.Body = textBoxBody.Text;
   currentArticle.Title = textBoxTitle.Text;

   service.UpdateObject(currentArticle);
   service.SaveChanges();
}

Using Linq to delete data

Are you liking this so far? Delete works on the same principle - get the object, then pass it to the DeleteObject() method before calling SaveChanges().

Again, on UpdateArticle.aspx, add a button for this delete operation.

<asp:Button ID="buttonDeleteLinq" runat="server" OnClick="ButtonDeleteLinq_Click" Text="DELETE! (LINQ)" />

And in the codebehind:

protected void ButtonDeleteLinq_Click(object sender, EventArgs e)
{
   PublishingCompanyEntities service = new PublishingCompanyEntities
       (new Uri("http://localhost:49958/PublishingDataService.svc"));
   Article currentArticle = (from ar in service.Article
                             where ar.ArticleID == Convert.ToInt32(ddlTitles.SelectedValue)
                             select ar).First();

   service.DeleteObject(currentArticle);
   service.SaveChanges();
}

And there you go - all the CRUD operations, simplified. Learning the HTTP verbs was not a waste of time, though. Just like learning Linq to Sql without knowledge of ADO.NET is a bad idea, so is learning Linq to Data Services without knowing what's going on behind the scenes - it is that knowledge that will help you troubleshoot problems later.

Next, the ability to intercept these operations on the service!


Intercepting Data Service requests

ADO.NET Data Services lets you hook into the processing pipeline whenever a request is processed. You could look at a read request and change some of the data just before it is sent to the client, or look at an incoming update and perform some validation on it just before it reaches the database. These hooks are known as Query Interceptors and Change Interceptors.

Query Interceptors

Query Interceptors let you modify the data when a read request is about to be sent to the client. This is done on a per-entity basis. So, to be more specific, you could look at a query coming in against Authors and limit the results to only those authors that have more than two articles against them.

Go to the codebehind for PublishingDataService. Add a new method there with any name, and decorate it with the [QueryInterceptor("Author")] attribute. Attributing the method is enough to have it called, because the service runs all query interceptors before sending data back. You must, however, ensure that the return type is Expression<Func<Author, bool>>. The Expression is essentially a representation of a lambda expression that returns a boolean. This is better explained with an example.

[QueryInterceptor("Author")]
public Expression<Func<Author, bool>> AuthorFilter()
{
    return au => au.Article.Count > 2;
}

We're returning authors where the author's article count exceeds 2. Set a breakpoint in that code, build the service, and browse to the Author entity.

http://localhost:49958/PublishingDataService.svc/Author

The breakpoint should be hit, and when you allow execution to continue, the results are filtered! Of course you can perform more complex operations in that function, but the point of interceptors is that you can inject business logic into the pipeline.

Change Interceptors

You can also insert business logic and modify the data before an insert, update or delete happens. As with Query Interceptors, each Change Interceptor works on one entity type. The change interceptor method takes two arguments: an instance of the entity being modified, and an enumeration value representing the operation being performed. This means that a single change interceptor can handle insert, update and delete. It requires a [ChangeInterceptor] attribute on the method, with the entity set name as its constructor argument.

For the sake of an example, I would like to prevent any title for an updated article from being too long.

[ChangeInterceptor("Article")]
public void ChangeArticle(Article article, UpdateOperations operation)
{
   switch (operation)
   {
      case UpdateOperations.Add:
         break;
      case UpdateOperations.Delete:
         break;
      case UpdateOperations.Change:
         if (article.Title.Length > 30)
         {
            throw new DataServiceException("Title is too long.");
         }
         break;
      default:
         break;
   }
}

You could also just truncate the Title string, and the changes will get persisted to the database. I've left the Add and Delete operations in to show you the enumerations that can be used.
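As a sketch of that alternative, the interceptor below truncates the title rather than throwing; the shortened string is what gets persisted:

```csharp
[ChangeInterceptor("Article")]
public void ChangeArticle(Article article, UpdateOperations operation)
{
   // Silently fix the data instead of rejecting the request
   if (operation == UpdateOperations.Change && article.Title.Length > 30)
   {
      article.Title = article.Title.Substring(0, 30);
   }
}
```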

Next, a security topic - setting access to entities, and how to deal with exceptions and tracing in the service.


Entity and Service Operations Access

Earlier in this article, I told you to add a few lines of code to the InitializeService method of the data service.

config.SetEntitySetAccessRule("*", EntitySetRights.All);
config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);

By now the method names should be making more sense. The first line sets access rules on entities, the second line sets access rules on your custom methods. In the first line, we have used * which means 'all entities' and EntitySetRights.All which means that GET, POST, MERGE and DELETE can be performed against all entities. This isn't usually what you want. For example, even though a UserNamePassword table could exist in your database, you don't want it exposed via the web service. You might also want to allow reads on Payroll data, but no updates. Here's how you would do it:

config.SetEntitySetAccessRule("Payroll",EntitySetRights.AllRead);

config.SetEntitySetAccessRule("UserNamePassword", EntitySetRights.None);

There are also other EntitySetRights values available, such as restricting users to reading single entities rather than collections, or to specific CRUD operations. Play around with it in the InitializeService method and watch other methods break as you change the permissions on different entities.
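EntitySetRights is a flags enumeration, so its values can be combined. Here's a hypothetical policy sketch using the entities from this tutorial's model:

```csharp
// Readers can browse Articles and append new ones,
// but cannot change or delete existing rows
config.SetEntitySetAccessRule("Article",
    EntitySetRights.AllRead | EntitySetRights.WriteAppend);

// A single Author can be fetched by key,
// but the whole collection cannot be listed
config.SetEntitySetAccessRule("Author", EntitySetRights.ReadSingle);
```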

Similarly, the config.SetServiceOperationAccessRule method lets you restrict access to certain methods that you may or may not want visible in the service. The * means all methods, but you can also expose just a single method. Taking our service operations from before as candidates:

config.SetServiceOperationAccessRule("GetAuthorByFirstName", ServiceOperationRights.ReadSingle);

This sets the GetAuthorByFirstName method to be visible and only allows users to read a single item of data from it.

Exceptions and Tracing

By default, exception details in ADO.NET Data Services are turned off and so if you get an exception, all you get is a blue and white page which I like to call the RESOD - the Reference Error Screen Of Death. The page is sparse and misleading and you run around in circles wondering where the exception could have been logged - the answer is nowhere.

[request_error.jpg]

The problem here is that nothing has actually been written to the server logs; that part of the message only applies if you have tracing turned on. Before I show you how to trace, though, you can enable verbose errors in the application to see what went wrong. This is useful in a development environment where you want to see what's happening as you make changes.

To allow for verbose errors, you need to set IncludeExceptionDetailInFaults for the service at the class level. You won't find any references to the data service in the web.config file; instead, you'll need to add this attribute to the class definition of the service:

[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class PublishingDataService : DataService<PublishingCompanyEntities>
//...

Next, override the HandleException method in the service class and explicitly tell it to use verbose errors.

protected override void HandleException(HandleExceptionArgs e)
{
   e.UseVerboseErrors = true;
}
[verbose_err.jpg]

Now, the RESOD becomes more helpful and you get a full stack trace and error messages from the service. However, this isn't something you want to do when deploying the service to production. You could get the HandleException method to log it to your own custom location, or perform a trace. And besides, as you can see from the screenshot above, the verbose errors are not always helpful.
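A sketch of that custom logging inside HandleException follows; the log path and format here are hypothetical, and in production you'd leave UseVerboseErrors off:

```csharp
protected override void HandleException(HandleExceptionArgs e)
{
   // Append the exception to a file under App_Data (hypothetical location)
   string logPath = HttpContext.Current.Server.MapPath("~/App_Data/service_errors.log");
   System.IO.File.AppendAllText(logPath,
       DateTime.Now + " " + e.Exception + Environment.NewLine);

   // Only surface full details to clients while developing
   e.UseVerboseErrors = true;
}
```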

Your best bet is tracing. To enable tracing, you need to add just a few lines to the web.config file (just after configSections).

<system.diagnostics>
   <sources>
      <source name="System.ServiceModel.MessageLogging" switchValue="Warning, ActivityTracing">
         <listeners>
            <add name="ServiceModelTraceListener"/>
         </listeners>
      </source>
      <source name="System.ServiceModel" switchValue="Verbose, ActivityTracing">
         <listeners>
            <add name="ServiceModelTraceListener"/>
         </listeners>
      </source>
      <source name="System.Runtime.Serialization" switchValue="Verbose, ActivityTracing">
         <listeners>
            <add name="ServiceModelTraceListener"/>
         </listeners>
      </source>
   </sources>
   <sharedListeners>
      <add initializeData="App_tracelog.svclog"
         type="System.Diagnostics.XmlWriterTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
         name="ServiceModelTraceListener" traceOutputOptions="Timestamp"/>
   </sharedListeners>
</system.diagnostics>

When you restart the service and run a query, a .svclog file gets created in the same directory as the service. Careful, though: this file grows fast with each request processed, so get in, get your error, and disable tracing as soon as you can. You can then use the Service Trace Viewer application to see what's happening with the service, including messages and exceptions. Simply double-click the .svclog file on your machine and it will open up.

[wcf_tracing.jpg]

This usually lets you see the execution flow and see what went wrong at what point in the application.

Conclusion

That's pretty much it for ADO.NET Data Services; I hope this has been useful. As with any technology, evaluate whether it fits your needs before using it - there are situations where ADO.NET Data Services is inappropriate and others where it fits very well. It's not the best choice for very complicated business logic, but it's good when you want a straightforward RESTful web service that's easily accessible and easy to use with only minor changes. Examples of where I'd use this technology are blog feeds, logging services and APIs for developers interfacing with a site.

You can learn more about ADO.NET Data Services at the Developer Center pages for ADO.NET Data Services.



About the Author

SM Altaf

Mendhak is a web developer and a Microsoft MVP who works with ASP.NET and PHP among the usual array[] of web technologies. He is also rumored to be a three eyed frog, but the evidence is lacking. He can be contacted via his website, www.mendhak.com.

