What ADO.NET Teaches Us about Custom Objects

I have met very few developers or development teams who prefer ADO.NET objects to custom objects. Similarly, many are writing their own custom data access layers. This should convey a loud and clear message to Microsoft and other software vendors: ADO.NET is great, but developers prefer custom objects. So, why not give us more technologies that help us build software the way we actually build software?

I can't answer the last question, but I can tell you how to consistently take the guesswork out of building database-centric applications (which account for something like 80 percent of applications). How? Well, believe it or not, Microsoft's ADO.NET pattern shows how.

Everyone Writes Custom Objects (or What Bugs Me about ADO.NET DataSets)

ADO.NET is a great technology, so don't flame me. In fact, here are some reasons why ADO.NET is good:

  • ADO.NET's DataSets and DataTables have good code-generation support, serialize naturally to XML, and make it very easy to present data quickly.
  • ADO.NET objects have XML Web services support.
  • With partial classes, DataSets can be extended and regenerated (and that's a good thing).
  • DataSets (and DataTables) already know how to keep track of changes to fields.
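As a quick sketch of that last point, a DataTable tracks changes per row and per field automatically; once you call AcceptChanges, any later edit flips the row's state and preserves the original values:

```csharp
using System;
using System.Data;

class RowStateDemo
{
    static void Main()
    {
        DataTable orders = new DataTable("Orders");
        orders.Columns.Add("Id", typeof(int));
        orders.Columns.Add("Amount", typeof(decimal));

        DataRow row = orders.Rows.Add(1, 10.00m);
        orders.AcceptChanges();            // current state becomes "unchanged"

        row["Amount"] = 12.50m;            // edit one field
        Console.WriteLine(row.RowState);   // Modified
        Console.WriteLine(row["Amount", DataRowVersion.Original]); // 10.00
    }
}
```

This is the bookkeeping custom objects have to reimplement (or generate) themselves if they need change tracking.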

On the surface, these are pretty good reasons to use DataSets and DataTables. I—and I think many of you—just don't want to drag ADO.NET DataSets (from now on, assume I mean DataTables, too) all over the application. Database objects should live by the database, not in the client or the business layer. That, along with the following reasons, may explain why so many developers write custom objects:

  • Programmers have significantly more control over custom objects. This is probably the most compelling reason.
  • Generally, programmers don't want a one-to-one correlation between database fields and class fields. It is smart to hide a lot of these details.
  • Normalized data tables make terrible objects.
  • Custom objects are much lighter in weight.
  • Custom objects will be built anyway, whether programmers are using a database or not.
  • Middleware and clients are much lighter without ADO.NET.
  • Custom objects can be returned from XML Web services (or ASP.NET Web services, if you prefer).
  • It is also possible to write a one-size-fits-all data access layer that can be used in any data-centric application.
  • With CodeDOM or CodeRush, programmers can whip up custom objects very, very quickly. (A 500-line business object can be generated with CodeRush in minutes. Write a CodeDOM generator, and dozens or hundreds of these objects can be spit out in the blink of an eye.)
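For contrast, here is what a custom business object in this style looks like: a plain class with no ADO.NET dependency and no forced one-to-one mapping to table columns. The names are illustrative, not from any particular schema:

```csharp
// A hand-rolled (or generated) custom business object: no ADO.NET types,
// and no one-to-one correlation with database fields.
public class Customer
{
    private int id;
    private string firstName;
    private string lastName;

    public int Id { get { return id; } set { id = value; } }
    public string FirstName { get { return firstName; } set { firstName = value; } }
    public string LastName { get { return lastName; } set { lastName = value; } }

    // A derived property the database never stores as a column; this is the
    // kind of detail-hiding a raw DataTable can't give you.
    public string FullName { get { return firstName + " " + lastName; } }
}
```

An object like this is trivially light, serializes cleanly from an XML Web service, and is exactly the kind of class a CodeDOM or CodeRush generator can stamp out by the dozen.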

The key to really winning with custom objects is to use code generators—like CodeRush or roll-your-own—and follow ADO.NET's example by separating persistence from the objects themselves. One of the worst decisions I ever made was trying to make complex .NET custom objects self-persisting. More than one really good book on object-oriented design touts this as a good thing; it isn't. Factoring the common persistence code out into separate classes works better. (Or so I think right now.)

Understanding the Lesson of ADO.NET

What did Microsoft do with ADO.NET? They factored out persistence from DataSets and DataTables, using adapters and commands to do the actual reading and writing. Why did they do this? Read and write code would otherwise have to be repeated in both DataSets and DataTables, which would make them even more bloated and require creating a flavor of DataSet and DataTable for every database provider. This would be very bad indeed. By factoring out reading and writing, Microsoft permits other vendors to implement their own providers—the persistence capability.
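You can see the factoring in a few lines of everyday ADO.NET: the DataTable holds data and tracks changes, while the adapter and its commands do all the reading and writing. (The connection string and query here are placeholders.)

```csharp
using System.Data;
using System.Data.SqlClient;

public class AdapterDemo
{
    // Persistence lives in the adapter and commands, not in the DataTable.
    // Swap SqlConnection/SqlDataAdapter for another provider's types and
    // the DataTable-handling code does not change.
    public static DataTable LoadCustomers(string connectionString)
    {
        DataTable table = new DataTable("Customers");
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter =
            new SqlDataAdapter("SELECT Id, FirstName, LastName FROM Customers", connection))
        {
            adapter.Fill(table);      // reading, factored out of the DataTable
            // adapter.Update(table)  // writing changes back, also factored out
        }
        return table;
    }
}
```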

The key, then, is to separate your persistence layer from your business objects.

Separating read and write from business objects

You can easily separate reading and writing from custom business objects by creating a general data access layer that accepts a stored procedure name, an array of parameters, and an event handler (or delegate). The delegate lets you vary how the return data or result set is actually read from the reader. By returning an object, or better yet defining the data access layer using generics, you can use one data access layer to read and write any object. In fact, this is pretty close to what Microsoft did with the Enterprise Library's data access block (formerly the Data Access Application Block, or DAAB).
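Here is a minimal sketch of that idea, assuming SQL Server; every name in it (DataAccess, ReadList, MapRow) is illustrative, not a published API. The caller supplies the stored procedure name, the parameters, and a delegate that maps one reader row to a T:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// The per-object knowledge lives in this delegate, not in the layer.
public delegate T MapRow<T>(IDataReader reader);

public static class DataAccess
{
    public static List<T> ReadList<T>(string connectionString,
        string storedProcedure, SqlParameter[] parameters, MapRow<T> map)
    {
        List<T> results = new List<T>();
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(storedProcedure, connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            if (parameters != null)
            {
                command.Parameters.AddRange(parameters);
            }
            connection.Open();
            using (IDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(map(reader));  // the delegate varies the mapping
                }
            }
        }
        return results;
    }
}
```

A call site supplies the mapping once per object type, so this single layer can read any custom object in the application.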

Using stored procedures

You don't have to use stored procedures, but you may want to simply because once you do, it is a very easy next step to separate the C# or VB .NET programming from the database programming. DBAs are often better at this anyway, and using stored procedures makes a very clear delineation between areas of responsibility.
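To make that division of labor concrete, here is a sketch: the stored procedure (shown in the comment) belongs on the database side with the DBA, while the C# code only names it and supplies parameters. The procedure name and connection string are hypothetical.

```csharp
using System.Data;
using System.Data.SqlClient;

public class CustomerReader
{
    // The DBA owns and tunes the procedure, for example:
    //   CREATE PROCEDURE GetCustomerById @Id int AS
    //     SELECT Id, FirstName, LastName FROM Customers WHERE Id = @Id
    // The application code never embeds that SQL.
    public static void ReadCustomer(string connectionString, int id)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("GetCustomerById", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Id", id);
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // populate a custom object from reader fields here
                }
            }
        }
    }
}
```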

Leave ADO.NET Where It Belongs

How you separate persistence from your business objects matters less than that you do it at all. Generics and delegates make it very easy to completely isolate your business objects, middleware, and clients from ADO.NET.

Combining custom objects with great—and I mean great—tools such as CodeRush enables you to write a pretty thin data access layer, create custom objects with their incumbent flexibility, and leave ADO.NET in the data access layer where it belongs.

P.S. If you want me to write an article demonstrating this technique or at least my approach, send me an email or post a response. If you think it's hooey, let me know why.

About the Author

Paul Kimmel is the VB Today columnist for www.codeguru.com and has written several books on object-oriented programming and .NET. Check out his new book UML DeMystified from McGraw-Hill/Osborne. Paul is an architect for Tri-State Hospital Supply Corporation. You may contact him for technology questions at pkimmel@softconcepts.com.

If you are interested in joining or sponsoring a .NET Users Group, check out www.glugnet.org.

Copyright © 2006 by Paul T. Kimmel. All Rights Reserved.



Comments

  • Great! Interesting....we'd like more

    Posted by Magallo on 12/19/2006 03:53am

    Very interesting. I would love an article demonstrating all the stuff in practice...

  • Frodo

    Posted by sla on 12/01/2006 11:03am

    More details and bigger article would be nice ...

  • Great article...more please

    Posted by Saruman on 11/29/2006 09:57am

    I found this article very useful but I'm left wanting more details. I'd really like to see you write another article giving the specifics of your approach.

    • More would be great

      Posted by jfcantin on 11/29/2006 11:55am

      An example of your approach would be awesome.
