A Somewhat Unusual Choice in the .NET Platform
As you probably remember, I started this series of articles by saying that although Microsoft often pushes DataSets as data containers and warns that custom classes can give bad performance, I like custom classes a lot. I used to think the same way Microsoft did; for example, I followed that philosophy when I wrote my book .NET Enterprise Design with Visual Basic .NET and SQL Server 2000 (Sams, 2001). I changed my mind when I started to investigate using a classic object-oriented domain model.
I previously believed in not using custom classes for the domain model, mostly because of the many practical problems they caused in the old world of COM and COM+. For example, it was pretty hard to write Marshal-By-Value (MBV) components in COM, and it was impossible in VB6; you had to go to another language, typically C++. It was also pretty expensive to instantiate objects. In the case of COM+, configuring the domain classes made the context overhead very expensive, because it is not unusual to work with a couple of hundred domain objects in a single request. With that knowledge, my first choice was to continue with a data-centric approach in .NET and to use DataSets.
After a while, I wanted to try a more classic object-oriented approach in .NET. My thinking was that I could accept a small overhead for an object-oriented domain model because I saw other advantages in it, such as the chance of higher maintainability. Therefore, I ran some quick and dirty tests to find out how large the overhead actually was. To my surprise, my initial tests showed a lower overhead for the object-oriented domain model approach.
After thinking about it some more and investigating a bit further, I began to see that it's quite natural for an object-oriented domain model to perform well. After all, a DataSet is not free either; under the hood it creates many objects too, such as a DataRow object for every row.
Regarding the three examples of problems with an object-oriented domain model in COM, those problems are actually nonproblems in .NET. First, if you consider writing MBV components in .NET, the answer is simple (at least for simple cases): just apply the <Serializable()> attribute to the class. There is a lot more to serialization for advanced situations, but quite often that attribute alone is enough.
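As a minimal sketch of that point (the class name and members here are my own invention, not from the listings), a marshal-by-value domain class in VB.NET needs nothing more than the attribute:

```vbnet
' A hypothetical domain class that can be marshaled by value
' across remoting boundaries, simply because it is marked
' with the Serializable attribute.
<Serializable()> _
Public Class Customer
    Private _id As Integer
    Private _name As String

    Public Property Id() As Integer
        Get
            Return _id
        End Get
        Set(ByVal Value As Integer)
            _id = Value
        End Set
    End Property

    Public Property Name() As String
        Get
            Return _name
        End Get
        Set(ByVal Value As String)
            _name = Value
        End Set
    End Property
End Class
```

Compare that with the COM/VB6 situation, where you had to leave the language entirely to get the same behavior.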
Regarding the overhead for instantiation, which is very important if you need to create many objects, it's much smaller in .NET than for COM and VB6. Try it yourself by executing the code in Listings 1 and 2. The code in Listing 1 is for VB6, and the code in Listing 2 is for .NET. According to my quick and dirty tests, the .NET code is executed approximately 100 times faster. Normally (and hopefully), you do a lot more in an application than instantiating objects, but a difference this big will be noticeable in many applications.
Listing 1: VB6 Code for Instantiating 10 Million Dummy Objects
Dim i As Long
Dim theTest As Test

For i = 0 To 10000000
    Set theTest = New Test
    theTest.DoStuff
Next
Listing 2: VB.NET Code for Instantiating 10 Million Dummy Objects
Dim i As Integer = 0
Dim theTest As Test

For i = 0 To 10000000
    theTest = New Test()
    theTest.DoStuff()
Next
GC.Collect()
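If you want to reproduce my quick and dirty measurement, here is roughly how I'd wrap the loop in Listing 2 (the Test class is the same dummy class as in the listings; Environment.TickCount is a coarse but serviceable millisecond counter):

```vbnet
' Rough timing harness around the Listing 2 loop.
Dim startTicks As Integer = Environment.TickCount

Dim i As Integer
Dim theTest As Test
For i = 0 To 10000000
    theTest = New Test()
    theTest.DoStuff()
Next
GC.Collect()   ' Include collection cost in the measurement.

Console.WriteLine("Elapsed ms: " & _
    (Environment.TickCount - startTicks).ToString())
```

Your absolute numbers will vary with hardware and runtime version, of course; it's the relative difference between the VB6 and .NET versions that matters here.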
Finally, the context overhead from configuring domain classes in COM+ is, in my opinion, more a misuse of COM+ than a problem with the technology itself. If you deal with the configuration aspect at the service-layer level (or business facade layer, or application layer, or whatever you call it), you will find that everything that goes on "behind" that layer gets much smaller instantiation and call overhead than if you configure the domain-model classes themselves.
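To make that concrete, here is a sketch using System.EnterpriseServices (the class and method names are made up for illustration): only the facade is a configured, serviced component, while the domain objects behind it are plain .NET classes with no COM+ context of their own.

```vbnet
Imports System.EnterpriseServices

' The facade is the only COM+-configured class; it carries
' the transactional context for the whole request.
<Transaction(TransactionOption.Required)> _
Public Class OrderFacade
    Inherits ServicedComponent

    <AutoComplete()> _
    Public Sub PlaceOrder(ByVal customerId As Integer)
        ' The (hypothetical) domain classes below are ordinary
        ' .NET classes. Instantiating hundreds of them adds no
        ' per-object COM+ context overhead.
        Dim theCustomer As New Customer()
        Dim theOrder As New Order()
        ' ... domain logic goes here ...
    End Sub
End Class
```

The design choice is the point: you pay the context price once per request at the facade, not once per domain object.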
You will find much coverage of this subject in my book .NET Enterprise Design with Visual Basic .NET and SQL Server 2000 (Sams, 2001).
It has now been almost a year since my initial tests with a classic object-oriented domain model in .NET, but better late than never to write about the design implications of choosing data containers for .NET, right?