Application Scenario

With the fundamentals of accessing data through managed providers and returning data through a DataSet out of the way, we can now begin to focus on putting the ADO.NET components in their proper context. The code developed in the remainder of this chapter includes an application architecture and data access techniques used in developing an online Quilogy Education system.

One of the key features of the example application is the ability for students to access information about Quilogy education offerings through a public Web site. The site allows students to query for course schedules, enroll in a class, and log on with a password in order to view transcripts. Obviously, each of these three actions involves either collecting data from the user services tier of the application and sending it to the business and data services for processing, or querying the database for data and passing it back to the user services for presentation. Both of these scenarios lend themselves very well to using the DataSet because of its structured and disconnected nature.

However, the application must also be able to query the server directly to validate logins, save enrollments, and perform other data access tasks. For these tasks it will be more efficient for the managed provider to execute statements directly rather than using a DataSet. In addition, because the data store used in this application is SQL Server 2000, the SqlClient managed provider will be used because it communicates natively with SQL Server using TDS, thereby increasing performance. As a result, the application will have the architecture shown in Figure 7.4.

Figure 7.4 Quilogy Education architecture. This diagram depicts the architecture of the application used for the Quilogy education Web site.

To handle both types of data access, two sets of classes will be created within the data services tier, as shown in Figure 7.4. First, a set of data access classes will be developed through which the other components of the system will request and receive data. Second, the data access classes will rely on DataSet classes that encapsulate the data required to implement the features listed at the beginning of this section. Both of these sets of classes will use ADO.NET and the SqlClient managed provider to access SQL Server.

You'll also notice from Figure 7.4 that when making requests of SQL Server the application will use stored procedures and functions rather than inline SQL. Doing so provides an interface that abstracts the database schema from developers and their code, making the data simpler to use and insulating code from schema changes (as long as the signatures of the stored procedures and functions remain the same). It also increases performance because stored procedures don't incur the additional name resolution and query plan creation required by inline SQL.

There are two advantages to this architecture. First, it abstracts the data access code from the business and user services tiers, so changes to the location and structure of the data do not have rippling effects on either the business rules or the user interface. Second, by creating custom DataSet classes to embody the data being passed between tiers, we can provide strongly typed access to it that makes coding more efficient and less error prone.

We'll begin by creating the custom DataSet classes used in the application followed by the implementation of the data access classes.

Defining the Data Structure

ADO.NET is designed to make it easy to pass data between the tiers of a distributed application using the DataSet. As a result, the example application takes advantage of the DataSet to collect information from users and pass it to business and data access components in the middle and data tiers.

To support the passing of data, the data tier is responsible for defining the structure of the data passed back and forth. To do this the application will use multiple classes derived from DataSet, the most important of which are shown in Table 7.7.

Table 7.7  Application DataSets. This table lists the DataSet classes used in the example Quilogy education application.

TranscriptData
Defines the data returned for a student when they view their transcript.

ScheduleData
Defines the data returned when querying for a list of available classes given a variety of criteria.

EnrollmentData
Defines the data captured when enrolling a student in a class.

Defines the demographic data for one or more students.

The remainder of this section discusses the various techniques used to create the custom DataSet classes shown in Table 7.7.

Creating a DataSet

ADO.NET supports three techniques for creating a DataSet: automatic creation, programmatic creation, and using the XML Designer in VS.NET.

Automatic Schema Creation

The first option, automatic creation, was used in Listing 7.5 when populating the DataSet that contained the Courses and Products DataTable objects: the Fill method of the SqlDataAdapter was simply invoked with the name of a DataTableMapping that does not exist. In this case the column names and data types extracted from the database are used as the names of the DataColumn objects, and by default each DataColumn is mapped to an element in the XML schema of the DataSet. In addition, when there are multiple DataTable objects, the XSD schema creates a unioned rather than hierarchical representation of the data. For example, Listing 7.6 shows the XSD schema that was automatically created for the DataSet in Listing 7.5 containing the Courses and Products DataTable objects.
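The essence of the technique can be sketched as follows (the connection string and query here are illustrative, not the code from Listing 7.5):

```vb
Imports System.Data
Imports System.Data.SqlClient

Dim cnSQL As New SqlConnection( _
 "server=(local);database=Education;trusted_connection=yes")
Dim daSql As New SqlDataAdapter("SELECT * FROM Courses", cnSQL)
Dim dsOfferings As New DataSet("Offerings")

' No schema or table mapping named "Courses" exists, so Fill creates
' the DataTable and its DataColumn objects from the query's metadata
daSql.Fill(dsOfferings, "Courses")
```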


A discussion of the structure of XSD schemas is beyond the scope of this book. For more information see http://www.w3.org/XML/Schema.

Listing 7.6  XSD Schema. The schema produced for the DataSet created in Listing 7.5 using automatic population.

<xsd:schema id="Offerings" targetNamespace="" xmlns=""
  xmlns:xsd="http://www.w3.org/2001/XMLSchema"
  xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
 <xsd:element name="Courses">
  <xsd:complexType>
   <xsd:sequence>
    <xsd:element name="CourseID" minOccurs="0" type="xsd:int"/>
    <xsd:element name="CourseNum" minOccurs="0" type="xsd:string"/>
    <xsd:element name="Description" minOccurs="0" type="xsd:string"/>
    <xsd:element name="ProductID" minOccurs="0" type="xsd:int"/>
    <xsd:element name="LOBID" minOccurs="0" type="xsd:int"/>
    <xsd:element name="Days" minOccurs="0" type="xsd:unsignedByte"/>
    <xsd:element name="CourseWareCost" minOccurs="0" type="xsd:decimal"/>
   </xsd:sequence>
  </xsd:complexType>
 </xsd:element>
 <xsd:element name="Products">
  <xsd:complexType>
   <xsd:sequence>
    <xsd:element name="ProductID" minOccurs="0" type="xsd:int"/>
    <xsd:element name="Name" minOccurs="0" type="xsd:string"/>
    <xsd:element name="VendorID" minOccurs="0" type="xsd:int"/>
   </xsd:sequence>
  </xsd:complexType>
 </xsd:element>
 <xsd:element name="Offerings" msdata:IsDataSet="true">
  <xsd:complexType>
   <xsd:choice maxOccurs="unbounded">
    <xsd:element ref="Courses"/>
    <xsd:element ref="Products"/>
   </xsd:choice>
  </xsd:complexType>
 </xsd:element>
</xsd:schema>

You'll notice that the schema contains the root element Offerings, which consists of repeating Courses elements followed by repeating Products elements and that they are not related to each other. In addition the schema does not indicate which elements are required, maps each column to an element, and uses the database column names.

If this approach were used in an application built as shown in Figure 7.4, the data access classes would be responsible for building the DataSet objects on the fly and returning them through their exposed methods. The methods would return the type DataSet rather than a strongly typed class inherited from DataSet because the DataSet is created in a just-in-time fashion. This technique has the benefit of simplicity because the creation of custom DataSet classes is not required, but it is limiting in other respects.

Although automatic creation is certainly the simplest way to build the DataSet, it is obviously limited in several ways including:

  • The names of the DataColumn objects are controlled by the underlying data store rather than the application.

  • The structure of the XML produced is flat rather than hierarchical and does not contain constraint information.

  • The DataSet produced is not strongly typed.

The first limitation means that the data store and DataSet are tightly coupled, so changes to database column names and table structures will ripple through the application, affecting bound controls and any code written to manipulate the data in the presentation services. The second limitation severely restricts the use of the XML produced to applications that manipulate the data as a DataSet rather than as an XML document, because vital metadata about the relationships between elements is not automatically included. Finally, late-bound access to the DataSet makes your code less readable, which increases development time along with the possibility of run-time rather than compile-time errors.

As a result, in all but the simplest applications you'll want to decouple the structure of the DataSet from the data store used to populate it and include all the relevant metadata using a typed DataSet. Doing so also decreases the amount of code you'll need to write in the data access classes. To create a typed DataSet you'll need to rely on other techniques discussed later in this section.

Although this discussion has thus far focused on creating DataSet objects from a persistent data store such as SQL Server, you can also create and populate a DataSet from standalone XML documents. This is advantageous when you want to load XML documents and save them directly to a relational database.

To create the structure of a DataSet from an XML document, the DataSet class supports the InferXmlSchema method as the analog to automatic schema creation using the Fill method. For example, consider the simple XML document that follows:

<Course ID="22321">
  <City>Kansas City</City>
  ...
</Course>

This document can be passed to the InferXmlSchema method to create the schema that follows automatically, using a process referred to as inference. Note, however, that the data does not get loaded.

Dim dsCourseData As New DataSet
dsCourseData.InferXmlSchema("CourseData.xml", Nothing)

The method has a variety of overloaded signatures that allow you to pass the XML using a Stream, XmlReader, or TextReader (discussed in Chapter 11). Note that the second argument is set to Nothing but can be populated with an array of strings containing namespace URIs that are to be ignored when creating the schema. In this way you can selectively create a schema from only certain elements in the document. The resulting schema is shown in Listing 7.7. Consult the documentation for the rules regarding how the schema is inferred.
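As a small sketch of the namespace-exclusion behavior (the namespace URI below is hypothetical), an array of namespace URIs can be passed as the second argument:

```vb
Imports System.Data

Dim dsCourseData As New DataSet()

' Ignore elements from a hypothetical auditing namespace when inferring the schema
Dim ignoreNS() As String = {"urn:quilogy:audit"}
dsCourseData.InferXmlSchema("CourseData.xml", ignoreNS)
```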

Listing 7.7  Inferred XSD schema. This listing shows the XSD schema resulting from calling the InferXmlSchema method of the DataSet class.

<xsd:schema id="NewDataSet" targetNamespace="" xmlns=""
  xmlns:xsd="http://www.w3.org/2001/XMLSchema"
  xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
 <xsd:element name="Course">
  <xsd:complexType>
   <xsd:sequence>
    <xsd:element name="City" minOccurs="0" type="xsd:string"/>
    <xsd:element name="Date" minOccurs="0" type="xsd:string"/>
    <xsd:element name="Students" minOccurs="0" type="xsd:string"/>
    <xsd:element name="Instructor">
     <xsd:complexType>
      <xsd:sequence>
       <xsd:element name="FName" minOccurs="0" type="xsd:string"/>
       <xsd:element name="LName" minOccurs="0" type="xsd:string"/>
      </xsd:sequence>
     </xsd:complexType>
    </xsd:element>
   </xsd:sequence>
   <xsd:attribute name="ID" type="xsd:string"/>
  </xsd:complexType>
 </xsd:element>
 <xsd:element name="NewDataSet" msdata:IsDataSet="true">
  <xsd:complexType>
   <xsd:choice maxOccurs="unbounded">
    <xsd:element ref="Course"/>
   </xsd:choice>
  </xsd:complexType>
 </xsd:element>
</xsd:schema>

Inferring schemas in this fashion is most useful in a design environment where you need to reverse engineer existing XML documents in order to create DataAdapters that populate them. You'll also want to edit the resulting schema to include appropriate data types, required elements and attributes, and constraint information because the data types are all defaulted to string, the elements and attributes are not required, and there are no constraints as evidenced by Listing 7.7.

To populate the DataSet from an XML document, you can use the ReadXml method, which is analogous to the Fill method, as follows:

dsCourseData.ReadXml("CourseData.xml")
In this case, the file is passed as the only argument. Note that calling ReadXml and leaving off the optional second argument both infers the schema and loads the data. The second argument can be one of the XmlReadMode enumeration values: Auto, DiffGram, IgnoreSchema, InferSchema, ReadSchema, or SqlXml. As you might expect, Auto is the default and both infers the schema and loads the data. The IgnoreSchema, InferSchema, and ReadSchema options can be used to bypass the schema if one exists in the document, infer the schema only, or read only the schema from the document, respectively. The DiffGram option is discussed in more detail later in the chapter.
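For example, two of the explicit modes can be contrasted as follows (the file name is the one used above):

```vb
Imports System.Data

Dim dsCourseData As New DataSet()

' Ignore any inline schema, infer the schema from the data, and load the data
dsCourseData.ReadXml("CourseData.xml", XmlReadMode.InferSchema)

' Ignore any inline schema and load only data that matches the existing schema
dsCourseData.ReadXml("CourseData.xml", XmlReadMode.IgnoreSchema)
```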

Programmatic Creation

The second technique for creating a DataSet is to use the programmatic interfaces exposed by the DataSet. Often this technique is used in conjunction with creating what is referred to as a typed DataSet. As mentioned in the first part of this section, the application architecture in this case calls for the creation of custom DataSet classes to represent the information shown in Table 7.7.

To create a typed DataSet you can simply create a custom class and inherit from the DataSet class. In this way all the members of the DataSet class, in addition to your own members, are available to consumers of the typed DataSet. You can then add custom code to programmatically create DataTable objects, relations, and constraints as required. For example, the simple TranscriptData class from Table 7.7 inherits from DataSet. Its constructor calls a private method that is responsible for building and customizing the DataTable object, as shown in Listing 7.8.

Listing 7.8  A simple typed DataSet. In this example a typed DataSet, TranscriptData, is created by inheriting from DataSet. It includes code to create the structure of the DataSet when a new instance is created.

Option Strict On
Option Explicit On

Imports System
Imports System.Data
Imports Microsoft.VisualBasic

Namespace Quilogy.Education

  Public Class TranscriptData
    Inherits DataSet

  Private mdtClass As DataTable
   Public Event ValidationWarning(ByVal pMessage As String)
    Public Const STUDENT_ID As String = "StudentID"
    Public Const ENROLL_ID As String = "EnrollID"
    Public Const CLASS_DATE As String = "Date"
    Public Const CITY As String = "City"
    Public Const INSTRUCTOR As String = "Instructor"
    Public Const INS_EMAIL As String = "InstructorEmail"
    Public Const VENDOR_NAME As String = "VendorName"
    Public Const COURSE_NUM As String = "CourseNum"
    Public Const COURSE_DESC As String = "CourseDesc"
    Public Const DAYS As String = "Days"
    Public Const ORGANIZATION As String = "StudentOrganization"

    Public Sub New()
     ' Initialize a TranscriptData instance by building the table
     MyBase.New()

     ' Create the tables in the dataset
     BuildTables()

     ' Initialize the class
     Me.DataSetName = "QuilogyTranscript"
     Me.Namespace = "http://www.quilogy.com/education"
     Me.Prefix = "qed"
     Me.CaseSensitive = False

     ' Capture the ColumnChanging event
     AddHandler mdtClass.ColumnChanging, AddressOf Me.TranscriptColChanging
    End Sub

    Private Sub TranscriptColChanging(ByVal sender As Object, _
     ByVal e As DataColumnChangeEventArgs)
     Dim strOrig As String
     ' Do simple data validation
     Select Case e.Column.ColumnName
      Case COURSE_DESC
       ' Truncate long descriptions
       If Len(e.ProposedValue) > 255 Then
        strOrig = CType(e.ProposedValue, string)
        e.ProposedValue = Left(CType(e.ProposedValue, string),255)
        RaiseEvent ValidationWarning("The " & COURSE_DESC & " '" & strOrig & _
         "' was truncated to '" & CType(e.ProposedValue, string) & "'")
       End If
     End Select
    End Sub

    Public ReadOnly Property ClassTable() As DataTable
     Get
      Return mdtClass
     End Get
    End Property

    Private Sub BuildTables()
     Dim pk() As DataColumn

     ' Create the transcript table
     mdtClass = New DataTable("Class")

     With mdtClass.Columns
       .Add(STUDENT_ID, GetType(System.Int32))
       .Add(ENROLL_ID, GetType(System.Int32))
       .Add(CLASS_DATE, GetType(System.Date))
       .Add(CITY, GetType(System.String))
       .Add(INSTRUCTOR, GetType(System.String))
       .Add(INS_EMAIL, GetType(System.String))
       .Add(VENDOR_NAME, GetType(System.String))
       .Add(COURSE_NUM, GetType(System.String))
       .Add(COURSE_DESC, GetType(System.String))
       .Add(DAYS, GetType(System.Byte))
       .Add(ORGANIZATION, GetType(System.String))
     End With

     ' Set the column attributes
   With mdtClass
      .Columns(STUDENT_ID).ColumnMapping = MappingType.Attribute
      .Columns(ENROLL_ID).ColumnMapping = MappingType.Hidden
      .Columns(CLASS_DATE).AllowDBNull = False
      .Columns(CLASS_DATE).Caption = "Class Date"
      .Columns(CITY).AllowDBNull = False
      .Columns(COURSE_NUM).AllowDBNull = False
      .Columns(DAYS).DefaultValue = 0
      .Columns(VENDOR_NAME).Caption = "Vendor Name"
      .Columns(COURSE_NUM).Caption = "Course Number"
      .Columns(COURSE_DESC).Caption = "Description"
      .Columns(ORGANIZATION).Caption = "Your Organization"
     End With

     ' Set the primary key
     ReDim pk(0)
     pk(0) = mdtClass.Columns(ENROLL_ID)
     mdtClass.PrimaryKey = pk

     ' Add the table to the collection
     Me.Tables.Add(mdtClass)
    End Sub
  End Class
End Namespace

Perhaps the first aspect of Listing 7.8 you'll notice is that the TranscriptData class is placed in a Namespace statement. In this case the class will exist in the Quilogy.Education namespace discussed in Chapter 4 (Figure 4.1).

Second, notice that all the work is done in the constructor of the class, where the base class constructor is called before a private method, BuildTables, creates a DataTable object and populates it with DataColumn objects. In addition, the DataSet is initialized in the constructor by setting the name and namespace properties.

Because this is a simple DataSet, within BuildTables only a single DataTable, mdtClass, is created; it is exposed through the read-only property ClassTable. After the columns are added, they are customized using the properties of the DataColumn object. For example, the Date, City, and CourseNum columns are set to disallow nulls, while the mappings of StudentID and EnrollID are changed from element to attribute and hidden, respectively. Finally, the primary key of the DataTable is created using the EnrollID column before the DataTable is added to the DataTableCollection of the DataSet. Notice that the names of the columns are defined as constants so that they can be changed more easily if necessary without affecting code that uses this class.

Because TranscriptData is a custom class like any other in VB.NET, it can also include its own members. In this case not only is the ClassTable property added to expose the mdtClass DataTable, but a Public event, ValidationWarning, has also been defined. This event is raised in the Private TranscriptColChanging procedure that handles the ColumnChanging event for mdtClass, discussed in Table 7.6. You'll notice that in this event handler the DataColumnChangeEventArgs class exposes a ProposedValue that you can inspect and change to do simple data validation, along with a reference to the actual DataColumn being changed. In this case, if the CourseDesc column has been changed to a string longer than 255 characters, it is truncated and a ValidationWarning event is raised with the particulars.


Note that if an exception is raised in the ColumnChanging event handler the ProposedValue will not be reflected in the DataColumn and so it can be used for more sophisticated validation as well.
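For example, a handler like the following hypothetical sketch (written as if added to the TranscriptData class, alongside its existing constants) could reject, rather than truncate, an invalid value by throwing an exception:

```vb
    ' Hypothetical validation handler: reject classes longer than 14 days
    Private Sub TranscriptColValidate(ByVal sender As Object, _
     ByVal e As DataColumnChangeEventArgs)
     If e.Column.ColumnName = DAYS Then
      If CType(e.ProposedValue, Byte) > 14 Then
       ' Throwing here prevents the ProposedValue from being applied
       Throw New ArgumentException("Days cannot exceed 14")
      End If
     End If
    End Sub
```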

Using the class in Listing 7.8 a consumer of the TranscriptData class can be assured that the structure of the DataSet is immutable and can simply populate it as follows:

Imports Quilogy.Education
Imports System.Data
Imports System.Data.SqlClient

Dim dsTs As New TranscriptData()
Dim daSql As SqlDataAdapter
Dim parmSql As SqlParameter

daSql = New SqlDataAdapter("usp_GetTranscript", cnSQL)

daSql.SelectCommand.CommandType = CommandType.StoredProcedure
parmSql = daSql.SelectCommand.Parameters.Add( _
 New SqlParameter("@StudentID", SqlDbType.Int))
parmSql.Value = intStudentID

daSql.Fill(dsTs, "Class")
In this particular case, the name of the DataTable is passed as the second argument to the Fill method because no explicit table mapping was created. As a result the usp_GetTranscript stored procedure must produce columns with the same names as the DataColumn objects added to the table in Listing 7.8. If this is not the case the developer writing the code in the previous snippet must create a table mapping.
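Such a mapping might look like the following sketch; the database column names shown are hypothetical, and DataTableMapping lives in the System.Data.Common namespace:

```vb
Imports System.Data.Common

' Map the default source table "Table" to the Class DataTable and map
' hypothetical database column names to the DataSet column names
Dim map As DataTableMapping = daSql.TableMappings.Add("Table", "Class")
map.ColumnMappings.Add("course_num", TranscriptData.COURSE_NUM)
map.ColumnMappings.Add("course_desc", TranscriptData.COURSE_DESC)

' With the mapping in place, Fill no longer needs the table name
daSql.Fill(dsTs)
```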

The resulting XSD and XML data returned by the Xml property of the DataSet for a particular student can be seen in Listing 7.9.

Listing 7.9  Completed schema and data for a transcript. This listing shows the XSD schema produced in Listing 7.7 along with data returned from a stored procedure.

<qed:QuilogyTranscript xmlns:qed="http://www.quilogy.com/education">
 <xsd:schema id="QuilogyTranscript"
   targetNamespace="http://www.quilogy.com/education"
   xmlns="http://www.quilogy.com/education"
   xmlns:xsd="http://www.w3.org/2001/XMLSchema"
   xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
  <xsd:element name="Class">
   <xsd:complexType>
    <xsd:sequence>
     <xsd:element name="Date" msdata:Caption="Class Date"
       type="xsd:dateTime"/>
     <xsd:element name="City" type="xsd:string"/>
     <xsd:element name="Instructor" minOccurs="0" type="xsd:string"/>
     <xsd:element name="InstructorEmail" minOccurs="0" type="xsd:string"/>
     <xsd:element name="VendorName" msdata:Caption="Vendor Name"
       minOccurs="0" type="xsd:string"/>
     <xsd:element name="CourseNum" msdata:Caption="Course Number"
       type="xsd:string"/>
     <xsd:element name="CourseDesc" msdata:Caption="Description"
       minOccurs="0" type="xsd:string"/>
     <xsd:element name="Days" default="0" minOccurs="0"
       type="xsd:unsignedByte"/>
     <xsd:element name="StudentOrganization" msdata:Caption="Your Organization"
       minOccurs="0" type="xsd:string"/>
     <xsd:element name="InstructorName" minOccurs="0" type="xsd:string"/>
    </xsd:sequence>
    <xsd:attribute name="StudentID" type="xsd:int"/>
   </xsd:complexType>
  </xsd:element>
  <xsd:element name="QuilogyTranscript" msdata:IsDataSet="true">
   <xsd:complexType>
    <xsd:choice maxOccurs="unbounded">
     <xsd:element ref="Class"/>
    </xsd:choice>
   </xsd:complexType>
  </xsd:element>
 </xsd:schema>
 <Class StudentID="3" xmlns="http://www.quilogy.com/education">
  <City>Des Moines</City>
  <CourseNum>827</CourseNum>
  <CourseDesc>SMS Core Technologies</CourseDesc>
 </Class>
 <Class StudentID="3" xmlns="http://www.quilogy.com/education">
  <CourseDesc>Implementing a Database Design on SQL Server 2000</CourseDesc>
 </Class>
</qed:QuilogyTranscript>

Although this approach provides a strongly typed DataSet with a predefined structure, you'll notice that it does not provide strongly typed access to the actual columns within the DataSet. As a result you're vulnerable to run-time errors if typos are introduced when accessing columns. For example, to retrieve the CourseNum from the second row of the DataSet shown in Listing 7.9 the following code snippet could be used:

strCourse = dsTs.ClassTable.Rows(1).Item("CourseNum")

Obviously, if the literal string shown in this snippet were mistyped, the compiler would not be able to catch the error and an exception would be thrown at runtime. In addition, IntelliSense will not show a drop-down list of the available columns when typing statements such as these.

However, with the addition of the public constants for the column names, typing this line of code is a little easier:

strCourse = dsTs.ClassTable.Rows(1).Item(TranscriptData.COURSE_NUM)

In either case, and more importantly, if the code were setting the CourseNum field, the data would not be type-checked at compile time, another potential source of errors. As a result, this approach can be considered a technique midway between using an untyped DataSet, as discussed in the first part of this section, and a fully typed DataSet, discussed later in the chapter.

The advantage to this approach is that while the TranscriptData class and its DataTable are well defined, the columns can be changed rather easily by modifying the code in the private BuildTables method and adding constants. If the structure of the DataSet changes, it will certainly affect code that uses it; however, the public interface of the TranscriptData class will not change. This increases flexibility while still providing a strongly typed class.

Using the XML Designer

The final technique for creating a DataSet is to use the XML Designer included in VS.NET. This tool enables you to graphically create the schema and then save it as an XSD file.

By right-clicking on the project in the VS.NET project explorer and selecting Add Class, the resulting dialog allows you to select XSD schema. Within the designer you can right-click to add elements, attributes, types, and groups. You can also add relationships between elements to create a hierarchical schema. For example, the ScheduleData schema was created using the designer, a portion of which you can see in Figure 7.5.

Figure 7.5 The XML Designer. Here is the designer surface where elements are created and associated with other elements graphically.

By selecting the Generate DataSet option from the Context menu the appropriate attributes are added so that the schema can be loaded as a DataSet.

After the schema has been successfully created with the XML Designer or another tool, it can be read into a DataSet to create the underlying DataSet collections using the ReadXmlSchema method of the DataSet class. This method has several overloaded signatures to accept input from a TextReader, Stream, XmlReader, or a file on the file system. For example, a typed DataSet such as TranscriptData could use the ReadXmlSchema method in its constructor to create the DataSet structure rather than manipulating the collections programmatically.

Public Class TranscriptData
 Inherits DataSet

 Public Sub New()
  ' Initialize a TranscriptData instance by building the table
  MyBase.New()

  ' Create the tables in the dataset by reading the schema
  ' (the file name shown here is illustrative)
  Me.ReadXmlSchema("TranscriptData.xsd")
 End Sub
End Class

The advantage to this approach is that if changes are required to the schema, they can be made in a file rather than having to modify code and recompile the class. The disadvantage is the extra file system access required. Therefore this technique is not recommended for online applications, although for applications such as a Windows service it can be used to increase maintainability and flexibility.


As discussed in a previous section, the ReadXml method is used to read the XML document into the DataSet. For example, if a Windows service is written to process XML documents it might first create an instance of the DataSet and populate the schema by calling ReadXmlSchema. Each document is then loaded using the ReadXml method. If the documents do not conform to the schema, an exception will be thrown and the service can then take the appropriate action.

Finally, just as you can read a schema and populate a DataSet from an XML file or stream you can also write both the schema and data using the WriteXmlSchema and WriteXml methods of the DataSet class. For example, to write both the schema and data to a file, call the WriteXml method as follows:

Dim dsTs As New TranscriptData()

dsTs.WriteXml("transcripts.xml", XmlWriteMode.WriteSchema)

The optional second argument to the WriteXml method, an XmlWriteMode enumeration value, affects how the data is written to the file; it can also be set to IgnoreSchema (to write only the data) or DiffGram.


When using the DiffGram value of XmlWriteMode, the XML document produced is called a DiffGram and contains elements that show the before and after states of the data. A DiffGram can be used to create an XML representation of the DataSet that includes changes to pass to another method for update. In addition, the XML grammar used is also supported by SQL Server 2000, so DiffGrams can be passed directly to SQL Server 2000 to update tables in a database. Note that the ReadXml method also supports the XmlReadMode value DiffGram so that a DiffGram can be loaded into a DataSet.
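A minimal round-trip sketch (the file name is illustrative):

```vb
Imports System.Data

Dim dsTs As New TranscriptData()
' ... after rows have been added or modified ...

' Write the DataSet, including original and current row versions
dsTs.WriteXml("transcriptChanges.xml", XmlWriteMode.DiffGram)

' Later, load the changes back into a DataSet with the same schema
Dim dsCopy As New TranscriptData()
dsCopy.ReadXml("transcriptChanges.xml", XmlReadMode.DiffGram)
```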

While the WriteXml method supports writing only to a file, the overloaded WriteXmlSchema method can write to a Stream, TextWriter, XmlWriter, or file.

Strongly Typed DataSets

As discussed in the previous section and shown in Listing 7.8, creating a typed DataSet by inheriting from the DataSet class allows you to extend the DataSet while encapsulating the structure and functionality. However, the TranscriptData class is only a partially typed DataSet because you can reference the class itself and its DataTable through strongly typed variables but not the rows or columns. Doing so would increase the readability and ease of working with the DataSet.

Creating a Strongly Typed DataSet

In order to create a strongly typed DataSet you could manually extend the TranscriptData class to include subclasses for the DataRow and DataColumn objects exposed by the DataSet. However, the .NET Framework simplifies this by including the XML Schemas/DataTypes support utility (XSD.exe) that can, among other things, create a derived DataSet from an XSD or XDR (XML-Data Reduced) schema.


The XSD utility can also be used to infer XSD schemas from XML documents and convert XDR schemas into XSD schemas.

To illustrate the use of XSD.exe, consider the XSD schema shown in Listing 7.10. This schema was created to correspond to the data that is captured when registering one or more students for a class at Quilogy.

Listing 7.10  XSD schema for enrollment data. This schema includes the data captured for an enrollment in a Quilogy class.

<xsd:schema id="EnrollmentData" targetNamespace="" xmlns=""
  xmlns:xsd="http://www.w3.org/2001/XMLSchema">
 <xsd:simpleType name="YesNo">
  <xsd:restriction base="xsd:string">
   <xsd:enumeration value="Y"/>
   <xsd:enumeration value="N"/>
  </xsd:restriction>
 </xsd:simpleType>
 <xsd:element name="Enrollment">
  <xsd:complexType>
   <xsd:sequence>
    <xsd:element name="Cost" minOccurs="1" maxOccurs="1" type="xsd:decimal"/>
    <xsd:element name="Student">
     <xsd:complexType>
      <xsd:sequence>
       <xsd:element name="FName" minOccurs="0" type="xsd:string"/>
       <xsd:element name="LName" minOccurs="0" type="xsd:string"/>
       <xsd:element name="Organization" minOccurs="0" type="xsd:string"/>
       <xsd:element name="Comments" minOccurs="0" type="xsd:string"/>
       <xsd:element name="ContactInfo">
        <xsd:complexType>
         <xsd:sequence>
          <xsd:element name="Address" minOccurs="1" type="xsd:string"/>
          <xsd:element name="City" minOccurs="1" type="xsd:string"/>
          <xsd:element name="State" minOccurs="1" type="xsd:string"/>
          <xsd:element name="Zip" minOccurs="1" type="xsd:string"/>
          <xsd:element name="Phone" minOccurs="1" type="xsd:string"/>
          <xsd:element name="Email" minOccurs="0" type="xsd:string"/>
         </xsd:sequence>
        </xsd:complexType>
       </xsd:element>
      </xsd:sequence>
      <xsd:attribute name="Existing" use="required" type="YesNo"/>
      <xsd:attribute name="ID" type="xsd:int"/>
     </xsd:complexType>
    </xsd:element>
   </xsd:sequence>
   <xsd:attribute name="ClassID" type="xsd:int" use="required"/>
   <xsd:attribute name="WebEnroll" type="YesNo" use="required"/>
   <xsd:attribute name="PaymentType" type="xsd:string" use="required"/>
  </xsd:complexType>
 </xsd:element>
 <xsd:element name="QuilogyRegistration">
  <xsd:complexType>
   <xsd:sequence>
    <xsd:element ref="Enrollment" minOccurs="1" maxOccurs="unbounded"/>
   </xsd:sequence>
  </xsd:complexType>
 </xsd:element>
</xsd:schema>

Note that this schema includes both student and class data that reflect the person taking the course and the class they signed up for. To create a strongly typed class from this schema the XSD.exe utility can be run at the command line as follows:

xsd /l:vb /d EnrollmentData.xsd

where the /l option specifies the language in which to create the DataSet and /d indicates that we want to create a class derived from DataSet. Alternatively, the utility can create simple classes by specifying the /c option; the resulting classes use the System.Xml.Serialization namespace to serialize instances to XML that conforms to the schema.


The full code for the EnrollmentData class and other code not shown in the listings can be found on the Sams Web site.

The resulting .vb file contains a single high-level class called EnrollmentData derived from the DataSet class, with the structure as viewed in the Class Explorer window in VS.NET (see Figure 7.6). To get an understanding of how the tool creates the DataSet, the classes and their descriptions are listed in Table 7.8.

Figure 7.6 A strongly typed DataSet. This is the Class Explorer view in VS.NET for the EnrollmentData class generated by the XSD utility.

Table 7.8  Strongly typed DataSet. This table lists the classes created by the XSD utility for the schema in Listing 7.10.

EnrollmentData
Highest-level class derived from DataSet and implementing IEnumerable. Includes a private InitClass method that creates the table structure and relationships. Exposes DataTable objects as properties.

StudentDataTable
Child class of EnrollmentData derived from DataTable and implementing IEnumerable. Exposes each DataColumn as a property and declares each of the public events. Contains a private InitClass method used to create the columns. Includes AddStudentRow, NewStudentRow, and RemoveStudentRow methods.

StudentRow
Child class of EnrollmentData derived from DataRow. Exposes each column as a strongly typed property. Includes methods to handle database null values and return all related ContactInfoRow objects.

StudentRowChangeEvent
Child class of EnrollmentData derived from EventArgs. Passes data for the events associated with StudentDataTable.

EnrollmentDataTable
Analogous to StudentDataTable.

EnrollmentRow
Analogous to StudentRow.

EnrollmentRowChangeEvent
Analogous to StudentRowChangeEvent.

ContactInfoDataTable
Analogous to StudentDataTable.

ContactInfoRow
Analogous to StudentRow.

ContactInfoRowChangeEvent
Analogous to StudentRowChangeEvent.


By default, the XSD tool also creates a set of three classes for the QuilogyRegistration element. However, because Enrollment is the highest-level repeating element, the QuilogyRegistration classes can be removed from the file.

As you can see from Table 7.8, the EnrollmentData class uses a series of child classes to implement each of the DataTable and DataRow objects. In this case, three tables are created: one that represents the top-level Enrollment element (EnrollmentDataTable); one that represents each Student element within an Enrollment element (StudentDataTable); and a third that represents the ContactInfo element within the Student element (ContactInfoDataTable).

As discussed in the previous section, DataTable objects are related within a DataSet using a DataRelation. In this case, the parent/child relationships of EnrollmentDataTable to StudentDataTable and StudentDataTable to ContactInfoDataTable are handled by two private DataRelation objects created in the InitClass method of EnrollmentData as illustrated in the following snippet:

' Class level declarations
Private relationEnrollment_Student As DataRelation
Private relationStudent_ContactInfo As DataRelation

' In InitClass of EnrollmentData
Me.relationEnrollment_Student = New DataRelation("Enrollment_Student", _
 New DataColumn() {Me.tableEnrollment.Enrollment_IdColumn}, _
 New DataColumn() {Me.tableStudent.Enrollment_IdColumn}, False)
Me.relationEnrollment_Student.Nested = True

Me.relationStudent_ContactInfo = New DataRelation("Student_ContactInfo", _
 New DataColumn() {Me.tableStudent.Student_IdColumn}, _
 New DataColumn() {Me.tableContactInfo.Student_IdColumn}, False)
Me.relationStudent_ContactInfo.Nested = True

Notice that the relationships are created based on the columns, Enrollment_Id and Student_Id, added to the appropriate DataTables. These columns act as the primary and foreign keys that implement the relationships. To be sure these columns do not appear in the XML document and to adhere to the schema, both of these columns are created with a MappingType of Hidden. In addition, the primary key columns, Enrollment_Id in EnrollmentDataTable and Student_Id in StudentDataTable, are set to be auto-incrementing. In this way, each row in the DataTable is assigned a unique primary key. The following snippet shows the creation of the Student_Id column in the StudentDataTable's InitClass method:

Me.columnStudent_Id = New DataColumn("Student_Id", _
 GetType(System.Int32), "", MappingType.Hidden)
Me.columnStudent_Id.AutoIncrement = true
Me.columnStudent_Id.AllowDBNull = false
Me.columnStudent_Id.Unique = true
Me.PrimaryKey = New DataColumn() {Me.columnStudent_Id}

Besides the properties shown you can also set the initial value (defaulted to 0) and increment (defaulted to 1) for the column using the AutoIncrementSeed and AutoIncrementStep properties, respectively. Note that this unique key is found only within the client side data cache (DataSet) and is not propagated back to the database. In other words, it is simply used as a tool for relating DataTable objects within the DataSet.
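For example, a common convention, shown here as a sketch against the column created above, is to use negative seed and step values so that keys generated in the client-side cache can never collide with positive keys already assigned by the database:

```vbnet
' Count down from -1 so temporary client-side keys never
' overlap positive IDENTITY values assigned by the server
Me.columnStudent_Id.AutoIncrementSeed = -1
Me.columnStudent_Id.AutoIncrementStep = -1
```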

Using a Strongly Typed DataSet

After the DataSet has been created, a client can access it through the distinct types exposed by the class. This allows for compile-time checking of the code and the assistance of IntelliSense. For example, using the EnrollmentData class produced above, the code in Listing 7.11 can be used to add a new enrollment to the DataSet.

Listing 7.11  Using the DataSet. This example shows how to use a strongly typed DataSet.

Imports Quilogy.Education

Dim dsEnrollment As New EnrollmentData
Dim drEnroll As EnrollmentData.EnrollmentRow
Dim drStudent As EnrollmentData.StudentRow
Dim drContactInfo As EnrollmentData.ContactInfoRow
Dim drStudents() As EnrollmentData.StudentRow
Dim strName As String

' Add a new enrollment
drEnroll = dsEnrollment.Enrollment.AddEnrollmentRow(1745,1234,"Y","CC")

' Add a student
drStudent = dsEnrollment.Student.AddStudentRow("Sammy", "Sosa", _
  "Chicago Cubs", "There better be coffee", "N", 0, drEnroll)

' Add the contact information for the student
drContactInfo = dsEnrollment.ContactInfo.NewContactInfoRow
drContactInfo.Address = "3345 North Shore Drive"
drContactInfo.City = "Chicago"
drContactInfo.State = "IL"
drContactInfo.Zip = "43211"
drContactInfo.Phone = "3145551212"
drContactInfo.Email = "ssosa@cubs.com"
drContactInfo.StudentRow = drStudent
dsEnrollment.ContactInfo.AddContactInfoRow(drContactInfo)


' Retrieve the students for the enrollment
drStudents = drEnroll.GetStudentRows
strName = drStudents(0).FName & " " & drStudents(0).LName

' Query the user and accept or reject the changes
If MsgBox("Add enrollment for " & strName & " to class " & _
 drEnroll.ClassID & "?", MsgBoxStyle.YesNo Or MsgBoxStyle.Question, _
 "ClassID") = MsgBoxResult.Yes Then
 dsEnrollment.AcceptChanges()
Else
 dsEnrollment.RejectChanges()
End If

You'll notice that in Listing 7.11 two distinct techniques for populating the DataTable objects are provided using overloaded methods such as AddStudentRow. First, a version of the method is created for each class that accepts arguments that map to the various columns. Internally this method uses the ItemArray property to set all the columns in a DataRow by passing an array of Object types in a single statement. It can also be used to retrieve the contents of a DataRow into an array of objects. The second technique, as shown when adding the contact information, uses the other method signature, in this case AddContactInfoRow, which accepts the ContactInfoRow as the parameter. In this case, the row is prepopulated using the strongly typed properties of the derived DataRow class. Note that the ContactInfoRow class exposes a StudentRow property that is used to call the SetParentRow method of the DataRow using a DataRelation object.

Listing 7.11 also exercises the methods used to return an array of child rows using the private DataRelation objects discussed earlier. In this case, the GetStudentRows method returns an array of StudentRow objects associated with the EnrollmentRow. Internally this method calls the GetChildRows method of the DataSet and passes it the relation through which to get the rows.

Finally, the client code uses a message box to query the user and invokes the AcceptChanges or RejectChanges methods of the DataSet as a result. As in a database, these methods provide a certain level of transactional behavior at the DataRow, DataTable, and DataSet levels. This is accomplished by manipulating the RowState property of the DataRow objects in the various DataTables. For example, AcceptChanges "commits" all the rows within the DataSet by changing the RowState property of any rows that are Added or Modified to the Unchanged value of the DataRowState enumeration while removing any rows that have been Deleted. Conversely, RejectChanges returns the RowState of all Modified and Deleted rows to Unchanged and removes any Added rows.
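These transitions are easy to observe in a small sketch using an arbitrary DataTable (the table and column names here are illustrative):

```vbnet
Dim dt As New DataTable("Student")
dt.Columns.Add("LName", GetType(String))

Dim dr As DataRow = dt.NewRow()
dr("LName") = "Sosa"
dt.Rows.Add(dr)
Console.WriteLine(dr.RowState.ToString())  ' Added

dt.AcceptChanges()                         ' "Commit" the new row
Console.WriteLine(dr.RowState.ToString())  ' Unchanged

dr("LName") = "Sandberg"                   ' Row becomes Modified
dt.RejectChanges()                         ' Roll the edit back
Console.WriteLine(dr.RowState.ToString())  ' Unchanged
```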


The DataRow also supports the BeginEdit and EndEdit methods that put the row in edit mode and take it out again. These are used to suspend events and allow multiple changes to be made to the row without validation occurring. Both methods are also called implicitly: BeginEdit when a column value is first changed, and the edit is ended when AcceptChanges or RejectChanges is called.
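As a brief sketch, explicit edit mode looks like this (drStudent is assumed to be a row like the one created in Listing 7.11):

```vbnet
drStudent.BeginEdit()        ' Suspend events and validation
drStudent("FName") = "Sammy"
drStudent("LName") = "Sosa"
drStudent.EndEdit()          ' Apply the changes to the row's current values
```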

After the DataSet has been populated, the XML produced from Listing 7.11 is shown here:

<QuilogyRegistration xmlns="http://www.quilogy.com/education">
 <Enrollment ClassID="1234" WebEnroll="Y" PaymentType="CC">
  <Student Existing="N">
   <Organization>Chicago Cubs</Organization>
   <AddressLine>3345 North Shore Drive</AddressLine>
   <Comments>Make sure to have hot coffee available</Comments>

Having seen the different techniques for creating DataSets, you may be wondering which of them to use. Obviously, that question can only be answered by your particular project. However, several considerations include the size of the project, its expected lifetime, and the skill level of the development staff. Generally, the larger the project (meaning the wider reach it has within the organization) and the longer it is expected to function, the more you would move toward a strongly typed approach. This is because you can more easily justify the increased maintenance costs and will likely expose the classes to developers with a wider range of skills. However, for the majority of projects, a weakly typed DataSet approach like that shown in Listing 7.8 is warranted because it provides the primary benefits of the DataSet while still allowing for flexibility. As a result, in the remainder of this chapter we'll use DataSet classes based on that approach.

XML Integration

As should be evident from this chapter, the DataSet provides a relational view of data using an XML data store. While this chapter has spent some time reviewing the relationships between the structure of the DataSet and the XSD schema and XML produced, you can use a DataSet quite apart from any reference to XSD and XML in both weakly and strongly typed implementations. In fact, for developers not familiar with XML, this provides a great layer of abstraction.

On the other hand, many developers have worked with XML documents programmatically using the various versions of the Microsoft XML Parser (MSXML) and Document Object Model (DOM). As will be discussed in Chapter 13, those developers can still use the DOM through the System.Xml classes, the most important of which is XmlDocument. However, up until now those same developers had to write their own translation layer when attempting, for example, to read an XML document and save the contents to a relational database. Fortunately, the Services Framework integrates the programming models provided by the relational view of data using DataTable and DataRow objects in ADO.NET and the node-based hierarchical view using the DOM. This intermediary is the XmlDataDocument class, which resides in the System.Xml namespace.

Using XmlDataDocument

At the most basic level, the XmlDataDocument provides an XML view of a DataSet in a separate object so that it can be manipulated using the DOM. However, the XmlDataDocument can also be loaded with data independently, resulting in the creation of a DataSet and relational schema when the DataSet property of the object is accessed. Keep in mind that the XmlDataDocument and DataSet are separate objects at runtime and are kept in sync using events. As a result, creating an XmlDataDocument based on a large DataSet incurs extra overhead.


You can think of the XmlDataDocument as a more robust way of loading XML into a DataSet. Rather than using the ReadXmlSchema, ReadXml, WriteXmlSchema, and WriteXml methods, you could instantiate and load an XmlDataDocument object and subsequently view it as a DataSet through the DataSet property.

To use an XmlDataDocument, you can optionally pass an instance of a DataSet to the constructor. By doing so, the XmlDataDocument is populated and can then be traversed using the DOM. Because XmlDataDocument is derived from XmlDocument, all the classes and methods familiar to users of the DOM are available. For example, the following code snippet instantiates a new XmlDataDocument with the dsEnrollment DataSet populated in Listing 7.11 and then navigates through the XML using the methods of the DOM.

Dim xData As New XmlDataDocument(dsEnrollment)
Dim xRoot As XmlElement
Dim xEnroll As XmlElement
Dim xStudent As XmlElement
Dim intClassID As Integer
Dim curCost As Decimal
Dim strLastName As String

xRoot = xData.DocumentElement

For Each xEnroll In xRoot.GetElementsByTagName("Enrollment")

 intClassID = xEnroll.Attributes("ClassID").Value
 curCost = xEnroll.FirstChild.FirstChild.Value

 For Each xStudent In xEnroll.GetElementsByTagName("Student")
  strLastName = xStudent.Item("LName").InnerText
 Next
Next

You can also go the other direction by loading the XmlDataDocument directly: the LoadDataSetMapping method loads the schema, and the Load or LoadXml methods load the XML itself. The data can then be manipulated using the DataSet property. The following snippet is the reverse of the previous one.

Dim xData As New XmlDataDocument
Dim drRow As DataRow
Dim drStudent As DataRow
Dim intClassId As Integer
Dim curCost As Decimal
Dim strLastName As String

' Load the schema with LoadDataSetMapping and the XML with Load (not shown)

For Each drRow In xData.DataSet.Tables("Enrollment").Rows
 intClassId = drRow.Item("ClassID")
 curCost = drRow.Item("Cost")

 For Each drStudent In drRow.GetChildRows("Enrollment_Student")
  strLastName = drStudent.Item("LName")
 Next
Next

Note that in this case the schema and the XML are loaded from files.

Once the initial mapping of the DataSet to the XmlDataDocument is complete it cannot be altered by calling LoadDataSetMapping a second time. In addition, if new nodes are added to the XmlDataDocument that do not correspond to rows in a table within the DataSet, the nodes will be added to the XmlDataDocument but not synchronized to the DataSet.

Although using the XmlDataDocument as an alternate view of a DataSet may at first seem unnecessary, it can be used to provide functionality not supported by the DataSet. As an example, consider an application that needs to automatically create multiple Web pages for an online catalog from data in a relational database each night. One technique for doing so is to fill a DataSet, view it through an XmlDataDocument, and then use the System.Xml.Xsl.XslTransform class to output the data in HTML format using an XSL style sheet.

The code in Listing 7.12 shows an example of a method used to transform a DataSet into HTML or XML using an XmlDataDocument and saving the resulting document to a file on the file system.

Listing 7.12  Using the XmlDataDocument. This listing shows how you can use the XmlDataDocument to programmatically transform a DataSet into an HTML document.

Public Sub TransformDS(ByRef dsData As DataSet, ByVal pXSLFile As String, _
 ByVal pDestFile As String)

 Dim xData As XmlDataDocument
 Dim xTrans As New XslTransform
 Dim xWriter As XmlTextWriter

 Try
  ' Load the stylesheet
  xTrans.Load(pXSLFile)
 Catch e As Exception
  WriteToLog("Could not load XSL file " & pXSLFile & " : " & e.Message)
  Exit Sub
 End Try

 Try
  ' Create an XmlTextWriter to write to the file
  xWriter = New XmlTextWriter(pDestFile, Nothing)

  ' Populate the XmlDataDocument and do the transform
  xData = New XmlDataDocument(dsData)
  xTrans.Transform(New DocumentNavigator(xData), Nothing, xWriter)
  xWriter.Close()
 Catch e As Exception
  WriteToLog("Could not write data to " & pDestFile & " : " & e.Message)
 End Try
End Sub

In this example, notice that the XslTransform object xTrans first loads the style sheet passed in as an argument using the Load method. After opening the destination file using an XmlTextWriter object, the XmlDataDocument object is populated by passing the DataSet as an argument to the constructor. The XmlDataDocument, in this case xData, is then passed to the constructor of a DocumentNavigator object in the Transform method. The DocumentNavigator is used internally by the Transform method to navigate through an XML document using a cursor model. The second parameter to Transform, here set to Nothing, can be used to specify a list of parameters that are fed into the XSL style sheet to allow for dynamic execution.

To use the method in Listing 7.12 the client application would simply create the DataSet and pass it to the method.

Dim cnSQL As SqlConnection
Dim daSQL As SqlDataAdapter
Dim dsCourses As New DataSet("Offerings")

cnSQL = New SqlConnection(strConnect)

' Create the adapter and set the SelectCommand through the constructor
daSQL = New SqlDataAdapter("usp_ListCourses", cnSQL)
daSQL.SelectCommand.CommandType = CommandType.StoredProcedure

' Populate the DataSet
daSQL.Fill(dsCourses)

' Transform and save
TransformDS(dsCourses, "CourseList.xsl", "Courses.htm")

Handling Data Access

Besides custom DataSet classes, the data services tier shown in Figure 7.4 consists of data access classes. These classes are responsible for communicating with the backend database by abstracting the calls to the managed provider. They are called from both the presentation and business services tiers.

When communicating with the database, the data access classes will use stored procedures exclusively. Besides the performance and abstraction benefits already mentioned, by allowing stored procedures to be the gatekeepers of the data, security can be enhanced and some business logic or data validation can be introduced at the data source. Once again, the primary benefit of using stored procedures rather than inline SQL is that the code running on a middle-tier server need not generate SQL or have an understanding of the database schema.

Specifically, in the Quilogy education example application there are data access classes for each of the major namespaces shown in Figure 4.1. In the remainder of this chapter we'll focus on the Students class, which is responsible for manipulating student data within the database. The primary methods of this class are shown in Table 7.9.

Table 7.9  Students data access class. This table lists the major methods of the class responsible for manipulating student data.




Method that returns the TranscriptData class for a particular student.


Validates a student login and returns the student's demographic information in a StudentData class.


Allows a student to change his or her password.


Both inserts and updates student data through a StudentData class.


Deletes a student from the database.


Determines whether a student exists in the database given their e-mail address and last name.


Returns the number of classes taken by a particular student.

Obviously other methods could be added to this list, for example, querying a list of students based on some criteria, but those shown here represent the core functionality required for the online application.

To get a feel for the structure of a data access class, the code for the class and the GetTranscript method is shown in Listing 7.13.

Listing 7.13  Students data access class. This class encapsulates the data access code for student data.

Option Explicit On
Option Strict On

Imports System
Imports System.Data
Imports System.Data.SqlClient
Imports Quilogy.Education

Namespace Quilogy.Education.Enrollment

Public Class Students
' Students Data Access Class

 Private mcnSql As SqlConnection

 Public Class StudentsException
  Inherits ApplicationException
 ' Custom exception for arguments

   Public Sub New(ByVal pMessage As String)
    MyBase.New(pMessage)
   End Sub

 End Class

 Public Sub New(ByVal pConnect As String)
  ' Pass in the connection string
  ' We'll change this if running in COM+
  mcnSql = New SqlConnection(pConnect)
 End Sub

 Public Function GetTranscript(ByVal pStudentID As Integer) As TranscriptData
 ' Fill a dataset with transcript data based on the student ID

   Dim dsTs As New TranscriptData
   Dim daSql As SqlDataAdapter
   Dim parmSql As SqlParameter

   ' Check the incoming parameters
   If pStudentID <= 0 Then
    Throw New StudentsException("StudentID cannot be less than or equal to 0")
   End If

  ' Setup the data adapter
  daSql = New SqlDataAdapter("usp_GetTranscript", mcnSQL)

   ' Call the stored procedure
   daSql.SelectCommand.CommandType = CommandType.StoredProcedure
   parmSql = daSql.SelectCommand.Parameters.Add( _
    New SqlParameter("@StudentID", SqlDbType.Int))
   parmSql.Value = pStudentID

   ' Populate the DataSet, ignoring any columns not defined in its schema
   daSql.MissingSchemaAction = MissingSchemaAction.Ignore
   daSql.Fill(dsTs)

   Return dsTs

 End Function
End Class

End Namespace

Note that the class exists in the Quilogy.Education.Enrollment namespace and must import not only the standard namespaces such as System.Data and System.Data.SqlClient, but also the Quilogy.Education namespace because the TranscriptData class exists there.

The Students class includes the child class StudentsException derived from ApplicationException and used to throw exceptions when data passed to the class is in error. This allows client code to differentiate between errors that may be correctable and those that originated with the managed provider or backend database. You'll also notice that the constructor of this class accepts an argument used as the ConnectionString of the SqlConnection object stored at the class level.


Although there are several ways to pass the connection string to a class (for example by referencing a separate class created explicitly for configuration settings), this approach is flexible in situations where the Students class may be used in a variety of scenarios. As we'll see in Chapter 8, if the class uses COM+ services it can take advantage of object construction where the connection string is configured in the COM+ application.

The GetTranscript method first checks the incoming parameter and may throw the StudentsException if it is invalid. The method then uses the familiar SqlDataAdapter to execute the stored procedure as discussed previously in this chapter. Note once again that the stored procedure was specifically created to map to the columns of the DataTable within TranscriptData. If the stored procedure cannot be created to match the DataSet a TableMapping must be created before the Fill method is called. To ensure that the DataTable contains only the columns specified in the TranscriptData class, the MissingSchemaAction property is set to Ignore so that any additional columns returned from the stored procedure are not added to the DataSet.

A client using the Students class would contain code like the following:

Imports Quilogy.Education.Enrollment
Imports Quilogy.Education
Dim objStudents As New Students(strConnect)
Dim dsTranscript As TranscriptData
Dim intStudentID As Integer

Try
 dsTranscript = objStudents.GetTranscript(intStudentID)
Catch e As Students.StudentsException
 ' Possibly prompt the user again or try to repopulate intStudentID
 Exit Sub
Catch e As Exception ' Must be a database error
 ' Display an error to the user
 Exit Sub
End Try

Both of the Quilogy.Education namespaces must be imported because the Students class is found in one and the TranscriptData class in another. Note also that the Try block contains two Catch statements, the first to catch any StudentsExceptions that occur because of invalid parameters and the second to catch any database errors.

Modifying Data

One of the most important responsibilities of the data access classes is to modify data in the underlying database. In the Students class, the Save method is exposed to handle both the insertion of a new student and the update of demographic information. The entire method is shown in Listing 7.14.

Listing 7.14  The Save method. This method calls a stored procedure to insert or update a student in the database.

Public Function Save(ByVal pStudent As StudentData) As Integer
' Handles both inserts and updates through the proc and returns the new
' or existing StudentID

 Dim cmSql As SqlCommand
 Dim parmSql As SqlParameter

 ' Check the incoming parameter
 If pStudent Is Nothing Then
  Throw New StudentsException("No data found!")
 End If

 ' Extract the single student row (assumes one student in the DataSet)
 Dim drStudent As DataRow = pStudent.Tables("Student").Rows(0)

 ' Create the command and the parms and extract the data
 cmSql = New SqlCommand("usp_SaveStudent", mcnSql)
 With cmSql
  .CommandType = CommandType.StoredProcedure
  .Parameters.Add(New SqlParameter("@StudentID", SqlDbType.Int))
  .Parameters("@StudentID").Value = drStudent.Item("StudentID")
  .Parameters.Add(New SqlParameter("@FName", SqlDbType.VarChar))
  .Parameters("@FName").Value = drStudent.Item("FName")
  .Parameters.Add(New SqlParameter("@LName", SqlDbType.VarChar))
  .Parameters("@LName").Value = drStudent.Item("LName")
  .Parameters.Add(New SqlParameter("@Company", SqlDbType.VarChar))
  .Parameters("@Company").Value = drStudent.Item("Company")
  .Parameters.Add(New SqlParameter("@Email", SqlDbType.VarChar))
  .Parameters("@Email").Value = drStudent.Item("Email")
  .Parameters.Add(New SqlParameter("@Address", SqlDbType.VarChar))
  .Parameters("@Address").Value = drStudent.Item("Address")
  .Parameters.Add(New SqlParameter("@City", SqlDbType.VarChar))
  .Parameters("@City").Value = drStudent.Item("City")
  .Parameters.Add(New SqlParameter("@State", SqlDbType.VarChar))
  .Parameters("@State").Value = drStudent.Item("State")
  .Parameters.Add(New SqlParameter("@ZipCode", SqlDbType.VarChar))
  .Parameters("@ZipCode").Value = drStudent.Item("ZipCode")
  .Parameters.Add(New SqlParameter("@Phone", SqlDbType.VarChar))
  .Parameters("@Phone").Value = drStudent.Item("Phone")
 End With

 ' Set up the return value
 parmSql = New SqlParameter("RETURN", SqlDbType.Int)
 parmSql.Direction = ParameterDirection.ReturnValue
 cmSql.Parameters.Add(parmSql)

 ' Call the proc
 mcnSql.Open()
 cmSql.ExecuteNonQuery()
 mcnSql.Close()

 ' Return the new or old student id
 Return CType(parmSql.Value, Integer)

End Function

In Listing 7.14 notice that because no DataSet is being returned from the method, the SqlCommand object is used to execute a stored procedure to perform the insert or update. After instantiating the SqlCommand, the parameters are added to the Parameters collection. Note that the SqlParameter class supports several overloaded constructors, one of which allows you to specify all the important properties. In this case, however, the method provides only the name of the parameter and the data type. The Direction property defaults to ParameterDirection.Input, so only the Value property must additionally be set. This is accomplished by accessing the data in the StudentData class (not shown). It is assumed that only one student exists in the StudentData DataSet, although modifying the code to handle several students using a loop is straightforward.

The final parameter added after the With block corresponds to the return value of the stored procedure. SQL Server stored procedures can return a 32-bit integer value indicating status or other application-specific information. In this case, the usp_SaveStudent procedure returns the new or existing StudentID assigned to the student. Within SQL Server this value is generated by adding the IDENTITY property to the StudentID column in the table. Unlike in classic ADO, however, the parameter specifying the return value does not have to be the first added to the Parameters collection and need not have a specific name.

The SqlConnection is then opened and the procedure executed using the ExecuteNonQuery method of the SqlCommand object. This method indicates that the command will not return a resultset, resulting in more efficient execution. The new or existing StudentID is then returned by accessing the Value property of the parameter and converting it to an Integer.

One of the interesting aspects of this technique is that the determination of whether to perform an INSERT or an UPDATE statement in the database ultimately rests with the stored procedure rather than the Save method. This simplifies the calling convention for client applications and requires one fewer stored procedure per table.

An alternative approach to the one shown here is to instantiate a SqlDataAdapter in the Save method, populate the InsertCommand and UpdateCommand properties with stored procedures, and then call the Update method much as was done in Listing 7.3. While this would handle multiple rows without adding a loop, it would entail writing a few more lines of code to configure the commands. You would also have to be certain that the RowState properties of the individual rows were set correctly because they determine which command is executed for each row.

Although space prohibits a more complete discussion, the approach described in the previous paragraph is more suited to true batch scenarios where an application may insert, update, or delete multiple rows in the DataSet and then want to synchronize them with the database. In these scenarios, the code responsible for updating the data store uses the GetChanges method of the DataSet to create a copy of the DataSet that contains either all the changed rows or those filtered by the DataRowState enumeration. It should then check the HasErrors property of each DataTable to determine whether there are validation errors on any of the rows. If so, the HasErrors property of each DataRow can be inspected to find those rows that have errors, although a quicker way to return rows that contain errors is to call the GetErrors method of the DataTable, which returns an array of DataRow objects. For each of these rows, the RowError property will contain a string specifying the error.

Once the errors have been corrected, the copy created with GetChanges can be sent to a method in the data services tier. At this point the method can execute the Update method of a DataAdapter to perform the inserts, updates, and deletes. If the update completes successfully the AcceptChanges method can be called to reset the row states to unmodified. Conversely, RejectChanges can be called if errors occur. The data services tier may then return the DataSet to the presentation tier where it can be merged with the existing data using the Merge method.
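This batch flow might be sketched as follows; assume daSql is a DataAdapter whose InsertCommand, UpdateCommand, and DeleteCommand have already been configured, and dsEnrollment is the DataSet being edited (both names, and the "Student" table, are illustrative):

```vbnet
' Copy only the rows that have changed since the last AcceptChanges
Dim dsChanges As DataSet = dsEnrollment.GetChanges()

If Not dsChanges Is Nothing Then
 If dsChanges.HasErrors Then
  ' Find and report the offending rows
  Dim dr As DataRow
  For Each dr In dsChanges.Tables("Student").GetErrors()
   Console.WriteLine(dr.RowError)
  Next
 Else
  ' Synchronize with the database, then reset the row states
  daSql.Update(dsChanges, "Student")
  dsEnrollment.Merge(dsChanges)
  dsEnrollment.AcceptChanges()
 End If
End If
```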


If you write your own custom validation code in the presentation services tier, you can use the SetColumnError, GetColumnError, and ClearErrors methods of the DataRow to manipulate the error information.
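For example, a validation routine in the presentation tier might flag a column and later clear it (the column name and message are illustrative):

```vbnet
' Flag a column that failed a custom validation rule
drStudent.SetColumnError("Email", "An e-mail address is required")

If drStudent.HasErrors Then
 Console.WriteLine(drStudent.GetColumnError("Email"))
End If

' After the value has been corrected, remove the error information
drStudent.ClearErrors()
```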

System Assigned Keys

As pointed out in the previous section, this particular application relies on SQL Server IDENTITY values to generate system assigned keys to identify students, enrollments, courses, products, and other entities. These columns are in turn defined as the primary keys on their respective tables. However, in distributed applications, relying on keys generated from a single database may not be advisable in all scenarios.

One technique for bypassing IDENTITY values and yet retaining system assigned keys is to use Globally Unique Identifiers (GUIDs) for the keys.


Yes, GUIDs are difficult to read, and they increase the size of the data being passed around as well as the table size in SQL Server. However, your users shouldn't be seeing system-assigned keys, and the relative cost of both bandwidth and disk space is decreasing. While I'm not advocating the exclusive use of GUIDs, they make sense for some applications, particularly those that use data from distributed database servers or are built in such a way that returning the newly assigned key from the database server is difficult.

These 16-byte unique values (historically calculated using an algorithm based on the MAC address of the network card and other information) work particularly well with SQL Server because version 7.0 and higher support the UniqueIdentifier data type. ADO.NET supports GUIDs by allowing the DataSet to create these values automatically, much as a database allows default constraints on columns. The Services Framework also includes a Guid structure in the System namespace that can be used to generate new GUIDs on demand.
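For example, generating and formatting a new key takes only a couple of lines:

```vbnet
' Generate a new 16-byte globally unique key on demand
Dim gStudentID As Guid = Guid.NewGuid()

' GUIDs round-trip cleanly to and from strings
Console.WriteLine(gStudentID.ToString())
```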

The Quilogy education application can be modified to support GUIDs in two ways. First, a UniqueIdentifier must replace the primary key of the tables (for example Student). Listing 7.15 shows what the new table would look like.

Listing 7.15  Student table. This version of the Student table uses a UniqueIdentifier as the data type for the StudentID.

CREATE TABLE [Student] (
  [StudentID] [uniqueidentifier] NOT NULL
    CONSTRAINT [DF_StudentID] DEFAULT (NewID()),
  [FName] [varchar] (50) NULL ,
  [LName] [varchar] (50) NOT NULL ,
  [Company] [varchar] (50) NULL ,
  [ZipCode] [char] (10) NULL ,
  [City] [varchar] (50) NULL ,
  [State] [varchar] (50) NULL ,
  [Email] [varchar] (100) NULL ,
  [Address] [varchar] (1000) NULL ,
  [Pwd] [varchar] (10) NULL ,
  [Phone] [varchar] (12) NULL
)
Note that a default constraint is added to the StudentID column that invokes the NewID function. This Transact-SQL function generates a new GUID and will place it in the column if not provided in the INSERT statement.
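As a result, an INSERT statement can simply omit the StudentID column and let SQL Server supply the key. A quick sketch (the column values shown are illustrative only):

INSERT INTO Student (LName, Email)
VALUES ('Fox', 'dfox@quilogy.com')

Because the StudentID column is absent from the column list, the default constraint fires and NewID generates the key on the server.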

Second, the StudentData class can be modified to change the data type of the StudentID column from Integer to System.Guid. In the BuildTables method the code that adds the columns would be changed as follows where mdtStudent references the private DataTable object.

With mdtStudent.Columns
 .Add(STUDENT_ID, GetType(System.Guid))
 ' Add the other columns
End With

Finally, the DefaultValue property of the DataColumn can be set to the value returned by the NewGuid method of the System.Guid structure so that a GUID is supplied when a new row is added to the DataTable.

With mdtStudent
 .Columns(STUDENT_ID).DefaultValue = System.Guid.NewGuid()
 ' Set the other properties
End With

The benefit of this approach is that methods of the Students class such as Save do not have to return the StudentID to the calling code. Because the GUID is generated directly in the DataSet, the client already has access to it.
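For example, code working with the DataTable can add a new row and read the generated key immediately, with no round trip to the server. A sketch, reusing the mdtStudent and STUDENT_ID names assumed earlier in this section:

 Dim drNew As DataRow = mdtStudent.NewRow()
 drNew("LName") = "Fox"
 mdtStudent.Rows.Add(drNew)
 ' The system-assigned key is already populated on the client
 Dim gNewID As Guid = CType(drNew(STUDENT_ID), Guid)

NewRow populates the column from its DefaultValue, so the key is available before the row is ever sent to SQL Server.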

Returning Other Types of Data

Besides being able to pass input parameters to, and catch return values from, stored procedures, ADO.NET supports output parameters. For example, the CheckIfExists method of the Students class is used to determine whether a student already exists in the database given his e-mail address and last name. If so, the method returns the StudentID. Rather than have the stored procedure that implements this functionality return a resultset, it returns the StudentID as an output parameter. From SQL Server's perspective this is more efficient than creating a resultset. The Transact-SQL code for the stored procedure usp_StudentExists can be seen here:

CREATE PROC usp_StudentExists
@Email varchar(100),
@LName varchar(50),
@StudentID int OUTPUT
AS
SELECT @StudentID = StudentID
FROM Student
WHERE Email = @Email AND LName = @LName

While this method could also have been implemented using a return value, output parameters are useful when you have more than one value to return (what would otherwise require returning a single-row, multicolumn resultset) or when you need to return a single value of a data type other than Integer.
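By way of comparison, a return value is retrieved through a parameter whose Direction is set to ParameterDirection.ReturnValue. The following sketch assumes cmSql is a SqlCommand already configured for a stored procedure; the parameter name @RETURN_VALUE is arbitrary:

 Dim parmRet As SqlParameter
 parmRet = cmSql.Parameters.Add(New SqlParameter("@RETURN_VALUE", SqlDbType.Int))
 parmRet.Direction = ParameterDirection.ReturnValue
 cmSql.ExecuteNonQuery()
 Dim intResult As Integer = CType(parmRet.Value, Integer)

Keep in mind that SQL Server return values are always integers, which is precisely why output parameters are needed for other data types.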

The CheckIfExists method is shown in Listing 7.16.

Listing 7.16  Using output parameters. This method uses an output parameter to return the StudentID.

Public Function CheckIfExists(ByVal pEmail As String, _
  ByVal pLastName As String) As Integer

 Dim cmSql As SqlCommand
 Dim parmSql As SqlParameter

 ' Check the incoming parameters
 If pEmail.Length = 0 Or pLastName.Length = 0 Then
  Throw New StudentsException("Must supply email address and last name")
 End If

 cmSql = New SqlCommand("usp_StudentExists", mcnSql)

 ' Create the parms and extract the data
 With cmSql
  .CommandType = CommandType.StoredProcedure
  .Parameters.Add(New SqlParameter("@LName",SqlDbType.VarChar))
  .Parameters("@LName").Value = pLastName
  .Parameters.Add(New SqlParameter("@Email",SqlDbType.VarChar))
  .Parameters("@Email").Value = pEmail
  .Parameters.Add(New SqlParameter("@StudentID",SqlDbType.Int))
  .Parameters("@StudentID").Direction = ParameterDirection.Output
 End With

 ' Call the proc
 cmSql.ExecuteNonQuery()

 ' Check for Null before returning
 If cmSql.Parameters("@StudentID").Value Is DBNull.Value Then
  Return 0
 Else
  Return CType(cmSql.Parameters("@StudentID").Value, Integer)
 End If

End Function

One other interesting note found in Listing 7.16 is that the value returned by the output parameter may be NULL if the student did not exist. Null values can be checked for by comparing them to DBNull.Value, a singleton class in the System namespace. Note that comparing a NULL value returned from the database to Nothing will always return False and therefore lead to errors.

Related to the use of output parameters is the support for the ExecuteScalar method found in both the OleDbCommand and SqlCommand classes. This method returns only the first row and first column of a resultset and is useful in situations where an existing stored procedure must be used. This technique does not save any resources on the server side but is more efficient than using the ExecuteReader method and then manipulating the resulting SqlDataReader.

As an example of using this method, the ClassesTaken method of the Students class calls the stored procedure usp_GetClasses that performs an aggregate query using the COUNT() function.

CREATE PROC usp_GetClasses
@StudentID int
AS
SELECT COUNT(*)
FROM Enrollment
WHERE StudentID = @StudentID

This value is then returned using the ExecuteScalar method as shown in Listing 7.17.

Listing 7.17  Returning a scalar value. This procedure returns the first column from the first row of the resultset using ExecuteScalar.

Public Function ClassesTaken(ByVal pStudentID As Integer) As Integer
' Get the number of classes taken by this student

 Dim cmSql As SqlCommand
 Dim parmSql As SqlParameter

 ' Check the incoming parameter
 If pStudentID <= 0 Then
  Throw New StudentsException("StudentID must be greater than 0")
 End If

 cmSql = New SqlCommand("usp_GetClasses", mcnSql)

 ' Create the parms and extract the data
 With cmSql
  .CommandType = CommandType.StoredProcedure
  .Parameters.Add(New SqlParameter("@StudentID",SqlDbType.Int))
  .Parameters("@StudentID").Value = pStudentID
 End With

 ' Call the proc
 ClassesTaken = CType(cmSql.ExecuteScalar, Integer)

End Function