5.3 Properties

Each DataMapper model is able to persist its data. The kind of data it is able to store is defined through its properties. If you’re using a typical database, these properties correlate with the columns of the model’s corresponding table. Below is an example of a DataMapper model with three properties.

class TastyAnimal
  include DataMapper::Resource

  property :id, Serial
  property :name, String
  property :endangered, TrueClass
end


In many ways, you can think of properties as persistent accessors. In fact, taking a look into the source of the property method (found in the Model module we spoke about earlier), we find that a dynamic getter and setter are created using class_eval:

def property(name, type, options = {})
  property = Property.new(self, name, type, options)


  # ...

# ...

# defines the getter for the property
def create_property_getter(property)
  class_eval <<-EOS, __FILE__, __LINE__
    def #{property.getter}

  # ...


# defines the setter for the property
def create_property_setter(property)
  unless instance_methods.include?("#{property.name}=")
    class_eval <<-EOS, __FILE__, __LINE__
      def #{property.name}=(value)
        attribute_set(#{property.name.inspect}, value)

The most important thing to learn from the source shown above is that properties dynamically create getter and setter methods. Additionally, these methods can end up protected or private through the visibility options we’ll see later. Finally, the getters and setters produced are not exactly equivalent to attr_reader and attr_writer, because internally they use the methods attribute_get and attribute_set.
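In practice, the generated methods behave roughly like the hand-written pair below (a simplified sketch for a hypothetical name property, ignoring visibility handling):

def name
  attribute_get(:name)           # reads through the Property object
end

def name=(value)
  attribute_set(:name, value)    # writes through the Property, so dirtiness is tracked
end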

Going back to the Resource source, we find these two methods, which manipulate the values of model properties (the properties themselves are, once again, located in Model). You’ll have to excuse this volleying back and forth, but the point of the Resource and Model modules is to separate methods that act on an individual resource from those related to the model as a whole.

# @api semipublic
def attribute_get(name)
  properties[name].get(self)

# @api semipublic
def attribute_set(name, value)
  properties[name].set(self, value)


def properties

You may have noticed the @api semipublic comment above the getter and setter methods. This is because application developers should not ordinarily need to use these methods. Plugin developers, on the other hand, may need to use them as the easiest way to get and set properties while making sure they are persisted.
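To make that concrete, here is the sort of helper a plugin might define (the method and its name are hypothetical). Writing through attribute_set keeps the change visible to DataMapper’s dirty tracking, so it will be picked up on save:

# hypothetical plugin helper: assign a value to an arbitrary property by name
def write_tracked_attribute(resource, name, value)
  resource.attribute_set(name, value)   # goes through the Property object
  resource.save
end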

For application developers, however, this does bring up one important point: do not use instance variables to set property values. While doing so will set the object’s value, it bypasses the model code that tracks whether a property is dirty; in other words, the value may not persist upon a later save. Instead, use the property’s setter method (note the explicit self in the example below; without it, Ruby would create a local variable rather than call the setter). Below you’ll find an example with comments that should get the point across.

class Fruit
  include DataMapper::Resource

  property :id, Serial
  property :name, String
  property :eaten, TrueClass

  def eat
    unless eaten?
      # will not persist upon save
      @eaten = true

      # will persist upon save
      self.eaten = true
    end
  end
end


Before we describe the extended use of properties, let’s take a look at the database side to understand how persistence works.

5.3.1 Database storage

In order to persist the data of model objects, we need to set up our database to store that data. The default generated configuration files use a SQLite3 database file called sample_development.db. This setup is perfect for most development scenarios given how quickly it gets you up and running. With that in mind, we’d say stick with it whenever possible, leaving the alteration of config/database.yml for production or staging environments.

Automigrating the DB schema

Databases typically need to be prepped for the data they will store during application development. The process by which DataMapper does this is called automigration, because DataMapper uses the properties listed in your models to automatically create your database schema for you. Using the provided Merb DataMapper rake task, we can automigrate the model that we created earlier and then take a peek inside the database to see what was done:

$ rake db:automigrate
$ sqlite3 sample_development.db
sqlite> .tables
tasty_animals
sqlite> .schema
CREATE TABLE "tasty_animals" ("id" INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
  "name" VARCHAR(50), "endangered" BOOLEAN);

As you can see, a table with a pluralized and snake-cased name was created for our model, TastyAnimal. Remembering the various properties of the model class, we can also spot corresponding columns inside the schema’s CREATE statement. Note that while Ruby classes were used on the property lines, standard SQL types appear in the database.

The code behind automigration is definitely worth studying, so let’s take a look at the module AutoMigrations, which includes itself within the Model module:

module DataMapper
  module AutoMigrations
    def auto_migrate!(repository_name = nil)
      # ...
    end

    # @api private
    def auto_migrate_down!(repository_name = nil)
      # repository_name ||= default_repository_name
      repository(repository_name) do |r|
        r.adapter.destroy_model_storage(r, self)
      end
    end

    # @api private
    def auto_migrate_up!(repository_name = nil)
      repository(repository_name) do |r|
        r.adapter.create_model_storage(r, self)
      end
    end

    def auto_upgrade!(repository_name = nil)
      repository(repository_name) do |r|
        r.adapter.upgrade_model_storage(r, self)
      end
    end

    Model.send(:include, self)

  end # module AutoMigrations
end # module DataMapper

As you can see, there are two public API class methods you can use with models, auto_migrate! and auto_upgrade!. Between them, these effectively call the three adapter methods destroy_model_storage, create_model_storage, and upgrade_model_storage. Let’s go deeper into the source and see how these three methods do the heavy lifting:

class DataMapper::Adapters::AbstractAdapter
  module Migration

    def upgrade_model_storage(repository, model)
      table_name = model.storage_name(repository.name)

      if success = create_model_storage(repository, model)
        return model.properties(repository.name)
      end

      properties = []

      model.properties(repository.name).each do |property|
        schema_hash = property_schema_hash(repository, property)

        next if field_exists?(table_name, schema_hash[:name])

        statement = alter_table_add_column_statement(
          table_name, schema_hash)
        execute(statement)

        properties << property
      end

      properties
    end

    def create_model_storage(repository, model)
      return false if storage_exists?(model.storage_name(repository.name))

      execute(create_table_statement(repository, model))
      # ... create indexes

      true
    end

    def destroy_model_storage(repository, model)
      execute(drop_table_statement(repository, model))
    end


The simplest of these, destroy_model_storage, simply executes a drop table statement. The create_model_storage method, on the other hand, first checks whether the model’s storage already exists, returning false if it does; otherwise it executes a create table statement, returning true once the storage has been created. Finally, upgrade_model_storage is the most complicated of the three. It first attempts to create the storage (effectively testing whether it exists) and then attempts to add new columns for any new properties. This leaves existing data in place and is perfect if you have simply added properties to a model. Lest this appear to be no more than hand waving, let’s dig even deeper into the methods that the AbstractAdapter uses to create the SQL for these statements:

class DataMapper::Adapters::AbstractAdapter

  # immediately following the previous code

  module SQL

    def alter_table_add_column_statement(table_name, schema_hash)
      "ALTER TABLE #{quote_table_name(table_name)} " +
      "ADD COLUMN #{property_schema_statement(schema_hash)}"
    end

    def create_table_statement(repository, model)
      repository_name = repository.name

      statement = <<-EOS.compress_lines
        CREATE TABLE #{quote_table_name(model.storage_name(repository_name))}
        (#{model.properties_with_subclasses(
          repository_name).map { |p|
          property_schema_statement(
            property_schema_hash(repository, p))
          } * ', '}
      EOS

      if (key = model.key(repository_name)).any?
        statement << ", PRIMARY KEY(#{ key.map { |p|
          quote_column_name(p.field(repository_name))
        } * ', '})"
      end

      statement << ')'
    end

    def drop_table_statement(repository, model)
      "DROP TABLE #{quote_table_name(model.storage_name(repository.name))}"
    end

    def property_schema_hash(repository, property)
      schema = self.class.type_map[property.type].
        merge(:name => property.field(repository.name))

      if property.primitive == String &&
         schema[:primitive] != 'TEXT'
        schema[:size] = property.length
      elsif property.primitive == BigDecimal ||
            property.primitive == Float
        schema[:precision] = property.precision
        schema[:scale]     = property.scale
      end

      schema[:nullable?] = property.nullable?
      schema[:serial?]   = property.serial?

      if property.default.nil? || property.default.respond_to?(:call)
        schema.delete(:default) unless property.nullable?
      else
        if property.type.respond_to?(:dump)
          schema[:default] = property.type.dump(
            property.default, property)
        else
          schema[:default] = property.default
        end
      end

      schema
    end

    def property_schema_statement(schema)
      statement = quote_column_name(schema[:name])
      statement << " #{schema[:primitive]}"

      if schema[:precision] && schema[:scale]
        statement << "(#{[ :precision, :scale ].map {
          |k| quote_column_value(schema[k])
        } * ','})"
      elsif schema[:size]
        statement << "(#{quote_column_value(schema[:size])})"
      end

      statement << ' NOT NULL' unless schema[:nullable?]
      statement << " DEFAULT " +
        quote_column_value(schema[:default]) if schema.key?(:default)
      statement
    end
  end # module SQL

  include SQL
end


The first thing you may notice is that these methods are defined within a module called SQL and that the module is included immediately after it is closed. The reason is that within DataMapper adapters, code is often organized by use, and bundling helper methods into a module this way makes it easy to alternate between regions of public and private methods.

Now, turning to the actual methods, we can see that some of them, such as drop_table_statement, are just a line of simple SQL. Likewise, alter_table_add_column_statement is a single line that outputs an ALTER TABLE ... ADD COLUMN statement. The create_table_statement method, however, is far more complex, relying on various other methods to get its work done. One of these, properties_with_subclasses, pulls up all model properties, including those that are simply keys used in relationships. We’ll go further into properties_with_subclasses later on when we examine model relationships, but for now let’s take a look at the method property_schema_statement, which quotes the property as a column name and then appends its type. It also adds the appropriate SQL for decimals, non-nullable columns, and default values.
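To see the result of all this assembly, you can ask an adapter for the statement it would generate for a model. The snippet below is purely illustrative: it reaches into the statement method with send, and the exact SQL varies by adapter.

repo    = DataMapper.repository(:default)
adapter = repo.adapter

# roughly what the AbstractAdapter's SQL helpers produce for Fruit
puts adapter.send(:create_table_statement, repo, Fruit)
# => CREATE TABLE "fruits" ("id" INTEGER NOT NULL, "name" VARCHAR(50),
#    "eaten" BOOLEAN, PRIMARY KEY("id"))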

We hope this has brought you deep enough into the inner workings of automigration to both appreciate its design and get a feel for how adapter code handles the production of SQL more generally. But it would also be nice to be able to use some of it practically, and thankfully you can do so. For instance, if you’re in mid-development, you may fire up interactive Merb and use auto_upgrade! on a model to which you’ve added properties:

> Fruit.auto_upgrade!

Likewise, you may want to refresh the data of a model using auto_migrate! in the middle of a test file. Here’s an example we’ve spotted in the wild:

before :each do
  Fruit.auto_migrate!
end

5.3.2 Defining properties

Let’s now take a more rigorous look at properties, as well as the options we have when defining them. As we’ve seen, each property is defined on its own line using the property method. This class method is mixed in via the inclusion of DataMapper::Resource. It takes a minimum of two arguments: a symbol that names the property and a class that defines what type of data is to be stored. As we will see shortly, an optional hash of arguments may also be passed in.
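For example, in a hypothetical model (shown only to illustrate the argument positions):

class Article
  include DataMapper::Resource

  # name (a Symbol), type (a Class), then an optional options hash
  property :id,    Serial
  property :title, String, :length => 100
  property :body,  Text,   :lazy => true
end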

Property types

While abstracting away the differences across database column types, DataMapper stays as close as possible to plain Ruby classes for describing property types. Below is a list of the various classes supported by the DataMapper core. Note that including DataMapper::Resource makes these types available inside your model class, so when defining properties you will not have to use the DM:: module prefix on the types listed with it below.

  • Class—stores a Ruby Class name as a string. Intended for use with inheritance, primarily through the property type DM::Discriminator.
  • String—stores a Ruby String. Default maximum length is 50 characters. Length can be defined by the optional hash key :length.
  • Integer—stores a Ruby Integer. Length can be defined by the optional hash key :length.
  • BigDecimal—stores a Ruby BigDecimal, intended for numbers where decimal exactitude is necessary. Can use the option hash keys :precision and :scale.
  • Float—stores a Ruby Float. Primarily intended for numbers where decimal exactitude is not critical. Can use the two options hash keys :precision and :scale.
  • Date—stores a Ruby Date.
  • DateTime—stores a Ruby DateTime.
  • Time—stores a Ruby Time.
  • Object—allows for the marshaling of a full object into a record. It is serialized into text upon storage and when retrieved is available as the original object.
  • TrueClass—a Boolean that works with any of the values in the array [0, 1, 't', 'f', true, false]. In MySQL it translates down to a tinyint, in PostgreSQL a bool, and in SQLite a boolean.
  • DM::Boolean—an alias of TrueClass. This is around for legacy DataMapper support, simply to provide a more commonly recognized name for the type.
  • Discriminator—stores the model class name as a string. Used for single-table inheritance.
  • DM::Serial—used on the serial ID of a model. Serial IDs are auto-incremented integers that uniquely apply to single records. Alternatively, a property can use the Integer class and set :serial to true. You will nearly always see this type applied to the id property.
  • DM::Text—stores larger textual data and is notably lazy-loaded by default.

You may be interested in knowing how the casting in and out of property values works. For the primitive types, values coming out of the database are cast using the method Property#typecast. Below we can see how this method coerces raw values into the Ruby objects we want.

def typecast(value)
  return type.typecast(value, self) if type.respond_to?(:typecast)
  return value if value.kind_of?(primitive) || value.nil?

  if    primitive == TrueClass
    %w[ true 1 t ].include?(value.to_s.downcase)
  elsif primitive == String
    value.to_s
  elsif primitive == Float
    value.to_f
  elsif primitive == Integer
    # "junk".to_i returns 0, so only return 0 when the string
    # really does represent zero; otherwise return nil
    value_to_i = value.to_i
    if value_to_i == 0
      value.to_s =~ /^(0x|0b)?0+/ ? 0 : nil
    else
      value_to_i
    end
  elsif primitive == BigDecimal
    BigDecimal(value.to_s)
  elsif primitive == DateTime
    # ...
  elsif primitive == Date
    # ...
  elsif primitive == Time
    # ...
  elsif primitive == Class
    # ...
  end
end

Custom types, however, are handled by subclasses of an abstract type class called DataMapper::Type. These load and dump data in whatever way they are programmed to do. We’ll see custom types later on when we examine some DataMapper-type plugins, but for now let’s take a look at one of the custom types from the DataMapper core, Serial:

module DataMapper
  module Types
    class Serial < DataMapper::Type
      primitive Integer
      serial true
    end # class Serial
  end # module Types
end # module DataMapper

Note its use of the methods primitive and serial, which are defined in the class DataMapper::Type:

class DataMapper::Type
  PROPERTY_OPTIONS = [
    :accessor, :reader, :writer,
    :lazy, :default, :nullable, :key, :serial, :field,
    :size, :length, :format, :index, :unique_index,
    :check, :ordinal, :auto_validation, :validates,
    :unique, :track, :precision, :scale
  ]

  # ...

  class << self

    PROPERTY_OPTIONS.each do |property_option|
      self.class_eval <<-EOS, __FILE__, __LINE__
        def #{property_option}(arg = nil)
          return @#{property_option} if arg.nil?

          @#{property_option} = arg
        end
      EOS
    end

    def primitive(primitive = nil)
      return @primitive if primitive.nil?
      @primitive = primitive
    end

    # ...


From this we can see that the primitive method sets the Ruby type to which the property value is dumped. The serial method, on the other hand, is an example of a property option, which we’re about to address.
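As a preview, here is a minimal sketch of what a custom type might look like, assuming the load/dump class-method convention of DataMapper::Type subclasses; the Slug type and its behavior are hypothetical:

module DataMapper
  module Types
    class Slug < DataMapper::Type
      primitive String

      # called when a value is read back out of the repository
      def self.load(value, property)
        value
      end

      # called when a value is written to the repository
      def self.dump(value, property)
        value.nil? ? nil : value.to_s.downcase.gsub(/[^a-z0-9]+/, '-')
      end
    end # class Slug
  end # module Types
end # module DataMapper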

Option hash

The third argument that the property method can take is an options hash, which affects various behavioral aspects of the property. For instance, below we’ve specified that a property should default to some value.

class Website
  include DataMapper::Resource

  property :id, Serial
  property :domain, String
  property :color_scheme, String, :default => 'blue'
end

Here’s a list of the various property options and their uses; a combined example follows the list:

  • :accessor—takes the value :private, :protected, or :public. Sets the access privileges of the property as both a reader and a writer. Defaults to :public.
  • :reader—takes the value :private, :protected, or :public. Sets the access privileges of the property as a reader. Defaults to :public.
  • :writer—takes the value :private, :protected, or :public. Sets the access privileges of the property as a writer. Defaults to :public.
  • :lazy—determines whether the property should be lazy-loaded or not. Lazy-loaded properties are not read from the repository unless they are used. Defaults to false on most properties, but is notably true on DM::Text.
  • :default—sets the default value of the property. Can take any value appropriate for the type.
  • :nullable—if set to false, it disallows a null value for the property. When dm-validations is used, a null value then invalidates the model.
  • :key—defines a property as the table key. This allows for natural keys in place of a serial ID. This key can be used as the index on the model class in order to access the record.
  • :serial—sets the property to be auto-incremented as well as to serve as the table key.
  • :field—manually overrides the field name. Best used for legacy repositories.
  • :size—sets the size of the property type.
  • :length—alias of :size.
  • :format—used with the String property type. When dm-validations is used, it can set a regular expression (or a named format such as :email_address) against which strings must validate.
  • :index—sets the property to be indexed for faster retrieval. If set to a symbol instead of to true, it can be used to create multicolumn indexes.
  • :unique_index—defines a unique index for the property. When used with dm-validations, new records with nonunique property values are marked invalid. If set to a symbol instead of true, it can be used to create multicolumn indexes.
  • :auto_validation—when used with dm-validations, can be used to turn off auto-validations by setting the value to false.
  • :track—determines when a property should be tracked for dirtiness. Takes the values :get, :set, :load, and :hash.
  • :precision—sets the total number of significant digits stored for BigDecimal and Float type properties.
  • :scale—sets the number of digits after the decimal point for BigDecimal and Float type properties.
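To tie a few of these options together, here is a hypothetical model that uses several of them at once (the validation behavior noted in the comments assumes dm-validations is loaded):

class Account
  include DataMapper::Resource

  property :id,        Serial

  # NOT NULL column with a unique index; with dm-validations this also
  # validates presence and uniqueness
  property :email,     String,     :nullable => false, :unique_index => true

  # private writer and a wider column than the 50-character default
  property :api_token, String,     :length => 40, :writer => :private

  # exact decimal: 10 significant digits, 2 of them after the decimal point
  property :balance,   BigDecimal, :precision => 10, :scale => 2,
                                   :default => 0
end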