
Learning Something New

Okay, so let's get back to Andy Clark and his theories that we humans, as dedicated tool users, are cyborgs pretty much from the time we hatch. If each new tool becomes an extension of our minds as soon as we grok it, you might think the fields of education and training would be full of all sorts of exciting haptic applications, ready to shove our little brain waves out in all directions. In many cases, this is true.

Medical training, for example, has embraced haptic thingamajigs with enthusiasm. A suite of medical products from Immersion allows health care providers to learn in various simulated environments. These products guide practitioners in techniques from intravenous therapy to surgery on virtual human tissues that resist cutting much as real tissue does.

One recent innovation, developed jointly by the Institut National des Sciences Appliquées and the Hôpital de la Croix Rousse (both in Lyon, France), allows students to deliver a virtual baby. BirthSIM uses a computer-controlled pneumatic drive to simulate the contractions that occur during the baby's journey through the mother's pelvis. Haptic feedback sensors register the pressure students exert on the baby's head, both to train them in proper technique and to record the readings for later evaluation and correction.
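The record-and-review idea is easy to picture in code. The sketch below is purely illustrative, not taken from BirthSIM: the class name, the sensor interface, and the 40-newton safety threshold are all assumptions, standing in for whatever the real system uses.

```python
# Hypothetical sketch of a simulator logging force readings for later
# review. All names and the 40 N threshold are illustrative assumptions,
# not details of the actual BirthSIM system.
from dataclasses import dataclass, field
from typing import List

MAX_SAFE_FORCE_N = 40.0  # assumed safety threshold, in newtons


@dataclass
class DeliverySession:
    readings: List[float] = field(default_factory=list)

    def record(self, force_n: float) -> None:
        """Store one force reading from the head sensor."""
        self.readings.append(force_n)

    def over_threshold(self) -> List[float]:
        """Return the readings that exceeded the safe-force threshold."""
        return [f for f in self.readings if f > MAX_SAFE_FORCE_N]


session = DeliverySession()
for force in [12.5, 33.0, 47.2, 28.1]:  # simulated sensor samples
    session.record(force)
print(session.over_threshold())  # → [47.2]
```

An instructor reviewing the session afterward would see exactly which moments exceeded safe pressure, which is the "later evaluation and correction" the designers describe.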

Obviously, this simulator is not a mass-market item, and that's typical of haptic devices used in education. For that reason, don't expect to see haptic teaching aids in your local elementary school anytime soon. Although research promises some delightful devices, funding and demand haven't yet moved them into the general market.

A few promising technologies in the realm of virtual learning are being developed at the University of Washington Human Interface Technology (HIT) Lab and the associated HIT Lab in New Zealand. These projects use a HIT Lab technology called ARToolkit that lets users looking through a display see an augmented reality (AR) object—a virtual component perched on a real card that can be held in the hand. One example is the Virtual Calakmul project, a simulation of the remains and artifacts of the important ancient Mayan city discovered in 1931. Another, the MagiPlanet display, allows users to see 3D planets poised above each of a set of planet cards. When these cards are arranged in the proper order to reflect their orbits around the sun, the "planets" begin to orbit.
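The MagiPlanet trigger boils down to a simple check: once the detected card sequence matches the correct solar order, the animation starts. The sketch below is a guess at that logic, not MagiPlanet's actual code; the function names and the eight-planet list are assumptions.

```python
# Hypothetical sketch of MagiPlanet's ordering check: the orbit
# animation begins only when the cards match solar-system order.
# Names and the eight-planet list are illustrative assumptions.
PLANET_ORDER = ["Mercury", "Venus", "Earth", "Mars",
                "Jupiter", "Saturn", "Uranus", "Neptune"]


def cards_in_order(detected: list) -> bool:
    """True once every card is placed and the sequence matches PLANET_ORDER."""
    return detected == PLANET_ORDER


# Two swapped cards: no orbiting yet.
print(cards_in_order(["Venus", "Mercury", "Earth", "Mars",
                      "Jupiter", "Saturn", "Uranus", "Neptune"]))  # → False
# Correct arrangement: start the orbit animation.
print(cards_in_order(list(PLANET_ORDER)))  # → True
```

In the real display, `detected` would come from ARToolkit's marker tracking rather than a hand-typed list, but the decision itself can be this small.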

"While this approach does not provide force feedback, it has a built-in haptic aspect," says Suzanne Weghorst, Assistant Director of Research at the UW HIT Lab. These projects allow "manipulation of ARToolkit markers with virtual objects attached, so there is a kinesthetic/proprioceptive component."
