
Introduction to Haptics: How Devices Can Emulate Touch


Haptics refers to the modality of touch and associated sensory feedback. Researchers working in the area are concerned with the development, testing, and refinement of tactile and force feedback devices and supporting software that permit users to sense ("feel") and manipulate three-dimensional virtual objects with respect to such features as shape, weight, surface textures, and temperature. In addition to basic psychophysical research on human haptics, and issues in machine haptics such as collision detection, force feedback, and haptic data compression, work is being done in application areas such as surgical simulation, medical training, scientific visualization, and assistive technology for the blind and visually impaired.

How can a device emulate the sense of touch? Let us consider one of the devices from SensAble Technologies. The 3 DOF (degrees-of-freedom) PHANToM is a small robot arm with three revolute joints, each connected to a computer-controlled electric DC motor. The tip of the device is attached to a stylus that is held by the user. By sending appropriate voltages to the motors, it is possible to exert up to 1.5 pounds of force at the tip of the stylus, in any direction.
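The mapping from a desired stylus-tip force to motor commands follows from robot statics: for a desired tip force F, the joint torques are given by the Jacobian transpose, τ = Jᵀ(q) F, where J depends on the arm's current joint configuration q. A minimal sketch, with an assumed numeric Jacobian standing in for the PHANToM's actual kinematics:

```python
import numpy as np

# Hypothetical 3x3 Jacobian J(q) of a 3-revolute-joint arm at one pose;
# column i tells how joint i's velocity moves the stylus tip (m/rad).
J = np.array([
    [0.10, 0.00, 0.05],
    [0.00, 0.12, 0.00],
    [0.02, 0.00, 0.08],
])

# Desired reaction force at the stylus tip (newtons, x/y/z).
F_tip = np.array([0.0, 2.0, 0.0])

# Statics: the joint torques that realize F_tip are tau = J^T F.
tau = J.T @ F_tip   # N*m, one command per motor
```

The device's amplifiers then convert each torque into a motor voltage, which is why "sending appropriate voltages" suffices to produce an arbitrary force direction at the tip.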

The basic principle behind haptic rendering is simple: Every millisecond or so, the computer that controls the PHANToM reads the joint encoders to determine the precise position of the stylus. It then compares this position to those of the virtual objects the user is trying to touch. If the user is away from all the virtual objects, a zero voltage is sent to the motors and the user is free to move the stylus (as if exploring empty space). However, if the system detects a collision between the stylus and one of the virtual objects, it drives the motors so as to exert on the user's hand (through the stylus) a force along the exterior normal to the surface being penetrated. In practice, the user is prevented from penetrating the virtual object just as if the stylus collided with a real object that transmits a reaction to the user's hand. Other haptic devices—such as Immersion Corporation's CyberGrasp—operate on the same principle but use different mechanical actuation systems to generate forces.
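The servo cycle described above can be sketched as a penalty-based force law: on each cycle, if the stylus tip has penetrated the surface, push back along the exterior normal with a force proportional to penetration depth. The sphere geometry and stiffness value below are illustrative assumptions, not any particular device's API:

```python
import numpy as np

# Virtual object: a sphere (center and radius in meters). The geometry
# and the stiffness constant are assumed values for illustration.
SPHERE_CENTER = np.array([0.0, 0.0, 0.0])
SPHERE_RADIUS = 0.05
STIFFNESS = 800.0   # N/m, a virtual-wall spring constant

def render_force(stylus_pos):
    """One ~1 ms servo cycle: stylus position in, motor force out."""
    offset = stylus_pos - SPHERE_CENTER
    dist = np.linalg.norm(offset)
    if dist >= SPHERE_RADIUS:
        return np.zeros(3)               # free space: zero force
    normal = offset / dist               # exterior surface normal
    penetration = SPHERE_RADIUS - dist
    return STIFFNESS * penetration * normal  # spring pushes the tip out

force = render_force(np.array([0.04, 0.0, 0.0]))  # tip 1 cm inside
```

Because the loop runs at roughly 1 kHz, even this simple spring law feels like a stiff, solid surface; lower update rates make the same wall feel soft or unstable.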

Although the basic principles behind haptics are simple, there are significant technical challenges, such as the construction of the physical devices (cf. Chapter 4), real-time collision detection (cf. Chapters 2 and 5), simulation of complex mechanical systems for precise computation of the reaction forces (cf. Chapter 2), and force control (cf. Chapters 3 and 5). Below we provide an overview of haptics research; we consider haptic devices, applications, haptic rendering, and human factors issues.

Haptic Devices

Researchers have been interested in the potential of force feedback devices such as stylus-based masters like SensAble's PHANToM (Salisbury, Brock, Massie, Swarup, & Zilles, 1995; Salisbury & Massie, 1994) as alternative or supplemental input devices to the mouse, keyboard, or joystick. As discussed above, the PHANToM is a small, desk-grounded robot that permits simulation of single fingertip contact with virtual objects through a thimble or stylus. It tracks the x, y, and z Cartesian coordinates and pitch, roll, and yaw of the virtual probe as it moves about a three-dimensional workspace, and its actuators communicate forces back to the user's fingertips as it detects collisions with virtual objects, simulating the sense of touch. The CyberGrasp, from Immersion Corporation, is an exoskeletal device that fits over a 22 DOF CyberGlove, providing force feedback. The CyberGrasp is used in conjunction with a position tracker to measure the position and orientation of the forearm in three-dimensional space. (A newly released model of the CyberGrasp is self-contained and does not require an external tracker.) Similar to the CyberGrasp is the Rutgers Master II (Burdea, 1996; Gomez, 1998; Langrana, Burdea, Ladeiji, & Dinsmore, 1997), which has an actuator platform mounted on the palm that gives force feedback to four fingers. Position tracking is done by the Polhemus Fastrak.

Alternative approaches to haptic sensing have employed vibrotactile display, which applies multiple small force vectors to the fingertip. For example, Ikei, Wakamatsu, and Fukuda (1997) used photographs of objects and a contact pin array to transmit tactile sensations of the surface of objects. Each pin in the array vibrates commensurate with the local intensity (brightness) of the surface area, with image intensity roughly correlated with the height of texture protrusions. There is currently a joint effort underway at MIT and Carnegie Mellon (Srinivasan, 2001) to explore the incorporation of microelectromechanical systems (MEMS) actuator arrays into haptic devices and wearables. Researchers at the University of Wisconsin are experimenting with tactile strips containing an array of sensors that can be attached to various kinds of force-feedback devices (Tyler, 2001). Howe (1994) notes that vibrations are particularly helpful in certain kinds of sensing tasks, such as assessing surface roughness or detecting system events (for example, contact and slip in manipulation control).
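The brightness-to-vibration mapping that Ikei, Wakamatsu, and Fukuda describe can be sketched in a few lines; the patch size and the linear 0–255 scaling here are assumptions for illustration:

```python
# Each pin vibrates in proportion to local image brightness, so bright
# regions read as raised texture. The 2x2 patch and linear scaling are
# illustrative assumptions, not the authors' exact hardware mapping.
def pin_amplitudes(patch, max_amp=1.0):
    """Map a grayscale patch (0-255 per pixel) to per-pin amplitudes."""
    return [[max_amp * px / 255.0 for px in row] for row in patch]

patch = [[0, 128],     # dark pixels -> little or no vibration
         [255, 64]]    # a bright pixel -> full-amplitude vibration
amps = pin_amplitudes(patch)
```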

Researchers at the Fraunhofer Institute for Computer Graphics in Darmstadt have developed a glove-like device they call the ThermoPad, a haptic temperature display based on Peltier elements and simple heat transfer models. They are able to simulate not only the "environmental" temperature but also the sensation of heat or cold one experiences when grasping or colliding with an object. At the University of Tsukuba in Japan, Iwata, Yano, and Hashimoto (1997) are using the HapticMaster, a 6 DOF device with a ball grip that can be replaced by various real tools for surgical simulations and other specialized applications. A novel type of haptic display is the Haptic Screen (Iwata, Yano, and Hashimoto, 1997), a device with a rubberized elastic surface with actuators, each with force sensors, underneath. The surface of the Haptic Screen can be deformed with the naked hand. An electromagnetic interface couples the ISU Force Reflecting Exoskeleton (Luecke & Chai, 1997), developed at Iowa State University, to the operator's two fingers, eliminating the burdensome heaviness usually associated with exoskeletal devices. Researchers at the University of California, Berkeley (Chapter 4, this volume) developed a high performance 3 DOF hand-scale haptic interface. Their device exhibits high stiffness due to a 10-link design with three closed kinematic loops and a direct-drive mechanical system that avoids transmission elements.
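A simple heat-transfer model of the kind behind a Peltier-based display like the ThermoPad can be sketched with Newton's law of cooling: on contact, skin temperature relaxes exponentially toward the object's temperature, and the Peltier element is driven to reproduce that trajectory. The rate constant and time step below are assumed values, not measured skin or material parameters:

```python
def contact_temperature(t_skin, t_object, k=0.5, dt=0.01, steps=100):
    """Euler-integrate Newton's law of cooling: dT/dt = -k (T - T_obj).

    k (1/s), dt (s), and the step count are illustrative assumptions;
    a real display would fit them to the simulated material.
    """
    for _ in range(steps):
        t_skin += -k * (t_skin - t_object) * dt
    return t_skin

# Warm fingertip (33 C) grasping cold metal (10 C) for one second:
t_after = contact_temperature(33.0, 10.0)
```

Varying k is what lets such a display distinguish materials: metal (large k) drains heat quickly and feels cold, while wood (small k) barely changes the skin temperature.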

Finally, there is considerable interest in 2D haptic devices. For example, Pai and Reissell at the University of British Columbia have used the Pantograph 2D haptic interface, a 2 DOF planar force-feedback device with a handle the user moves like a mouse, to feel the edges of shapes in images (Pai & Reissell, 1997). A new 2D haptic mouse from Immersion Corporation is based on optical technology and is free-moving rather than tethered to a base. Other 2D devices, like the Moose, Sidewinder, and the Wingman mouse, are described below in the section on "Assistive Technology for the Blind and Visually Impaired."
