Representative Applications of Haptics

Surgical Simulation and Medical Training

A primary application area for haptics has been in surgical simulation and medical training. Langrana, Burdea, Ladeji, and Dinsmore (1997) used the Rutgers Master II haptic device in a training simulation for palpation of subsurface liver tumors. They modeled tumors as comparatively harder spheres within larger and softer spheres. Realistic reaction forces were returned to the user as the virtual hand encountered the "tumors," and the graphical display showed corresponding tissue deformation produced by the palpation. Finite element analysis was used to calculate the reaction forces corresponding to deformation from experimentally obtained force/deflection curves. Researchers at the Universidade Catolica de Brasilia-Brasil (D'Aulignac & Balaniuk, 1999) have produced a physical simulation system providing graphic and haptic interfaces for an echographic examination of the human thigh, using a spring-damper model defined from experimental data. Machado, Moraes, and Zuffo (2000) have used haptics in an immersive simulator of bone marrow harvest for transplant. Andrew Mor of the Robotics Institute at Carnegie Mellon (Mor, 1998) employed the PHANToM in conjunction with a 2 DOF planar device in an arthroscopic surgery simulation. The new device generates a moment measured about the tip of a surgical tool, thus providing more realistic training for the kinds of unintentional contacts with ligaments and fibrous membranes that an inexperienced resident might encounter. At Stanford, Balaniuk and Costa (2000) have developed a method to simulate fluid-filled objects suitable for interactive deformation by "cutting," "suturing," and so on. At MIT, De and Srinivasan (1998) have developed models and algorithms for reducing the computational load required to generate visual rendering of organ motion and deformation and to communicate back to the user the forces resulting from tool-tissue contact. They model soft tissue as thin-walled membranes filled with fluid; the resulting force-displacement response is comparable to that obtained in in vivo experiments. At Berkeley, Sastry and his colleagues (Chapter 13, this volume) are engaged in a joint project with the surgery department of the University of California at San Francisco and the Endorobotics Corporation to build dexterous robots for use inside laparoscopic and endoscopic cannulas, as well as tactile sensing and teletactile display devices and masters for surgical teleoperation (2001). Aviles and Ranta of Novint Technologies have developed the Virtual Reality Dental Training System (Aviles & Ranta, 1999). They employ a PHANToM with four tips that mimic dental instruments; these can be used to explore simulated materials like hard tooth enamel or dentin. Giess, Evers, and Meinzer (1998) integrated haptic volume rendering with the PHANToM into the presurgical process of classifying liver parenchyma, vessel trees, and tumors. Surgeons at the Pennsylvania State University School of Medicine, in collaboration with Cambridge-based Boston Dynamics, used two PHANToMs in a training simulation in which residents passed simulated needles through blood vessels, allowing them to collect baseline data on the surgical skill of new trainees. Iwata, Yano, and Hashimoto (1998) report the development of a surgical simulator with a "free form tissue" that can be "cut" like real tissue. There are few accounts of systematic testing and evaluation of the simulators described above. Gruener (1998), in one of the few research reports with hard data, expresses reservations about the potential of haptics in medical applications; he found that subjects in a telementoring session did not profit from the addition of force feedback to remote ultrasound diagnosis.
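
Several of the simulators above drive their force feedback from a spring-damper tissue model, with tumors rendered as stiffer regions embedded in softer tissue. The Python sketch below illustrates the idea; the stiffness and damping constants are illustrative assumptions, not values from any of the cited systems.

```python
def palpation_force(depth, velocity, over_tumor=False,
                    k_tissue=300.0, k_tumor=1200.0, damping=5.0):
    """One-axis reaction force (N) for a probe indenting virtual tissue.

    The spring term resists indentation depth (m) and the damper term
    resists indentation velocity (m/s). A tumor is modeled, as in the
    palpation simulators described above, as a stiffer region embedded
    in softer surrounding tissue. All constants are illustrative.
    """
    if depth <= 0.0:              # probe not in contact
        return 0.0
    k = k_tumor if over_tumor else k_tissue
    return k * depth + damping * velocity

# Pressing 5 mm into quiescent tissue feels four times stiffer
# over the simulated "tumor" than over normal tissue.
soft = palpation_force(0.005, 0.0)                   # 1.5 N
hard = palpation_force(0.005, 0.0, over_tumor=True)  # 6.0 N
```

In a real simulator the constants would be fit to experimentally obtained force/deflection curves, as Langrana and colleagues did with finite element analysis.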

Museum Display

Although it is not yet commonplace, a few museums are exploring methods for 3D digitization of priceless artifacts and objects from their sculpture and decorative arts collections, making the images available via CD-ROM or in-house kiosks. For example, the Canadian Museum of Civilization collaborated with Ontario-based Hymarc to use the latter's ColorScan 3D laser camera to create three-dimensional models of objects from the museum's collection (Canarie, Inc., 1998; Shulman, 1998). A similar partnership was formed between the Smithsonian Institution and Synthonic Technologies, a Los Angeles-area company. At Florida State University, the Department of Classics has worked with a team to digitize Etruscan artifacts using the RealScan 3D imaging system from Real 3D (Orlando, Florida), and art historians from Temple University have collaborated with researchers from the visual and geometric computing group at IBM's Watson Research Center to create a model of Michelangelo's Pietà, using the Virtuoso shape camera from Visual Interface (Shulman, 1998).

Few museums have yet explored the potential of haptics to allow visitors access to three-dimensional museum objects such as sculpture, bronzes, or examples from the decorative arts. The "hands-off" policies that museums must impose limit appreciation of three-dimensional objects, where full comprehension and understanding rely on the sense of touch as well as vision. Haptic interfaces can allow fuller appreciation of three-dimensional objects without jeopardizing conservation standards, giving museums, research institutes, and other conservators of priceless objects a way to provide the public with a vehicle for object exploration in a modality that could not otherwise be permitted (McLaughlin, Goldberg, Ellison, & Lucas, 1999). At the University of Southern California, researchers at the Integrated Media Systems Center (IMSC) have digitized daguerreotype cases from the collection of the Seaver Center for Western Culture at the Natural History Museum of Los Angeles County and made them available at a PHANToM-equipped kiosk alongside an exhibition of the "real" objects (see Chapter 15, this volume). Bergamasco, Jansson, and colleagues (Jansson, 2001) are undertaking a "Museum of Pure Form"; their group will acquire selected sculptures from the collections of partner museums in a network of European cultural institutions to create a digital database of works of art for haptic exploration.

Haptics raises the prospect of offering museum visitors not only the opportunity to examine and manipulate digitized three-dimensional art objects visually, but also to interact remotely, in real time, with museum staff members in joint tactile exploration of works of art. A member of the curatorial staff and a student in a remote classroom could together examine an ancient pot or bronze figure, note its interesting contours and textures, and consider such questions as "What is the mark at the base of the pot?" or "Why does this side have such jagged edges?" (Hespanha, Sukhatme, McLaughlin, Akbarian, Garg, & Zhu, 2000; McLaughlin, Sukhatme, Hespanha, Shahabi, Ortega, & Medioni, 2000; Sukhatme, Hespanha, McLaughlin, Shahabi, & Ortega, 2000).

Painting, Sculpting, and CAD

There have been a few projects in which haptic displays are used as alternative input devices for painting, sculpting, and computer-assisted design (CAD). Dillon and colleagues (Dillon, Moody, Bartlett, Scully, Morgan, & James, 2000) are developing a "fabric language" to analyze the tactile properties of fabrics as an information resource for haptic fabric sensing. At CERTEC, the Center of Rehabilitation Engineering in Lund, Sweden, Sjöström (Sjöström, 1997) and his colleagues have created a painting application in which the PHANToM can be used by the visually impaired; line thickness varies with the user's force on the fingertip thimble, and colors are discriminated by their tactual profile. At Dartmouth, Henle and Donald (1999) developed an application in which animations are treated as palpable vector fields that can be edited by manipulation with the PHANToM. Marcy, Temkin, Gorman, and Krummel (1998) have developed the Tactile Max, a PHANToM plug-in for 3D Studio Max. Dynasculpt, a prototype from Interval Research Corporation (Snibbe, Anderson, & Verplank, 1998), permits sculpting in three dimensions by attaching a virtual mass to the PHANToM position and constructing a ribbon along the path of the mass through 3D space. Gutierrez, Barbero, Aizpitarte, Carrillo, and Eguidazu (1998) have integrated the PHANToM into DATum, a geometric modeler. Objects can be touched, moved, or grasped (with two PHANToMs), and the assembly/disassembly of mechanical objects can be simulated.
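
The Dynasculpt idea of tethering a virtual mass to the device position can be sketched in a few lines of simulation code. This is an illustrative reconstruction, not Interval's implementation; the mass, stiffness, and damping values are assumptions, and the mass positions recorded at each step would trace out the swept ribbon.

```python
def step_virtual_mass(mass_pos, mass_vel, stylus_pos, dt=0.001,
                      m=0.05, k=40.0, b=0.8):
    """Advance a virtual mass coupled to the stylus by a spring-damper.

    Semi-implicit Euler integration, one step per haptic frame. The
    mass lags behind and overshoots the stylus, which is what gives
    the swept ribbon its fluid character. Constants are illustrative.
    """
    new_pos, new_vel = [], []
    for x, v, s in zip(mass_pos, mass_vel, stylus_pos):
        accel = (k * (s - x) - b * v) / m   # spring pull minus damping
        v = v + accel * dt
        new_pos.append(x + v * dt)
        new_vel.append(v)
    return new_pos, new_vel
```

Appending each returned position to a list over successive frames yields the 3D path through which the ribbon is constructed.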

Visualization

Haptics has also been incorporated into scientific visualization. Durbeck, Macias, Weinstein, Johnson, and Hollerbach (1998) have interfaced SCIRun, a computational steering software system, to the PHANToM. Both haptic and graphic displays are directed by the movement of the PHANToM stylus through haptically rendered data volumes. Similar systems have been developed for geoscientific applications (e.g., the Haptic Workbench; Veldkamp, Turner, Gunn, & Stevenson, 1998). Green and Salisbury (1998) have produced a convincing soil simulation in which they have varied parameters such as soil properties, plow blade geometry, and angle of attack. Researchers at West Virginia University (Van Scoy, Baker, Gingold, Martino, & Burton, 1999) have applied haptics to mobility training. They designed an application in which a real city block and its buildings could be explored with the PHANToM, using models of the buildings created in CANOMA from digital photographs of the scene taken from the streets. At Interactive Simulations, a San Diego-based company, researchers have added a haptic feedback component to Sculpt, a program for analyzing chemical and biological molecular structures, which will permit analysis of molecular conformational flexibility and interactive docking. At the University of North Carolina, Chapel Hill (Chapter 5, this volume), 6 DOF PHANToMs have been used for haptic rendering of high-dimensional scientific datasets, including three-dimensional force fields and tetrahedralized human head volume datasets. We consider further applications of haptics to visualization below, in the section "Assistive Technology for the Blind and Visually Impaired."
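
Haptic rendering of a force-field dataset amounts to sampling the field at the stylus position each frame and sending the result to the device as a force. A minimal sketch, using a made-up analytic field (a vortex about the z-axis) in place of a real scientific dataset:

```python
def field_force(pos, scale=0.5):
    """Force felt while probing a 3D vector-field dataset.

    The stylus position is mapped to the local field value, so the
    user feels the direction and magnitude of the field at the probe.
    The vortex field and the scale factor are illustrative stand-ins
    for data sampled from a real dataset.
    """
    x, y, z = pos
    return (-scale * y, scale * x, 0.0)

# Probing at (0, 1, 0) pulls the stylus in the -x direction.
fx, fy, fz = field_force((0.0, 1.0, 0.0))
```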

Military Applications

Haptics has also been used in aerospace and military training and simulations. There are a number of circumstances in a military context in which haptics can provide a useful substitute information source; that is, circumstances in which touch can convey information that, for one reason or another, is not available, not reliably communicated, or not best apprehended through sound and vision. In some cases, combatants may have their view blocked or may not be able to divert attention from a display to attend to other information sources. Battlefield conditions, such as the presence of artillery fire or smoke, might make it difficult to hear or see. Conditions might necessitate that communications be inaudible (Transdimension, 2000). For certain applications, for example where terrain or texture information needs to be conveyed, haptics may be the most efficient communication channel. In circumstances like those described above, haptics is an alternative modality to sound and vision that can be exploited to provide low-bandwidth situation information, commands, and threat warnings (Transdimension, 2000). In other circumstances haptics could function as a supplemental information source to sound or vision. For example, users can be alerted haptically to interesting portions of a military simulation, learning quickly and intuitively about objects, their motions, what persons may interact with them, and so on.

At the Army's National Automotive Center, the SimTLC (Simulation Throughout the Life Cycle) program has used VR techniques to test military ground vehicles under simulated battlefield conditions. One of the applications has been a simulation of a distributed environment where workers at remote locations can collaborate in reconfiguring a single vehicle chassis with different weapons components, using instrumented force-feedback gloves to manipulate the three-dimensional components (National Automotive Center, 1999). The SIRE simulator (Synthesized Immersion Research Environment) at the Air Force Research Laboratory, Wright-Patterson Air Force Base, incorporated data gloves and tactile displays into its program of development and testing of crew station technologies (Wright-Patterson Air Force Base, 1997). Using tasks such as mechanical assembly, researchers at NASA-Ames have been conducting psychophysical studies of the effects of adding a 3 DOF force-feedback manipulandum to a visual display, noting that control and system dynamics have received ample research attention but that the human factors underlying successful haptic display in simulated environments remain to be identified (Ellis & Adelstein, n.d.). The Naval Aerospace Medical Research Laboratory has developed a "Tactile Situation Awareness System" for providing accurate orientation information in land, sea, and aerospace environments. One application of the system is to alleviate problems related to the spatial disorientation that occurs when a pilot incorrectly perceives the attitude, altitude, or motion of his aircraft; some of this error may be attributable to momentary distraction, reduced visibility, or an increased workload. Because the system (a vibrotactile transducer) can be attached to a portable sensor, it can also be used in such applications as extravehicular space exploration activity or Special Forces operations. Among the benefits claimed for integration of haptics with audio and visual displays are increased situation awareness, the ability to track targets and information sources spatially, and silent communication under conditions where sound is not possible or desirable (e.g., hostile environments) (Naval Aerospace Medical Research Laboratory, 2000).

Interaction Techniques

An obvious application of haptics is to the user interface, in particular its repertoire of interaction techniques, loosely defined as the set of procedures by which basic tasks, such as opening and closing windows, scrolling, and selecting from a menu, are performed (Kirkpatrick & Douglas, 1999). Indeed, interaction techniques have been a popular application area for 2D haptic mice like the Wingman and I-Feel, which work with the Windows interface to add force feedback to windows, scroll bars, and the like. For some of these force-feedback mice, shapes, textures, and other properties of objects (spring, damping) can be "rendered" with JavaScript and the objects delivered for exploration with the haptic mice via standard Web pages. Haptics offers a natural user interface based on the human gestural system. The resistance and friction provided by stylus-based force feedback add an intuitive feel to such everyday tasks as dragging, sliding levers, and depressing buttons. There are more complex operations, such as concatenating or editing, for which a grasping metaphor may be appropriate. Here the whole-hand force feedback provided by glove-based devices could convey the feeling of stacking or juxtaposing several objects or of plucking an unwanted element from a single object. The inclusion of palpable physics in virtual environments, such as the constraints imposed by walls or the effect of altered gravity on weight, may enhance the success of a user's interaction with the environment (Adelstein & Ellis, 2000).

Sometimes too much freedom to move is inefficient and has users going down wrong paths and making unnecessary errors that system designers could help them avoid by the appropriate use of built-in force constraints that encourage or require the user to do things in the "right" way (Hutchins & Gunn, 1999). Haptics can also be used to constrain the user's interaction with screen elements, for example, by steering him or her away from unproductive areas for the performance of specific tasks, or making it more difficult to trigger procedures accidentally by increasing the stiffness of the controls.
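
A built-in force constraint of the kind Hutchins and Gunn describe can be as simple as a spring that pulls the stylus back toward a guide point whenever it strays, while leaving a small deadzone of free motion. The gains below are illustrative assumptions:

```python
def constraint_force(pos, target, k_guide=80.0, deadzone=0.002):
    """Per-axis spring force (N) nudging the stylus toward a guide point.

    Inside the deadzone (m) no force is applied, preserving free motion
    near the intended path; outside it, a spring steers the user back.
    The gain and deadzone are illustrative.
    """
    forces = []
    for p, t in zip(pos, target):
        err = t - p
        forces.append(0.0 if abs(err) <= deadzone else k_guide * err)
    return forces
```

Raising `k_guide` makes the "right" path harder to leave; by the same logic, attaching a stiffer spring to a control makes it harder to trigger accidentally.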

Assistive Technology for the Blind and Visually Impaired

Most haptic systems still rely heavily on a combined visual/haptic interface. This dual modality is very forgiving in terms of the quality of the haptic rendering. This is because ordinarily the user is able to see the object being touched and naturally persuades herself that the force feedback coming from the haptic device closely matches the visual input. However, in most current haptic interfaces, the quality of haptic rendering is actually poor and, if the user closes her eyes, she will only be able to distinguish between very simple shapes (such as balls, cubes, etc.).

To date there has been a modest amount of work on the use of machine haptics for the blind and visually impaired. Among the two-dimensional haptic devices potentially useful in this context, the most recent are the Moose, the Wingman, the I-Feel, and the Sidewinder. The Moose, a 2D haptic interface developed at Stanford (O'Modhrain & Gillespie, 1998), reinterprets a Windows screen with force feedback such that icons, scroll bars, and other screen elements like the edges of windows are rendered haptically, providing an alternative to the conventional graphical user interface (GUI). For example, drag-and-drop operations are realized by increasing or decreasing the apparent mass of the Moose's manipulandum. Although not designed specifically with blind users in mind, the Logitech Wingman, developed by Immersion Corporation and formerly known as the "FEELit" mouse, similarly renders the Windows screen haptically in two dimensions and works with the Web as well, allowing the user to "snap to" hyperlinks or feel the "texture" of a textile using a "FeeltheWeb" ActiveX control programmed through JavaScript. (The Wingman mouse is no longer commercially available.) Swedish researchers have experimented, with mixed results, with two-dimensional haptic devices like the Microsoft Sidewinder joystick in games devised for the visually impaired, such as "Labyrinth," in which users negotiate a maze using force feedback (Johansson & Linde, 1998, 1999).
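
The Moose's drag-and-drop effect, rendering a grabbed icon as added apparent mass, can be sketched as an inertial opposing force. The mass values here are illustrative assumptions, not figures from the Moose itself:

```python
def inertia_force(accel, carrying_icon, m_free=0.02, m_loaded=0.12):
    """Opposing force (N) rendering apparent mass at the manipulandum.

    While an icon is being dragged the apparent mass is increased, so
    the grabbed object literally feels heavier; releasing it restores
    the light free-motion feel. Masses (kg) are illustrative.
    """
    m = m_loaded if carrying_icon else m_free
    return -m * accel   # device resists acceleration like an inertia
```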

Among the three-dimensional haptic devices, Immersion's Impulse Engine 3000 has been shown to be an effective display system for blind users. Colwell et al. (1998) had blind and sighted subjects make magnitude estimations of the roughness of virtual textures using the Impulse Engine and found that the blind subjects were more discriminating with respect to the roughness of texture and had different mental maps of the location of the haptic probe relative to the virtual object than sighted users. The researchers found, however, that for complex virtual objects, such as models of sofas and chairs, haptic information was simply not sufficient to produce recognition and had to be supplemented with information from other sources for all users.

Most of the recent work in 3D haptics for the blind has tended to focus on SensAble's PHANToM. At CERTEC, the Center of Rehabilitation Engineering in Lund, Sweden, in addition to Sjöström's painting application, described earlier (Sjöström, 1997), a program has been developed for "feeling" mathematical curves and surfaces, and a variant of the game "Battleship" that uses force feedback to communicate the different sensations of the "water surface" as bombs are dropped and opponents are sunk. The game is one of the few that can also be enjoyed by deaf-blind children. Blind but hearing children may play "The Memory Game," a variation on "Concentration" based on sound-pair buttons that disappear tactually when a match is made (Rassmus-Gröhn & Sjöström, 1998).

Jansson and his colleagues at Uppsala University in Sweden have been at the forefront of research on haptics for the blind (Jansson, 1998; Jansson & Billberger, 1999; Jansson, Faenger, Konig, & Billberger, 1998). Representative of this work is an experiment reported in Jansson and Billberger (1999), in which blindfolded subjects were evaluated for speed and accuracy in identifying virtual objects (cubes, spheres, cylinders, and cones) with the PHANToM and corresponding physical models of the virtual objects by hand exploration. Jansson and Billberger found that both speed and accuracy in shape identification were significantly poorer for the virtual objects. Speed in particular suffered because the exploratory procedures most natural to shape identification, grasping and manipulating with both hands, could not be emulated by the single-point contact of the PHANToM tip. They also noted that subject performance was not affected by the type of PHANToM interface (thimble versus stylus). However, shape recognition of virtual objects with the PHANToM was significantly influenced by the size of the object, with larger objects being more readily identified. The authors noted that shape identification with the PHANToM is a considerably more difficult task than texture recognition: in the latter case a single lateral sweep of the tip in one direction may be sufficient, but more complex procedures are required to apprehend shape. In Chapter 9 of this volume Jansson reports on his work with nonrealistic haptic rendering and with the method of successive presentation of increasingly complex scenes for haptic perception when visual guidance is unavailable.

Multivis (Multimodal Visualization for Blind People) is a project currently being undertaken at the University of Glasgow, which will utilize force feedback, 3D sound rendering, braille, and speech input and output to provide blind users access to complex visual displays. Yu, Ramloll, and Brewster (2000) have developed a multimodal approach to providing blind users access to complex graphical data such as line graphs and bar charts. Among their techniques are the use of "haptic gridlines" to help users locate data values on the graphs. Different lines are distinguished by applying two levels of surface friction to them ("sticky" or "slippery"). Because these features have not been found to be uniformly helpful to blind users, a toggle feature was added so that the gridlines and surface friction could be turned on and off. Subjects in their studies had to use the PHANToM to estimate the x and y coordinates of the minimum and maximum points on two lines. Both blind and sighted subjects were effective at distinguishing lines by their surface friction. Gridlines, however, were sometimes confused with the other lines, and counting the gridlines from right and left margins was a tedious process prone to error. The authors recommended, based on their observations, that lines on a graph should be modeled as grooved rather than raised ("engraving" rather than "embossing"), as the PHANToM tip "slips off" the raised surface of the line.
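
The sticky/slippery line coding and the gridline toggle described above reduce to a small lookup of friction levels. The coefficients in this sketch are illustrative assumptions, not values from the study:

```python
class HapticGraph:
    """Line-graph haptics in the spirit of Yu, Ramloll, and Brewster.

    Each data line carries one of two surface-friction levels so users
    can tell lines apart by feel; gridlines get a distinct mid-level
    friction and can be toggled off when they cause confusion.
    Friction coefficients are illustrative.
    """
    FRICTION = {"sticky": 0.9, "slippery": 0.1}

    def __init__(self):
        self.gridlines_on = True

    def toggle_gridlines(self):
        self.gridlines_on = not self.gridlines_on

    def friction_at(self, on_data_line, line_style, on_gridline):
        if on_data_line:
            return self.FRICTION[line_style]
        if on_gridline and self.gridlines_on:
            return 0.4   # mid-level feel distinguishes gridlines
        return 0.0       # free space: no surface friction
```

Modeling the lines themselves as grooves rather than ridges, as the authors recommend, would then keep the tip from slipping off while it follows a line.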

Ramloll, Yu, and their colleagues (2000) note that previous work on alternatives to graphical visualization indicates that for blind persons, pitch is an effective indicator of the location of a point with respect to an axis. Spatial audio is used to assist the user in tasks such as detecting the current location of the PHANToM tip relative to the origin of a curve (Ramloll, Yu, et al., 2000). Pitches corresponding to the coordinates of the axes can be played in rapid succession to give an "overview" picture of the shape of the curve. Such global information is useful in gaining a quick overall orientation to the graph that purely local information can provide only slowly, over time. Ramloll et al. also recommend a guided haptic overview of the borders, axes, and curves—for example, at intersections of axes, applying a force in the current direction of motion along a curve to make sure that the user does not go off in the wrong direction.
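
A pitch-based overview of a curve reduces to a linear map from data values to frequencies, played in rapid succession. A minimal sketch; the two-octave range between 220 and 880 Hz is an illustrative choice, not the mapping used in the cited work:

```python
def overview_pitches(ys, f_min=220.0, f_max=880.0):
    """Map data values to pitches (Hz) for a rapid audio overview.

    Higher y maps linearly to higher pitch between f_min and f_max,
    so playing the pitches in order sketches the shape of the curve.
    """
    lo, hi = min(ys), max(ys)
    span = (hi - lo) or 1.0        # avoid division by zero on flat data
    return [f_min + (y - lo) / span * (f_max - f_min) for y in ys]
```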

Other researchers working in the area of joint haptic-sonification techniques for visualization for the blind include Grabowski and Barner (Grabowski, 1999; Grabowski & Barner, 1998). In this work, auditory feedback—physically modeled impact sound—is integrated with the PHANToM interface. For instance, sound and haptics are integrated such that a virtual object will produce an appropriate sound when struck. The sound varies depending on such factors as the energy of the impact, its location, and the user's distance from the object (Grabowski, 1999).
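
In this style of integration, each contact event carries parameters from which a sound is synthesized. The sketch below captures only the dependence on impact energy and listener distance; the inverse-distance mapping and the fixed modal frequency are illustrative simplifications, not the cited physical model:

```python
def impact_sound(energy_j, distance_m, base_freq=440.0):
    """Parameters of a physically motivated impact sound (a sketch).

    Loudness grows with impact energy and falls off with distance via
    an inverse law; the struck object rings at a fixed modal frequency.
    All mappings and constants are illustrative.
    """
    amplitude = energy_j / max(distance_m, 0.1)   # clamp the near field
    return {"freq_hz": base_freq, "amplitude": amplitude}
```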
