Some of the most advanced virtual and augmented reality systems and applications found anywhere in the world are located in NASA laboratories spread across the United States in support of manned and unmanned space operations. Significant time, effort, and expense have been put toward developing a host of facilities and tools that are now used to train every U.S. astronaut who travels to space. Most of that training takes place in the Virtual Reality Laboratory (VRL) at NASA’s Johnson Space Center in Houston, Texas, a snapshot from which is shown in Figure 17.14.
Figure 17.14 This image shows NASA astronaut Michael Fincke using virtual reality hardware in the Space Vehicle Mock-up Facility at NASA’s Johnson Space Center.
Credit: Image courtesy of NASA
In addition to years of traditional training and preparation, NASA makes extensive use of immersive virtual reality systems and related technologies to train the astronauts in four primary areas:
Extra-Vehicular Activity (EVA) Training, which prepares the astronauts for space walks and the tasks they will be performing while outside of the International Space Station (ISS).
Simplified Aid for EVA Rescue (SAFER) Training, which teaches the astronauts how to use a small, self-contained propulsive backpack system worn during spacewalks. In the event that an astronaut becomes detached from the ISS or a spacecraft and floats out of reach, SAFER provides a means of self-rescue.
Robotics Operations, which teaches astronauts the use of robotic systems such as the Canadarm2.
Zero-G Mass Handling, which simulates the mass characteristics of objects in a microgravity environment. (Remember that while objects in space may be weightless, they are not massless; a sufficiently massive object still presents formidable handling and maneuvering challenges.)
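The distinction between weight and mass in the last item can be made concrete with a little arithmetic. The following sketch (illustrative only, not NASA's simulation code; the object mass and force values are hypothetical) applies Newton's second law to show how long it takes to stop a massive object drifting in microgravity:

```python
# Even in microgravity, Newton's second law (F = m * a) still governs
# how hard an object is to start or stop moving.

def time_to_stop(mass_kg, speed_m_s, force_n):
    """Seconds of constant braking force needed to bring a drifting mass to rest."""
    accel = force_n / mass_kg    # deceleration the applied force produces (a = F / m)
    return speed_m_s / accel     # time to cancel the drift velocity (t = v / a)

# Hypothetical example: a 500 kg equipment rack drifting at 0.1 m/s,
# arrested with a steady 10 N push (about the force of holding a 1 kg
# weight on Earth) takes a full 5 seconds of sustained effort to stop.
print(time_to_stop(500, 0.1, 10))
```

The "weightless" rack still resists the astronaut's push exactly in proportion to its mass, which is precisely the handling behavior the Zero-G Mass Handling training simulates.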
Despite the upward of two years of training astronauts undergo prior to space travel, it is impossible to carry out missions without extensive assistance from team members and subject matter experts on the ground. From solving engineering problems to properly operating onboard experiments, significant effort goes into helping astronauts safely and effectively carry out their mission objectives. To this end, NASA is constantly investigating methods of rendering this support beyond standard radio, video, and textual communications. One such investigation underway at the time this book was written is known as Project Sidekick.
Leveraging advances in augmenting display technologies such as those provided by Microsoft’s HoloLens (see Chapter 5), the goal of this project is to explore the use of an immersive procedural reference (that is, a manual or guidebook) and remote assistance system developed to provide crew members with information and task support whenever needed. Based on the concept of a mixed reality setting (combining the physical environment and virtual objects), high-definition holograms displayed within the HoloLens device can be integrated into the astronaut’s real-world view within the Space Station, enabling new ways to access and exchange key information and guidance between personnel on orbit and individuals on the ground. Figure 17.15 shows the HoloLens device in pre-deployment testing.
Figure 17.15 This image shows NASA and Microsoft engineers testing Project Sidekick on NASA’s Weightless Wonder C9 jet. Project Sidekick will use Microsoft HoloLens to provide virtual aid and ground-based assistance to astronauts working on the International Space Station.
Credit: Image courtesy of NASA
At the time of this writing, the Sidekick system had two basic operating modes: Standalone (a procedural reference system) and Remote Expert:
Standalone Mode gives the astronaut access to an extensive preloaded manual with instructions, procedures, and checklists displayed as holograms that can be placed anywhere the astronaut finds it most convenient.
Remote Expert Mode is a video teleconference capability enabling real-time, first-person assistance from ground personnel. In operation, the crew member opens a holographic video screen on which he or she can see the flight control team, a system expert, or a payload developer. Using the HoloLens device’s built-in camera, the ground crew is able to see the astronaut’s work area and offer direct assistance in support of the task objectives.
The Sidekick project is only one of several applications for the HoloLens and other head-mounted augmenting displays within various manned and unmanned space programs. Another project, referred to as OnSight, currently places Earth-based scientists and engineers within a virtual re-creation of the Curiosity rover’s operational environment on Mars. Using data sent back from the rover, 3D models are generated and displayed within the HoloLens device, enabling scientists to freely explore the area from a first-person perspective, plan new rover operations, and preview the results of past system tasking.