Aerospace and Defense
AR expert Steve Aukstakalnis presents case studies in which augmenting and immersive displays, spatial audio, and tactile and force feedback systems are used to leverage strengths of the human perceptual system in the control of complex machines such as jet aircraft, to train astronauts, and to help refine the skill sets and situational awareness of soldiers on the battlefield.
Applications for immersive and augmenting display technologies are widespread within the aerospace and defense communities of the United States and most other industrialized nations. From leveraging strengths of the human perceptual system in the control of complex machines such as jet aircraft, to training astronauts and helping refine skill sets and situational awareness of soldiers, virtual and augmented reality systems are having a solid impact on performance and cost efficiency. In this chapter we explore a number of such applications, detailing the benefits gained and some of the challenges still faced.
Flight Simulation and Training
Safely piloting an aircraft is an acquired talent. At the most basic level, it requires dozens of hours of actual flight time, plus classroom study, to develop and demonstrate the legally recognized skill set and proficiency level necessary to become a licensed pilot. The more complex the aircraft, the greater the number of hours and specialized training necessary to learn how to safely and effectively handle its increasingly complicated systems.
This training methodology works sufficiently well up until the point that advanced skills are needed, such as flying in formation or aerial refueling. At that point, the training challenges and expense are magnified significantly to include the need for additional aircraft and crews, high-end simulators, and more.
Systems Technology, Inc. of Hawthorne, California, asked this question: Can we use an actual aircraft as a simulator and get the best of both worlds? The answer is yes. In collaboration with NASA’s Armstrong Flight Research Center at Edwards, California, and the National Test Pilot School in Mojave, California, engineers have developed an innovative combination virtual/augmented reality system known as Fused Reality that enables any aircraft to be used as a flying simulator.
As shown in Figure 17.1, the heart of the system is a fully immersive stereoscopic head-mounted display customized to include a centrally mounted video camera. Video signals from this camera are sent to a high-performance notebook computer, which itself is connected to the aircraft avionics data bus. Specialized software algorithms analyze the video signal and determine, quite literally, where the cockpit ends and the windscreen and windows begin. It is into these spaces (the windscreen and windows) that computer-generated imagery is placed within the video signal returned to the display and presented to the user.
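The core idea of the pipeline, separating window regions from cockpit interior in each camera frame and compositing computer-generated imagery only into those regions, can be illustrated with a minimal sketch. This is not the actual Fused Reality algorithm; it assumes a simple luminance key (daylight through the windscreen is far brighter than the cockpit interior), whereas the real system uses more robust segmentation:

```python
import numpy as np

def composite_windows(camera_frame, cg_frame, key_threshold=200):
    """Composite CG imagery into the bright 'window' regions of a
    cockpit camera frame.

    Simplified luminance-key sketch: build a per-pixel mask of where
    the windows are, then substitute the CG scene only there.
    Both frames are HxWx3 uint8 arrays of the same shape.
    """
    # Pixels bright in all three channels are treated as "window."
    mask = np.all(camera_frame >= key_threshold, axis=-1)
    out = camera_frame.copy()
    out[mask] = cg_frame[mask]  # CG imagery replaces window pixels only
    return out, mask

# Synthetic example: a dark cockpit with a bright window patch.
frame = np.full((4, 4, 3), 50, dtype=np.uint8)
frame[:2, :2] = 255                       # bright "window" region
cg = np.zeros_like(frame)
cg[..., 1] = 128                          # green virtual scene
out, mask = composite_windows(frame, cg)
```

In the composited result, the window patch now shows the green virtual scene while the cockpit pixels pass through untouched, which is exactly the behavior the paragraph above describes.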
Figure 17.1 The Fused Reality head-mounted display shown in this image provides the user a combined view of the actual cockpit interior and instruments as well as computer-generated imagery beginning at the windows.
Credit: Image courtesy of NASA
The orientation of the user’s head (roll, pitch, and yaw) is monitored using IMUs built into the display unit. That information, as well as data from the avionics bus such as movement of aircraft controls, airspeed, and heading, is combined to generate and precisely register the computer-generated imagery.
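Registering the imagery amounts to chaining two attitudes: the aircraft's attitude (from the avionics bus) and the head's orientation relative to the cockpit (from the display's IMUs). A minimal sketch of that composition follows; the function names and the Z-Y-X angle convention are illustrative assumptions, not details of the Fused Reality implementation:

```python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix (body -> reference frame) from yaw, pitch, and
    roll in radians, applied in Z-Y-X (yaw, then pitch, then roll) order."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def world_to_display(v_world, aircraft_ypr, head_ypr):
    """Express a world-frame direction in the pilot's head (display) frame.

    Composes aircraft attitude with head-relative orientation; the
    world-to-head transform is the transpose of the combined
    body-to-world rotation. Hypothetical helper for illustration.
    """
    R_aircraft = rot_zyx(*aircraft_ypr)   # aircraft body -> world
    R_head = rot_zyx(*head_ypr)           # head -> aircraft body
    return (R_aircraft @ R_head).T @ v_world

# With level flight and the head yawed 90 degrees right, a direction
# straight ahead of the aircraft appears off the pilot's left shoulder.
v = world_to_display(np.array([1.0, 0.0, 0.0]),
                     aircraft_ypr=(0.0, 0.0, 0.0),
                     head_ypr=(np.pi / 2, 0.0, 0.0))
```

With both attitudes known each frame, a virtual tanker at a fixed world position can be redrawn in the correct spot in the display no matter how the pilot turns or the aircraft maneuvers.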
The Fused Reality system provides two primary operating modes. The first, shown in Figure 17.2, provides a real-world view of the interior of the cockpit, but everything seen outside of the windscreen and windows is completely computer generated. Such a capability provides enormous flexibility in the creation of training scenarios. The user could actually be flying high above a barren desert but be presented with a detailed mountain scene within the display. Complicated approaches and precision runway or carrier landings can be practiced thousands of feet in the air. Or, as is depicted in Figure 17.2, complex aerial refueling operations and other formation flying scenarios can be practiced even though there are no other aircraft for miles in any direction. Needless to say, in this operating mode having a safety pilot in the cockpit is highly recommended.
Figure 17.2 One operating mode of the Fused Reality system displays a completely computer-generated virtual environment beyond the edge of the pilot’s view of the physical control panel. In this snapshot of an aerial refueling simulation, the pilot attempts to connect a virtual receiver probe into a drogue receptacle extending from the wing of a computer-generated tanker.
Credit: Image courtesy of NASA
The second operating configuration, shown in Figure 17.3, is referred to as “stencil mode.” This configuration gives the user a real-world view of both the cockpit interior and the scene outside of the aircraft, but with computer-generated objects such as aircraft added into that outside view. Here again, the breadth of potential application scenarios is limitless. Pilots can practice and hone skills at a fraction of the cost, and without the danger, of traditional real-world training missions involving other aircraft and crews. If you collide with a virtual aircraft in these simulations, you simply reset the training application and start again (Merlin, 2015).
Figure 17.3 This image shows the Fused Reality system operating in stencil mode, within which a computer-generated virtual tanker is displayed over the real scene of the outside world.
Credit: Image courtesy of NASA
In addition, the Fused Reality system holds several other distinct advantages over traditional ground-based simulators used to develop and hone advanced flight skills. Even the most advanced full-motion flight simulators are unable to re-create the internal sensations of g-loading and its subtle vestibular effects, airframe buffet cues, or the feel of energy bleed. By taking the simulator aloft, these important perceptual cues are preserved.