Advanced Cockpit Avionics
Aircraft have evolved into some of the most complex and consequential machines ever created. As their size and capability have steadily increased over the years, so too have the challenges involved in their safe operation and effective use. In the following sections we look at some of these challenges and the solutions found through the application of virtual and augmented reality technologies.
The cockpit designs of military aircraft, and in particular fighter jets, have changed significantly over the past several decades. Previously, cockpits were filled with dozens of switches, buttons, and other manual controls, in addition to numerous highly coded dials and gauges providing information on aircraft systems, navigation, weapons status, sensors, and more. Often this information was presented in alphanumeric form (a combination of alphabetic and numeric characters), the totality of which was intended to communicate critical information and help a pilot form a mental image of what was happening outside the aircraft. This complex mental processing task came over and above the actual job of operating the aircraft and solving problems related to the geometry of flight, aerial combat maneuvering, and tactical engagement. The great challenge with these early designs was that the pilot was forced to spend a significant amount of time with his attention focused inside the cockpit, reading dials and gauges or interpreting grainy sensor images, instead of looking outside the aircraft where targets and threats were located. The net result was a frequent sense of information overload, high stress, and a loss of situational awareness.
The move to multifunction displays (small screens surrounded by buttons), within which this same information about aircraft systems, navigation, weapons status, sensors, and so on is logically organized into multiple pages rather than everything being visible at once, helped the information processing task immensely. Similarly, the widespread adoption of HUDs, or head-up displays—a transparent screen, or combiner, typically mounted on the cockpit dash at eye level—and the conversion of some cockpit avionics information from letters and numbers to symbolic representations further eased this burden. But here, the major limiting factor is that the pilot must be looking straight ahead to see this information.
At this point, the next logical step in cockpit design was the movement of the information display from the HUD unit to optical elements mounted within, or directly onto the visor of, the pilot’s helmet. Such systems allow critical information to be displayed to a pilot regardless of where his head is pointing, further maximizing the amount of time a pilot spends looking outside of the aircraft instead of inside of the cockpit. In many regards, helmet-mounted displays can be considered the first widely deployed augmented reality systems.
To date, dozens of different helmet-mounted displays have been developed for fixed- and rotary-wing aircraft around the world, each of which has served at least one of the following purposes:
Display targeting, navigation, and aircraft performance data to the pilot.
Direct high off-boresight (HOBS) air-to-air and air-to-ground weapons.
Slave onboard sensors, such as radar and FLIR, to the pilot's line of sight.
Display sensor video.
Figure 17.10 shows two modern, currently deployed helmet-mounted displays in use within fixed- and rotary-wing aircraft.
Figure 17.10 This image shows two helmet-mounted displays currently in use within U.S. military aircraft. On the left is the GENTEX Scorpion Helmet-Mounted Cueing System in use within the A-10 Thunderbolt and the Air National Guard/Air Force Reserve F-16 Block 30/32 Viper aircraft. On the right is the Thales TopOwl Helmet-Mounted Sight and Display that is operational in five major helicopter programs across 16 countries, including the Cobra AH-1Z and Huey UH-1Y.
Credit: Photos courtesy of Thales—a global technology leader for aerospace, transport, defense and security markets. www.thalesgroup.com
It is important to point out that within this application setting, extremely wide FOV displays are actually considered a hindrance and potentially dangerous. The goal is to provide the pilot with essential information from airborne weapons and sensor targeting suites without cluttering the visual field, which could have disastrous consequences.
F-35 Joint Strike Fighter Helmet-Mounted Display System
The most advanced helmet-mounted display system (HMDS) currently in use, and representative of the absolute state of the art in capabilities, is the one deployed with the Lockheed Martin F-35 Lightning II Joint Strike Fighter.
Built into the lightweight helmet shown in Figure 17.11 is a 30° × 40° binocular FOV, high-brightness, high-resolution display with integrated digital night vision. A fully integrated day and night flight, weapons, and sensor data visualization solution, the system gives pilots immense capabilities, not the least of which is the ability to aim weapons simply by looking at a target. For night missions, in addition to the cueing of sensors and weapons, the system projects the night vision scene directly onto the interior of the visor, eliminating the need for separate night-vision goggles.
Figure 17.11 This image shows an oblique view of the F-35A Lightning II helmet-mounted display, which provides pilots unparalleled situational awareness, with real-time imagery from six sensor packages mounted around the exterior of the aircraft.
Credit: Image courtesy of DoD
One of the most innovative features of this system is the ability to display a spherical 360° view of the world outside of the cockpit as if the airframe were not present, including below and to the sides of the aircraft. Sometimes referred to as a “glass cockpit,” this capability is enabled via an electro-optical Distributed Aperture System, which consists of six high-resolution infrared sensors mounted around the F-35 airframe. The overlapping FOVs of the six sensors are blended to provide unobstructed spherical (4π steradian) imagery as well as missile and aircraft detection and countermeasure cueing. Figure 17.12 provides an example of the view a pilot would receive within the HMD by combining the infrared scene along with avionics and sensor data.
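The core idea of blending overlapping sensor FOVs can be sketched in code. The following is a simplified illustration only, not the actual DAS algorithm: it assumes six hypothetical sensors on orthogonal boresights and weights each sensor's contribution for a given gaze direction by how far inside that sensor's field of view the gaze falls, so that imagery hands off smoothly in the overlap regions.

```python
import math

# Hypothetical boresight directions (unit vectors, aircraft body frame) for
# six distributed sensors -- illustrative placements, not the real F-35 layout.
SENSOR_BORESIGHTS = {
    "nose":  (1, 0, 0),  "tail":  (-1, 0, 0),
    "left":  (0, -1, 0), "right": (0, 1, 0),
    "upper": (0, 0, 1),  "lower": (0, 0, -1),
}

def blend_weights(gaze, half_fov_deg=95.0):
    """Return normalized per-sensor blending weights for a unit gaze vector.

    A sensor contributes only if the gaze lies inside its (assumed) FOV;
    its raw weight grows the closer the gaze is to its boresight, so
    overlapping sensors share the image smoothly at the seams.
    """
    cos_limit = math.cos(math.radians(half_fov_deg))
    raw = {}
    for name, bore in SENSOR_BORESIGHTS.items():
        c = sum(g * b for g, b in zip(gaze, bore))  # cosine of angle to boresight
        if c > cos_limit:                           # gaze inside this sensor's FOV
            raw[name] = c - cos_limit
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}
```

Looking straight ahead, the nose sensor dominates while the side and vertical sensors contribute small shares from their overlap; the tail sensor contributes nothing. A real system would perform this blend per pixel after reprojecting each sensor image onto a common sphere, which this sketch omits.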
Figure 17.12 The F-35’s Distributed Aperture System (DAS) fuses real-time imagery from externally mounted sensors with data provided by onboard avionics systems.
Credit: Image courtesy of S. Aukstakalnis
Pilots within the commercial aviation sector face many of the same information availability and cognitive processing challenges as their military counterparts, but without the added burden of combat maneuvering, weapons targeting and deployment, and so on. In particular, challenges for the commercial aviation sector range from takeoffs and landings in low-visibility conditions such as heavy fog and storms to potential runway incursions when taxiing under the same conditions. These problems are obviously not new, and although aircraft manufacturers and avionics suppliers have integrated head-up display technologies into a variety of aircraft, the challenge of the pilot only being able to see the information while facing forward remains. As such, several manufacturers are now introducing head-mounted displays for commercial aircraft as an option to their cockpit avionics suites. One such company is Elbit Systems, Ltd. of Haifa, Israel.
Skylens is the wearable display component of the Elbit Clearvision Enhanced Flight Vision System (EFVS). Clearvision uses multispectral sensors mounted outside of the aircraft to capture terrain and airport lights in darkness and reduced visibility. This data is fused with topography from a global terrain database as well as conformal flight guidance symbology and, typically, projected onto a fold-down HUD, providing a high-fidelity view of the outside world even when actual visibility is limited or zero.
The Skylens component provides the pilot with the same information that would normally be displayed in the HUD, but in a head-mounted device. By tracking the pilot’s head movements, critical information and symbology can be stabilized and correlated to the real world as the pilot scans the scene, improving the operator’s ability to execute precision and nonprecision approaches and reducing the risks of Controlled Flight into Terrain (CFIT) accidents.
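The essence of that head-tracked stabilization is a change of reference frame: a symbol anchored to a world-fixed direction must shift within the display by exactly the amount the head turns. The sketch below illustrates this with a small-angle, yaw/pitch-only model and illustrative FOV half-angles; it is not Elbit's algorithm, and real conformal symbology uses full 3D rotations and optical distortion correction.

```python
def symbol_display_angles(symbol_az_deg, symbol_el_deg,
                          head_yaw_deg, head_pitch_deg):
    """World-stabilized symbology sketch: subtracting the current head
    yaw/pitch from a symbol's world-fixed azimuth/elevation gives its
    angular offset in the display, so the symbol stays anchored to the
    outside scene as the pilot's head moves."""
    return (symbol_az_deg - head_yaw_deg,
            symbol_el_deg - head_pitch_deg)

def in_fov(offset_deg, half_fov_deg=(20.0, 15.0)):
    """Draw the symbol only when it falls inside the display's field of
    view (half-angles here are illustrative, not Skylens specifications)."""
    dx, dy = offset_deg
    return abs(dx) <= half_fov_deg[0] and abs(dy) <= half_fov_deg[1]
```

For example, a runway-threshold symbol at 10° azimuth sits dead center in the display once the pilot turns his head 10° toward it, and drops out of the drawable region entirely when he looks away, exactly the behavior that lets scanned symbology remain conformal to the real world.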
The Skylens system itself is a monocular, off-the-visor display, the image source for which is a 1280 × 1024 monochromatic (green) microdisplay with an effective area of 1024 × 1024 pixels within a circular aperture. The system uses triple-redundant optical sensors for head tracking.
The general aviation sector also faces the same information availability and mental processing challenges as the commercial and military sectors. Although cockpit avionics in general aviation aircraft have made significant advances over the past decade in enabling the visualization of terrain, navigational aids, hazards, weather, and traffic awareness information on state-of-the-art multifunction displays, here again, accessing the information still requires the pilot to focus attention inside the cockpit and away from the skies. Further, current display technologies still require pilots to mentally convert this complex assortment of 2D information into a 3D mental image of the environment surrounding the aircraft, dramatically increasing workload and stress levels.
But unlike the military and commercial sectors, general aviation enthusiasts have, until recently, not had viable (or affordable) solutions on the horizon. Fortunately, the confluence of advances in augmented reality software and display hardware, as well as seemingly unrelated initiatives with U.S. and international aviation authorities, is resulting in the development of some amazing alternative information display possibilities for general aviation participants.
In a nutshell, if you want to operate an aircraft in designated U.S. airspaces (Class A, B, C, and parts of D and E) after January 1st, 2020, federal regulations require that your aircraft be equipped with what is known as an ADS-B (Automatic Dependent Surveillance-Broadcast) transponder. This small piece of electronics gear, simply referred to as ADS-B OUT, transmits information about your plane’s altitude, airspeed, and GPS-derived location to ground stations, as well as to other aircraft in your vicinity equipped with ADS-B IN receivers. Air traffic controllers and properly equipped aircraft use this information to “see” participating aircraft in real time, with the ultimate goal of improving air traffic management and safety. Typically, the ADS-B data is shown on a 2D multifunction display within the cockpit.
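Before an ADS-B traffic target can be drawn in a cockpit display, its broadcast GPS position must be expressed relative to the ownship. A common first step, sketched below under a flat-earth approximation adequate for the short ranges at which nearby traffic matters, converts the target's latitude, longitude, and altitude into east/north/up offsets in meters; this is a generic geodesy illustration, not any vendor's implementation.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius in meters (spherical approximation)

def enu_offset(own_lat, own_lon, own_alt, tgt_lat, tgt_lon, tgt_alt):
    """Convert an ADS-B target's GPS position (degrees, meters) into
    east/north/up meters relative to the ownship.

    Uses a flat-earth approximation: latitude differences map directly to
    north distance, and longitude differences are scaled by cos(latitude)
    to account for converging meridians.
    """
    north = math.radians(tgt_lat - own_lat) * EARTH_R
    east = math.radians(tgt_lon - own_lon) * EARTH_R * math.cos(math.radians(own_lat))
    up = tgt_alt - own_alt
    return east, north, up
```

A target 0.01° of latitude to the north, for instance, resolves to roughly 1.1 km ahead at the same longitude, which a 3D traffic overlay can then place at its true bearing and relative altitude rather than as a flat map symbol.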
Aero Glass, Inc. of San Diego, California, and Budapest, Hungary, is one of several companies developing a means through which to display ADS-B and other instrument data within augmenting head-mounted displays such as the Epson Moverio and Osterhout Design Group (ODG) R-7 (both of which are detailed in Chapter 5, “Augmenting Displays”). As shown in Figure 17.13, the visual effect is the overlay of this information in graphic and symbolic form onto the user’s real-world view regardless of the position and orientation of the pilot’s head or the aircraft.
Figure 17.13 This image depicts a sample of the aeronautical information that can be displayed using Aero Glass software, an augmented reality head-mounted display, and a sensor that tracks the position and orientation of the pilot’s head.
Credit: Image courtesy of Aero Glass Corporation
As depicted in this image, several of the data types shown, which would normally be represented in 2D on a multifunction display or on a map or chart in the pilot’s lap, actually represent static as well as time-varying 3D phenomena, such as controlled or restricted volumes of airspace, multiple airways, and the position and movement of nearby aircraft. By displaying information in a manner that depicts the actual spatial characteristics of the data, as well as its precise position, the pilot is given a greatly increased level of situational awareness and visual understanding of the real environment through which the aircraft is flying.
The Aero Glass system consists of a software suite that combines ADS-B and other avionics data with information from sensors measuring the pilot’s head position and orientation, plus the display device itself.