The term avionics was first used in the late 1940s as a portmanteau conjoining ‘aviation’ and ‘electronics’, but since then both aviation and electronics have undergone tremendous evolutionary change. Avionics are a quintessential defining characteristic of modern military and civil aircraft, encompassing navigation and communication systems; safety systems such as Traffic Collision Avoidance Systems (TCAS) and Enhanced Ground Proximity Warning Systems (EGPWS); surveillance systems like Automatic Dependent Surveillance–Broadcast (ADS-B); satellite-related services; SONARs; electro-optics; Electronic Warfare, Electronic Support Systems and Electronic Counter Measures; target acquisition and weapons control; Intelligence, Surveillance & Reconnaissance (ISR); flight controls; engine controls; Flight Data Recorders (FDRs) and Cockpit Voice Recorders (CVRs); lighting arrangements; fuel management; radars, including weather radar; performance monitors; flight management and displays; maintenance monitoring such as Health and Usage Management Systems (HUMS); Integrated Modular Avionics (IMA); Head Up Displays (HUD) and wearable displays; Electronic Flight Bags (EFBs); and in-flight connectivity and entertainment, as well as a large number of other electronically controlled or operated services ranging from a simple searchlight system on a helicopter to a rotating radar onboard an AWACS aircraft. The catalogue appears endless, with avionics a faithful and munificent handmaiden of the aviation industry.
As with almost every other industry, Artificial Intelligence (AI) is making significant inroads into aviation too. It is easy to see from the above enumeration of the salient avionics that the associated displays would perforce have to be located in the cockpit, from which the pilot controls the aircraft. In a development significant to avionics, AI has demonstrated the capability to perform the human role in the cockpit at levels consistently better than the best of humans. Aviation enthusiasts were offered a treat with the AlphaDogFight endeavour of the United States (US) Defence Advanced Research Projects Agency (DARPA) in 2020, wherein companies and institutions were invited to develop AI capable of dogfighting in an F-16. In the final round in August 2020, Heron Systems’ AI beat an F-16 pilot, a veteran of the Air Force’s Weapons Instructor Course, five times in a row. AlphaDogFight had a sensational appeal and highlighted the fact that AI holds out numerous other promises for the aviation industry.
Automation and Autonomy
Advances in avionics and Information Technology (IT) have made significant progress in reducing the need for human inputs and activities in the cockpit, especially in the context of repetitive tasks. Thus, a pilot is now much more focussed on monitoring, managing and programming cockpit-located control panels than on traditional tasks related to actual piloting of the aircraft. While the concept of a fully automated aircraft capable of a complete flight from take-off to landing is more applicable to commercial flight, the high degree of automation empowered by avionics is also applicable to military flight insofar as the pilot’s workload management goes. There is thus a paradigm shift in the interface between the pilot and the system (avionics!) wherein the pilot becomes a mission manager while the avionics actually perform control actions within the envelope defined by the pilot. What this means is that avionics are becoming increasingly complex, and the speed of their computational prowess is becoming a critical element of their in-flight capability to approach and manage all types of situations, including unexpected events. AI and its subset Machine Learning (ML) are increasingly being pressed into service to help avionics manage this workload.
This growing complexity has reached a critical threshold: the number of functionalities that current cockpit display designs can accommodate in a manner that a pilot, or a set of pilots, can manage. A decade ago, avionics Original Equipment Manufacturers (OEMs) started experimenting with and developing single touchscreen cockpit displays with built-in flexibility to organise and present information to the pilot, permitting hands-on interaction. The One Display for a Cockpit Interactive Solution (ODICIS) developed by Thales was one such concept that has now been refined into Avionics 2020, which targets both civil and military applications across all market segments including commercial air transport, business jets, helicopters, and military fighters and trainers. It has been especially designed to ease flight operations by reducing or optimising pilot workload, especially during periods of high activity levels, i.e. take-off, approach and landing. It goes beyond Windows, Icons, Menus and Pointers (WIMP) to offer an intuitive, integrated experience on large-format interactive touch screens in conjunction with a wearable display.
While AI is part of the solution to this challenge of avionics competence, it is also the enabler of the ongoing move from automation to autonomy in flight, the ultimate objective being the ability of avionics to perform intricate tasks in complex environments without the need for human intervention. AI is rendering systems that think and make decisions like a human being. Fifth and sixth generation combat aircraft designs under development are increasingly leaning towards optionally manned versions. The implication is AI-assisted avionics permitting unmanned operations. The F-16 already has an unmanned version and the next generation B-21 bomber is also expected to have one. Smaller Unmanned Aerial Vehicles (UAVs) and Unmanned Combat Aerial Vehicles (UCAVs), designed ab initio to fly autonomously, have avionics which, in conjunction with AI, provide for flight path control, sensors management, weapon delivery and tactical manoeuvring in a hostile environment. Projects are under way to employ unmanned swarms as well as mixed manned and unmanned formations. Avionics on both the manned and the unmanned elements would be aided by AI in tactical decision-making. Skyborg and Loyal Wingman are two illustrations of this Manned Unmanned Teaming (MUM-T) concept employing AI-empowered avionics.
The F-35, a fifth-generation combat aircraft, is at the leading edge of the Loyal Wingman concept, but already a sixth generation of combat aircraft is on the anvil, with as many as nine nations working on sixth-generation designs. While the stealth of sixth-generation combat aircraft is expected to be of a higher quality than that of fifth-generation aircraft, their distinguishing characteristic is expected to be an exceedingly high incorporation of AI into their avionics suites through high-speed data links connecting them with entities in the airspace around them and on the surface (land or sea). Some of them could be optionally piloted, with AI replacing the pilot on certain or all missions. Some analysts predict that the optionally piloted design may well be the standard design in two decades. Meanwhile, a new partnership between ANSYS, a simulation software company, and Airbus Defence and Space is developing a new AI design tool to create the embedded flight control software for Europe’s Future Combat Air System (FCAS), an air combat development programme involving France, Germany and Spain, comprising a system of fully automated remote air platforms and sixth-generation fighters. FCAS will replace the current fourth-generation Eurofighter and Rafale jets operated by European nations, which have decided to omit the fifth generation altogether and hop directly to the sixth. Dassault and Airbus are the leading manufacturers for the FCAS programme, which will include avionics using MUM-T, surveillance data fusion and a new generation of satellites to provide networked sensing and data communications.
Software solutions are being sought that will feed motion planning and trajectory data to avionics systems so that they can recognise events in real time, make routine or non-routine corrections and take emergency action. In the case of civil aircraft, there is at least one certified system: Garmin® Autoland, which achieved US Federal Aviation Administration (FAA) certification for general aviation aircraft in May 2020. In the event of pilot incapacitation, the onboard avionics, with embedded AI, can take over control of the aircraft, navigate it to an appropriate diversion, carry out a standard arrival and execute a safe landing. While avionics provide the navigation, communication, approach and auto-landing inputs, AI uses those inputs to control the aircraft just as a pilot would, including two-way communications through onboard avionics with air traffic controllers on the ground.
Airbus and Boeing, the two leaders in the aircraft manufacturing industry, forecast future flights without pilots and are working on pilotless technology, which will first enable single-pilot commercial operations, with fully pilotless flight further away in time. The enabling technology is AI, which will monitor every phase of the flight and ensure absolute compliance and flight safety. AI compares the manufacturer’s stored flight manual performance and handling criteria against the current flight status. It also takes into account digitally transferred external situational data and criteria and monitors the flight. In the first phase, the pilot still has overall command, but the aircraft shares responsibility, while in the pilotless phase, AI will have primary authority for maintaining 100 percent error-free speeds and altitudes, a stall-safe angle of attack and a trajectory safe from collisions with other aircraft or the ground.
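The comparison described above can be thought of as continuous envelope monitoring. The sketch below is a deliberately simplified, hypothetical illustration of the idea: all limit values, field names and alert labels are invented for the example and do not come from any real flight manual or avionics system.

```python
# Hypothetical sketch: a monitor that compares the current flight state
# against stored flight-manual limits, flagging deviations the way an
# autonomous system might before intervening. All values are illustrative.

from dataclasses import dataclass

@dataclass
class FlightManualLimits:
    """Stored performance and handling criteria (invented values)."""
    max_speed_kts: float = 350.0
    stall_speed_kts: float = 110.0
    max_altitude_ft: float = 41000.0
    max_aoa_deg: float = 15.0      # angle-of-attack limit before stall risk

@dataclass
class FlightState:
    """Current flight status as reported by onboard avionics."""
    speed_kts: float
    altitude_ft: float
    aoa_deg: float

def monitor(state: FlightState, limits: FlightManualLimits) -> list[str]:
    """Return a list of envelope violations; an empty list means the
    flight is within the stored flight-manual envelope."""
    alerts = []
    if state.speed_kts > limits.max_speed_kts:
        alerts.append("OVERSPEED")
    if state.speed_kts < limits.stall_speed_kts:
        alerts.append("STALL_RISK")
    if state.altitude_ft > limits.max_altitude_ft:
        alerts.append("ALTITUDE_EXCEEDED")
    if state.aoa_deg > limits.max_aoa_deg:
        alerts.append("AOA_EXCEEDED")
    return alerts

limits = FlightManualLimits()
print(monitor(FlightState(speed_kts=330, altitude_ft=35000, aoa_deg=5), limits))   # []
print(monitor(FlightState(speed_kts=100, altitude_ft=35000, aoa_deg=18), limits))  # ['STALL_RISK', 'AOA_EXCEEDED']
```

A real system would, of course, add rate-of-change trends, sensor cross-checks and graded responses rather than simple threshold flags, but the principle of comparing stored criteria against live state is the same.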
In another DARPA project, called the Aircrew Labour In-Cockpit Automation System (ALIAS), Aurora Flight Sciences has demonstrated the ability to utilise the existing Boeing 737 auto-landing system to autonomously land the aircraft safely in the event of pilot incapacitation. Aurora has demonstrated ALIAS on a simulator at the US Transportation Department’s John A Volpe National Transportation Systems Centre in Cambridge, Massachusetts, and numerous times on aircraft in flight, including a fully automated landing at a simulated site at an altitude of 3,000 feet. The ALIAS system, sometimes called the ‘robotic co-pilot’, is thus an impressive portent of what AI and avionics conjointly hold out for aviation.
Some Major Initiatives
Safety considerations, performance enhancement and cost-benefit factors essentially drive avionics R&D. Some of the areas where AI is impinging on avionics are discussed here.
AI is already at work in avionics, providing military combat aircraft with the best interception profiles, weapon guidance and trajectory, flight departure prevention and ISR. The F-35 has substantial AI assistance in its avionics systems. In June 2020, Lockheed Martin Skunk Works successfully demonstrated an autonomous ISR system to enhance operational effectiveness for a combat aircraft in denied-communications environments. Integrated into an F-16 through a Lockheed Martin-developed pod, the AI-powered avionics system was able to detect and identify the location of the target, automatically route to the target, and capture an image to confirm the target in a simulated denied-communications environment, thus keeping the aircraft safe while achieving its mission.
Another avionics area that AI has brought to fruition is target recognition. Rafael Advanced Defence Systems Ltd. has demonstrated a new Automatic Target Recognition (ATR) capability for its SPICE-250 air-to-ground, stand-off, autonomous weapon system, which can operate in GPS-denied environments, using an Inertial Navigation System (INS) for initial navigation and ATR mode in the target area for autonomous detection and recognition of its individually assigned target. AI and Deep Learning technologies provide the breakthrough enabling the SPICE-250 to effectively learn the characteristics of specific targets in advance of the strike.
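The core idea of ATR, stripped of the deep learning that real systems like SPICE-250 employ, is comparing an imaged scene against target signatures learned in advance. The toy sketch below illustrates only that matching principle with a simple normalised similarity score; the signatures, labels and threshold are entirely invented.

```python
# Toy illustration of the matching principle behind Automatic Target
# Recognition: compare an imaged patch against stored target signatures
# learned before the mission. Real ATR uses deep neural networks on
# imagery; this sketch only shows the comparison step.

def similarity(a, b):
    """Normalised dot-product (cosine) similarity of two flattened patches."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def recognise(patch, signatures, threshold=0.9):
    """Return the best-matching target label, or None if no signature
    matches above the confidence threshold."""
    best_label, best_score = None, 0.0
    for label, sig in signatures.items():
        score = similarity(patch, sig)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Invented example signatures (flattened 2x2 intensity patterns)
signatures = {
    "bridge": [0.9, 0.1, 0.8, 0.2],
    "bunker": [0.2, 0.9, 0.1, 0.8],
}
print(recognise([0.85, 0.15, 0.75, 0.25], signatures))  # bridge
print(recognise([0.5, 0.5, 0.5, 0.5], signatures))      # None (ambiguous scene)
```

The confidence threshold matters operationally: a weapon that cannot match its assigned target above threshold should abort rather than guess, which is why the function returns None instead of the closest label.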
AI’s intervention in avionics applies to emerging designs as well as retro-fits for existing aircraft; typical illustrations are the US Air Force’s F-35s, F-15s and B-2s. AI facilitates rapid database access, organisation of information and performance of high-volume procedural functions, and AI algorithms are increasingly able to scan, view and organise targeting, ISR and sensor input such as navigation information, radar warning information, images or video. The F-35’s avionics use AI iterations to help acquire, organise and present information to the pilot on a single, interactive screen without much human intervention, thus easing the cognitive burden on pilots. The F-35’s ‘sensor fusion’ involves consolidating targeting, navigation and sensor information for pilots. Its Autonomic Logistics Information System (ALIS) involves AI applications wherein computers make assessments, go through checklists, organise information and make decisions by themselves without needing human intervention. It is able to radio back information about engine health or other avionics, thus making the aircraft’s logistics tail more automated.
The B-2, an aircraft built in the 1980s, now has a ‘glass cockpit’ (electronic/digital flight instrument displays, typically large LCD screens, rather than the traditional style of analogue dials and gauges) and a new flight management control processor which increases the performance of the avionics and on-board computer systems by an estimated 1,000 times. The new processor supports automated navigation programmes and expedites the B-2’s ‘fly-by-wire’ technology, enabling the aircraft’s avionics, radar, sensors and communications technologies to better identify and attack enemy targets. The sensor-to-shooter time is greatly reduced, allowing the B-2 to launch weapons much more effectively, thereby reducing its exposure to enemy attacks. AI is being injected into aircraft incrementally, and avionics is the area that receives the maximum benefit of that introduction.
AI has been part of autopilots and Full Authority Digital Engine Control (FADEC) systems for years now. While the former is an avionics system that manipulates aircraft controls in place of the pilot’s repetitive, tiresome physical and mental activity, the latter is an AI-managed aircraft ignition and engine control system used in modern commercial and military aircraft to control all aspects of engine performance digitally, using onboard avionics. Indeed, the aircraft autopilot and the FADEC preceded autonomous car technology. Notably, a significant facet of AI relates to handling and using data. AI performs a wide range of functions not restricted to conventional conceptions associated with computers. Algorithms are designed to instantaneously access immense amounts of data, organise them, compare information and execute automated procedural and analytical functions for human interface. The speed at which AI can utilise high volumes of data is vital for the real-time functioning of avionics, including for weapon systems. In the context of ISR, for example, AI algorithms can scan, view and organise images or video inputs so as to instantaneously identify targets or classify points of combat relevance. Although these tasks can also be performed manually, AI-enabled technology performs procedural functions exponentially faster than humans can, thus enormously shortening the crucial decision-making timeframe for combat decision makers.
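The classification of "points of combat relevance" described above is, at its simplest, a scoring and sorting problem over incoming sensor data. The sketch below is a hypothetical illustration of that procedural function; the contact types, scoring weights and field names are all invented for the example.

```python
# Hypothetical sketch of a procedural ISR function: scoring incoming
# sensor contacts by combat relevance and ordering them for the decision
# maker. Weights and fields are invented for illustration.

def relevance(contact):
    """Higher score = more combat-relevant (illustrative weighting)."""
    threat_weight = {"sam_site": 3.0, "fighter": 2.5, "truck": 1.0}
    score = threat_weight.get(contact["type"], 0.5)
    score += 2.0 / max(contact["range_km"], 1.0)  # closer contacts score higher
    if contact["emitting"]:                        # active radar emission
        score += 1.5
    return score

def prioritise(contacts):
    """Return contacts sorted most-relevant first."""
    return sorted(contacts, key=relevance, reverse=True)

contacts = [
    {"id": "C1", "type": "truck",    "range_km": 5,  "emitting": False},
    {"id": "C2", "type": "sam_site", "range_km": 40, "emitting": True},
    {"id": "C3", "type": "fighter",  "range_km": 20, "emitting": False},
]
print([c["id"] for c in prioritise(contacts)])  # ['C2', 'C3', 'C1']
```

The point of the sketch is the speed argument made above: a machine re-scores and re-sorts every contact on every sensor update in microseconds, a triage task that would saturate a human operator long before the contact list grows large.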
In April 2020, Daedalean, a company building autonomous piloting software systems for civil aircraft, and the European Union Aviation Safety Agency (EASA) jointly published a report entitled “Concepts of Design Assurance for Neural Networks” on a joint project investigating the challenges and concerns of using Neural Networks (NN) in aviation. The report shows how AI is ready to advance to tasks such as safety-critical flight controls and navigation systems. Daedalean has a partnership with business and general aviation avionics maker Avidyne to develop what is described as the first ever ML-based avionics system. Its design incorporates several cameras and a powerful computation unit, interfacing with other aircraft electronics, capable of detecting any airborne or ground-based hazard. The avionics would now comprise optimised classic cockpit instruments, such as collision avoidance systems, flight controllers and autopilots, while also being capable of processing visual images to provide human-like situational awareness, thus optimising flight operations and freeing pilots for safety-critical supervision. Needless to say, AI is the backbone of the supported avionics.
As the ingress of AI into avionics is inevitable and inexorable, the US Air Force plans to spend nearly $100 million over the next five years to shrink the Size, Weight and Power (SWaP) consumption of AI and ML embedded computing avionics for a variety of military aircraft. The objectives are greater sophistication, autonomy, intelligence and assurance for Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) applications and SWaP-constrained aircraft, with special emphasis on advanced computing architectures, algorithms and applications. A major focus involves brain-inspired processing called neuromorphic computing, which involves unconventional circuits and architectures enabled by emerging nano-technology devices such as memristors and nano-photonics. Unconventional computing architectures are necessary for new capabilities in pattern recognition, event reasoning, decision making, adaptive learning and autonomous tasking for energy-efficient agile avionics. Technologies and applications will include AI and ML for big data analytics of multi-source sensor data, data fusion algorithms for situation understanding and sense-making, and autonomous decision-making techniques.
Computer algorithms, enhanced through ML, can enable systems to draw upon vast volumes of historical data as a way to expedite analysis of key mechanical indicators. Avionics systems can then take unstructured information from maintenance manuals, reports, safety materials and history information and use AI to analyse data and draw significant conclusions for predictive maintenance. Onboard avionics can be monitored and analysed using AI-enabled computers to discern when repairs or replacement parts are needed.
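The predictive maintenance logic described above can be illustrated with a minimal sketch: fit a trend to a recorded health indicator and project when it will cross a maintenance threshold. The indicator, values and threshold below are invented for the example; real systems use far richer models over many fused data sources.

```python
# Minimal sketch of AI-assisted predictive maintenance: fit a linear
# trend to a per-flight health indicator (e.g. an engine vibration
# reading) logged by onboard avionics, and estimate how many flights
# remain before it crosses a maintenance threshold. Values are invented.

def linear_fit(ys):
    """Least-squares slope and intercept for evenly spaced samples."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys)) / \
            sum((x - mean_x) ** 2 for x in range(n))
    return slope, mean_y - slope * mean_x

def flights_until_threshold(readings, threshold):
    """Estimate flights remaining before the indicator hits threshold;
    return None if the trend is flat or improving (no repair predicted)."""
    slope, intercept = linear_fit(readings)
    if slope <= 0:
        return None
    flights = (threshold - intercept) / slope - (len(readings) - 1)
    return max(0, round(flights))

vibration = [2.0, 2.1, 2.3, 2.4, 2.6, 2.7]  # rising per-flight readings
print(flights_until_threshold(vibration, threshold=4.0))  # 9
```

Scheduling the part replacement around nine flights hence, rather than waiting for an in-service failure or replacing on a fixed calendar, is precisely the economic case the paragraph above makes for condition-based maintenance.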
AI is being superimposed on avionics to provide more intuitive interfaces, more touch screens, capable new sensors and improved automation aspiring to attain total autonomy in operations. The advances are rapid and portend an eventually pilotless cockpit, with avionics displays surviving briefly in optionally piloted versions. This trend makes sense on two counts. Firstly, the penetration of AI into aviation – as in the case of all industries – is inevitable due to the increasing availability of AI technologies. Secondly, the economics of doing away with expensive (to train and retain) pilots and utilising AI-powered avionics to do the pilot’s job makes that option inescapable.
As future avionics systems integrate multiple functions to enhance performance, simplify maintenance, lower costs and enhance safety, they will be designed increasingly to think and make decisions like human brains – with the assistance of AI, of course. An interesting issue with AI taking avionics closer to autonomous operations is whether the culpability in case of air accidents will shift from pilots (human error) to the avionics OEM. This question brings us to another issue – that of certification which, in the realm of aviation, is stringent, rule-bound and leans heavily towards safety. It would be natural for human beings to have a cognitive bias antagonistic to trusting AI and so certification for AI in avionics is bound to be challenging. Perhaps the approach to this problem would be a strategy to convince the certification authorities that the risk of introducing AI into avionics is manageable.