Advanced capabilities planned for the next generation of autonomous unmanned vehicles will be based on non-traditional software components containing complex machine learning algorithms. These systems will incorporate adaptive and intelligent control algorithms that provide enhanced safety, autonomy, and high-level decision-making functions currently performed by human operators, as well as robustness in the presence of failures, adverse conditions, and unexpected situations.
Focusing on the aviation context, there are serious barriers to the deployment of autonomous aircraft, both in the military and in the National Airspace System (NAS). Current civil aviation certification processes are based on the idea that the correct behavior of a system or a component must be completely specified and verified prior to operation. Military aircraft acquisition processes make reference to civil certification processes and limitations. These barriers will delay or prevent the deployment of crucial safety functions and new capabilities that could be of great value in both military and civilian applications.
Collins Aerospace, Stanford University, Kestrel Institute, and University of Minnesota have teamed on DARPA's Assured Autonomy program to create new technologies for ensuring the safety of learning-enabled components (LECs) in autonomous vehicle applications. These technologies include:
- Architecture-driven assurance based on compositional reasoning about formally specified contracts for system components, including LECs
- New formal verification methods that enable precise reasoning about LECs
- Machine-checkable architecture-based assurance cases that encompass both design-time and run-time behaviors
- Synthesis of run-time monitors for LECs
- Safety-aware constraints on run-time adaptation of LECs
- Architectural mitigation and enforcement techniques to guarantee safety in the presence of learning and adaptation
- Assurance-optimized system test generation informed by analysis results
Models and code for the project are available on GitHub.
The phase 1 project demonstration implemented a run-time assurance architecture based on the Boeing Deep Taxi autonomy platform. The videos below show the impact of a learning component with a simulated defect, and the effectiveness of the run-time monitoring system in intervening to maintain system safety.
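To make the run-time assurance idea concrete, here is a minimal, illustrative sketch (not the project's actual code) of a simplex-style architecture: a safety monitor checks the LEC's command against a conservative safety envelope, and a verified backup controller takes over when the check fails. All function names, state variables, and numeric thresholds below are hypothetical.

```python
def lec_controller(state):
    # Stand-in for a learned taxi controller: steer toward the centerline.
    # (A defective LEC would be modeled by corrupting this command.)
    return -0.5 * state["cross_track_error"]

def backup_controller(state):
    # Simple verified fallback: stronger proportional steering.
    return -1.0 * state["cross_track_error"]

def monitor_ok(state, cmd, half_width=5.0, horizon=2.0):
    # Predict cross-track position a short horizon ahead under cmd;
    # flag if the aircraft would leave the (hypothetical) runway half-width.
    predicted = state["cross_track_error"] + horizon * (state["cross_track_rate"] + cmd)
    return abs(predicted) < half_width

def assured_command(state):
    # Simplex switch: use the LEC command when the monitor accepts it,
    # otherwise fall back to the verified backup controller.
    cmd = lec_controller(state)
    if monitor_ok(state, cmd):
        return cmd, "LEC"
    return backup_controller(state), "backup"
```

For example, with `{"cross_track_error": 4.0, "cross_track_rate": 3.0}` the monitor predicts a runway departure and the switch selects the backup controller instead of the LEC.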
Defective LEC causes aircraft to depart from the runway:
Run-time assurance architecture maintains system safety in the presence of a defective LEC (aircraft stays on the runway):
The phase 3 demonstration is based on Boeing's conflict avoidance trajectory generation neural network. We used a run-time assurance architecture to monitor the generated trajectory and predict whether it would violate the "remain well clear" criteria relative to the intruder aircraft. This Boeing video describes flight demonstrations with two aircraft flying a variety of encounter geometries to test the performance and safety of the solution.
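The trajectory check can be illustrated with a simplified sketch (not the flight-test code): a candidate ownship trajectory is compared point-by-point against a predicted intruder track, and the monitor flags the trajectory if the minimum separation drops below a threshold. The actual "remain well clear" criteria used in detect-and-avoid standards also involve time-based terms; this sketch captures only a horizontal-distance check, and the threshold value is a hypothetical placeholder.

```python
import math

WELL_CLEAR_RADIUS_M = 926.0  # hypothetical threshold (~0.5 NM)

def min_separation(ownship_traj, intruder_traj):
    # Trajectories are time-aligned lists of (x, y) positions in meters.
    return min(math.dist(p, q) for p, q in zip(ownship_traj, intruder_traj))

def remains_well_clear(ownship_traj, intruder_traj, radius=WELL_CLEAR_RADIUS_M):
    # Predictive monitor: accept the candidate trajectory only if the
    # minimum predicted separation stays at or above the threshold.
    return min_separation(ownship_traj, intruder_traj) >= radius

# Hypothetical encounter: ownship flies north; intruder converges from the east.
own = [(0.0, 1000.0 * t) for t in range(10)]
intruder = [(2000.0 - 100.0 * t, 1000.0 * t) for t in range(10)]
print(remains_well_clear(own, intruder))  # → True (closest approach 1100 m)
```

A faster-converging intruder (e.g. closing at 200 m per step) would bring the minimum separation under the threshold, and the monitor would reject the trajectory.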