Run-Time Assurance for Learning-Enabled Systems
D. Cofer, I. Amundson, R. Sattigeri, A. Passi, C. Boggs, E. Smith, L. Gilham, T. Byun, S. Rayadurgam
NASA Formal Methods Symposium, May 2020
There has been much publicity surrounding the use of machine learning technologies in self-driving cars and the challenges this presents for guaranteeing safety.
These technologies are also being investigated for use in manned and unmanned aircraft. However, the learning-enabled components (LECs)
of such systems and their software implementations are not amenable to verification and certification using current methods.
We have developed a demonstration of a run-time assurance architecture, based on a neural network aircraft taxiing application, that shows how several
advanced technologies could be used to ensure safe operation. The demonstration system includes a safety architecture based on the ASTM F3269-17
standard for bounded behavior of complex systems, diverse run-time monitors of system safety, and formal synthesis of critical high-assurance components.
The enhanced system demonstrates that the run-time assurance architecture can maintain system safety in the presence of defects in the underlying LEC.
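
To make the switching idea concrete, the sketch below illustrates a simplex-style run-time assurance switch in the spirit of the ASTM F3269-17 bounded-behavior pattern: a run-time monitor checks whether the system state lies inside an assured operating region, routing control to the neural-network LEC when it does and to a verified recovery controller when it does not. All names, bounds, and controller gains (State, MAX_CTE, lec_controller, recovery_controller, rta_switch) are hypothetical placeholders for illustration, not the implementation described in the paper.

# Minimal sketch (assumed, not from the paper) of a simplex-style
# run-time assurance switch for the taxiing scenario.

from dataclasses import dataclass

@dataclass
class State:
    cte: float          # cross-track error from runway centerline (meters)
    heading_err: float  # heading error (radians)

MAX_CTE = 8.0  # hypothetical safety bound on centerline deviation

def lec_controller(state: State) -> float:
    """Placeholder for the untrusted neural-network taxiing controller."""
    return -0.1 * state.cte - 0.5 * state.heading_err

def recovery_controller(state: State) -> float:
    """Placeholder for a verified, conservative backup controller."""
    return -0.05 * state.cte

def monitor_ok(state: State) -> bool:
    """Run-time monitor: is the aircraft inside the assured operating region?"""
    return abs(state.cte) < MAX_CTE

def rta_switch(state: State) -> float:
    """Route to the LEC when the monitor passes, otherwise to recovery."""
    if monitor_ok(state):
        return lec_controller(state)
    return recovery_controller(state)

if __name__ == "__main__":
    print(rta_switch(State(cte=2.0, heading_err=0.05)))   # LEC in control
    print(rta_switch(State(cte=12.0, heading_err=0.05)))  # recovery engaged

In a real system the monitor would typically combine diverse checks (e.g., geometric bounds and independent sensing) and the recovery controller would be formally synthesized or verified, as the paper's architecture describes.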