Description
Software designs for perception, decision-making, planning and control in safety-critical cyber-physical systems (CPS) increasingly use components based on deep learning. It is well known that the behavior of a learning-enabled component (LEC) may be difficult to predict when its input data is sufficiently different from the data used to train the component. In safety-critical CPS, this could mean that the LEC makes decisions that lead to unsafe actions for the overall system. To gain assurance about system behavior, it is critical that we formally characterize the assumptions on the inputs to the LEC and the guarantees that the LEC provides under these assumptions. In this paper, we formulate a framework based on envelope-invariant pairs for the LEC that allows us to compositionally reason about overall system safety. We demonstrate our technique on a prototype of an advanced driver-assistance system in a modern car.
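
As a rough reading of what an envelope-invariant pair might look like (the precise definitions are given in the paper; the symbols $E$, $I$, and the input/output spaces below are illustrative assumptions, not taken from the abstract): let $f : X \to Y$ denote the LEC. An envelope-invariant pair can be viewed as a contract $(E, I)$ with an envelope $E \subseteq X$ of admissible inputs and an invariant $I \subseteq Y$ on outputs, such that
$$\forall x \in X:\quad x \in E \;\Rightarrow\; f(x) \in I.$$
Compositional reasoning then proceeds assume-guarantee style: a downstream component may assume $f(x) \in I$ whenever the rest of the system ensures $x \in E$, so system-level safety can be argued from the contracts of the individual components rather than from the LEC's internals.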