# Supporting Self-adaptation via Quantitative Verification and Sensitivity Analysis at Run Time

### Abstract

Modern software-intensive systems often interact with an environment in which changes occur unpredictably. Such changes may jeopardize a system’s ability to meet its requirements; it is therefore desirable to design software so that it can self-adapt to changes with limited, or even no, human intervention.

By bringing software models and model checking to run time, it becomes possible to reason automatically about changes: once a change is detected, the system can predict whether failures may occur and enact appropriate counter-actions. However, existing mainstream model checking techniques and tools were not conceived for run-time usage, and they hardly meet the constraints that on-the-fly analysis imposes on execution time and memory usage.

This paper addresses this issue, focusing on the continuous satisfaction of non-functional requirements (e.g., reliability). Its main contribution is a mathematical framework for run-time efficient probabilistic model checking: the approach statically generates a set of verification conditions that can be evaluated efficiently at run time, as soon as changes occur. The approach also supports a sensitivity analysis mechanism, which enables reasoning about the effects of changes and can drive effective adaptation strategies.
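To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' artifact): a tiny discrete-time Markov chain with two transition probabilities, `x` and `y`, that are only known through run-time monitoring. The reachability equations are solved symbolically once at design time, yielding a closed-form verification condition; at run time, checking the reliability requirement reduces to a constant-time evaluation of that expression, and its partial derivative provides the sensitivity information. All state names, thresholds, and function names here are illustrative assumptions.

```python
# Hypothetical DTMC. States: start, retry, success (absorbing), failure
# (absorbing). Transitions: start -> retry with prob x, start -> success
# with prob 1 - x; retry -> start with prob y, retry -> failure with
# prob 1 - y. Solving the reachability equations once at design time,
#   p_start = x * p_retry + (1 - x),   p_retry = y * p_start,
# gives the closed-form verification condition
#   p_start(x, y) = (1 - x) / (1 - x * y).

def reliability(x: float, y: float) -> float:
    """Precomputed verification condition: probability of reaching success."""
    return (1 - x) / (1 - x * y)

def sensitivity_x(x: float, y: float) -> float:
    """Partial derivative of reliability w.r.t. x, derived at design time:
    d/dx [(1 - x) / (1 - x*y)] = (y - 1) / (1 - x*y)**2."""
    return (y - 1) / (1 - x * y) ** 2

# Run time: monitored estimates of x and y arrive. Verifying the
# (illustrative) requirement "reliability >= 0.15" is now a cheap
# numeric evaluation -- no model checker is re-run.
x_obs, y_obs = 0.9, 0.5
r = reliability(x_obs, y_obs)
print(r, sensitivity_x(x_obs, y_obs))
if r < 0.15:
    print("requirement violated: trigger adaptation")
```

The negative sensitivity value indicates that the reliability decreases as `x` grows, which an adaptation strategy could exploit to decide which parameter is most worth influencing.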

Authors: Antonio Filieri, Giordano Tamburrelli, and Carlo Ghezzi

Paper (PDF)

### Supplementary material
