Explainable Verification
Abstract
The cost of software failure is staggering. A rigorous approach to improving software quality and reducing bugs is sound program analysis based on abstract interpretation. In this project, we will push the state of the art in the abstract interpretation of multi-threaded programs written in complex mainstream programming languages, such as C. Our goal is to increase trust in the verification process through explainability. To build trust in the analysis specification, we will explain its mathematical foundations and essence; to increase trust in the implementation, we will enable tools to explain their results to other tools for validation; to gain the trust of end users, we will enable tools to also explain their reasoning in human-readable form. To achieve these goals, the project will combine research in sound program analysis with compositional methods from programming language theory and category theory.