Enhancing Trust in Machine Learning Systems by Formal Methods: with an application to a meteorological problem
With the deployment of applications based on machine learning techniques, the need for understandable explanations of these systems' results becomes evident. This paper clarifies the concept of an "explanation": the main goal of an explanation is to build trust in the recipient of the explanation. This can only be achieved by creating an understanding of the AI system's results in terms of the users' domain knowledge. In contrast to most approaches found in the literature, which base the explanation of an AI system's results on the model produced by the machine learning algorithm, this paper seeks an explanation in the specific expert knowledge of the system's users. The domain knowledge is defined as a formal model derived from a set of if-then rules provided by experts. The result of the AI system is represented as a proposition in a temporal logic, and we then attempt to formally prove this proposition within the domain model, using model checking algorithms and tools. If the proof succeeds, the result of the AI system is consistent with the model of the domain knowledge. Because the model contains the rules it is based on, the path representing the proof can be translated back to those rules: this explains why the proposition is consistent with the domain knowledge. The paper describes the application of this approach to a real-world example from meteorology: the short-term forecasting of cloud coverage for particular locations.
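The core idea of the abstract — encode expert if-then rules as a transition system, then check a temporal proposition against it and read the witness path back as an explanation — can be illustrated with a minimal sketch. The rules, state names, and functions below are invented for illustration and are not the paper's actual model or tooling; a real application would use a model checker such as a dedicated tool rather than this hand-rolled reachability check.

```python
from collections import deque

# Toy transition system derived from hypothetical expert if-then rules
# (all states and rules are invented for illustration), e.g.:
#   "if the sky is clear and humidity rises, fog may form"
#   "if fog has formed, low cloud coverage follows"
RULES = {
    "clear": ["clear", "humid"],
    "humid": ["fog", "clear"],
    "fog": ["low_clouds"],
    "low_clouds": ["low_clouds", "clear"],
}

def eventually(start, target):
    """Check the temporal proposition 'EF target': is `target`
    reachable from `start` in the rule-based transition system?"""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in RULES.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def witness_path(start, target):
    """Return one path proving the proposition. Each edge on the path
    corresponds to an expert rule, so the path can be translated back
    into a user-level explanation."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return path[::-1]
        for nxt in RULES.get(state, []):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None
```

For example, `witness_path("clear", "low_clouds")` yields the sequence of states whose connecting rules explain why a forecast of low cloud coverage is consistent with the expert knowledge.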
Authors:
- Tavolato, Paul
- Tavolato-Wötzl, Christina
Category: Paper in Conference Proceedings or in Workshop Proceedings (Paper)
Event Title: Machine Learning and Knowledge Extraction
Divisions: Security and Privacy
Subjects: Angewandte Informatik (Applied Computer Science)
Event Location: Benevento, Italy
Event Type: Conference
Event Dates: 29 Aug - 01 Sep 2023
Series Name: Lecture Notes in Computer Science
Publisher: Springer
Page Range: pp. 170-187
Date: 2023