Relative Feature Importance
Interpretable Machine Learning (IML) methods are used to gain insight into the relevance of a feature of interest for the performance of a model. Commonly used IML methods differ in whether they consider features of interest in isolation, e.g., Permutation Feature Importance (PFI), or in relation to all remaining feature variables, e.g., Conditional Feature Importance (CFI). As such, the perturbation mechanisms inherent to PFI and CFI represent extreme reference points. We introduce Relative Feature Importance (RFI), a generalization of PFI and CFI that allows for a more nuanced feature importance computation beyond the PFI versus CFI dichotomy. With RFI, the importance of a feature relative to any other subset of features can be assessed, including variables that were not available at training time. We derive general interpretation rules for RFI based on a detailed theoretical analysis of the implications of relative feature relevance, and demonstrate the method's usefulness on simulated examples.
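
The perturbation idea behind RFI can be illustrated with a minimal sketch (this is not the authors' implementation): to score a feature j relative to a conditioning set G, the feature is resampled from an approximation of its conditional distribution given the features in G, and the resulting increase in loss is measured. An empty G recovers a PFI-style (marginal) score, while G containing all remaining features approximates a CFI-style (conditional) score. The function name `rfi_score`, the toy data, and the simple linear-Gaussian conditional sampler below are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split


def rfi_score(model, X, y, j, G, n_repeats=10, rng=None):
    """Importance of feature j relative to the feature set G.

    G == []                 -> marginal perturbation (PFI-like)
    G == all other features -> conditional perturbation (CFI-like)
    The conditional sampler here is a crude linear-Gaussian
    approximation used only for illustration.
    """
    rng = np.random.default_rng(rng)
    base_loss = mean_squared_error(y, model.predict(X))
    losses = []
    for _ in range(n_repeats):
        X_pert = X.copy()
        if len(G) == 0:
            # Marginal perturbation: permute the column of feature j.
            X_pert[:, j] = rng.permutation(X[:, j])
        else:
            # Conditional perturbation: regress x_j on X_G and
            # resample by shuffling the residuals.
            sampler = LinearRegression().fit(X[:, G], X[:, j])
            fitted = sampler.predict(X[:, G])
            residuals = X[:, j] - fitted
            X_pert[:, j] = fitted + rng.permutation(residuals)
        losses.append(mean_squared_error(y, model.predict(X_pert)))
    # Importance = average loss increase under perturbation.
    return np.mean(losses) - base_loss


X, y = make_regression(n_samples=1000, n_features=5, noise=0.5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)

print("PFI-like:", rfi_score(model, X_te, y_te, j=0, G=[]))
print("CFI-like:", rfi_score(model, X_te, y_te, j=0, G=[1, 2, 3, 4]))
print("RFI relative to feature 1:", rfi_score(model, X_te, y_te, j=0, G=[1]))
```
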

- König, Gunnar
- Molnar, Christoph
- Bischl, Bernd
- Grosse-Wentrup, Moritz

Category | Paper in Conference Proceedings or in Workshop Proceedings (Paper)
Event Title | 25th International Conference On Pattern Recognition
Divisions | Neuroinformatics
Event Location | Virtual Event
Event Type | Conference
Event Dates | 10.-15.01.2021
Date | 10 January 2021
