Scientific article
Open access
English

Evaluation of post-hoc interpretability methods in time-series classification

Published in: Nature Machine Intelligence, vol. 5, no. 3, p. 250-260
Publication date: 2023-03-13
First online date: 2023-03-13
Abstract

Post-hoc interpretability methods are critical tools to explain neural-network results. Several post-hoc methods have emerged in recent years, but they produce different results when applied to a given task, raising the question of which method is the most suitable to provide accurate post-hoc interpretability. To understand the performance of each method, quantitative evaluation of interpretability methods is essential; however, currently available frameworks have several drawbacks that hinder the adoption of post-hoc interpretability methods, especially in high-risk sectors. In this work we propose a framework with quantitative metrics to assess the performance of existing post-hoc interpretability methods, particularly in time-series classification. We show that our framework addresses several drawbacks identified in the literature, namely the dependence on human judgement, the need for retraining, and the shift in the data distribution when occluding samples. We also design a synthetic dataset with known discriminative features and tunable complexity. The proposed methodology and quantitative metrics can be used to understand the reliability of interpretability results obtained in practical applications. In turn, they can be embedded within operational workflows in critical fields that require accurate interpretability results, for example, to satisfy regulatory policies.
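To illustrate the kind of occlusion-based faithfulness check the abstract alludes to, the sketch below builds a toy linear "classifier" on a univariate series, attributes relevance to each time step, occludes the most relevant steps with the series mean, and measures the resulting score drop. Everything here (the toy model, the relevance attribution, the mean-replacement choice) is a hypothetical illustration under our own assumptions, not the paper's actual framework or metrics; note that mean replacement is exactly the kind of occlusion that can shift the data distribution, one of the drawbacks the paper discusses.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 50

# Toy linear "classifier": scores a series with a fixed weight vector
# that is non-zero only on a known discriminative region (hypothetical).
weights = np.zeros(T)
weights[10:15] = 1.0

def model_score(x):
    return float(weights @ x)

# Synthetic series with signal injected at the discriminative region.
x = rng.normal(size=T)
x[10:15] += 3.0

# Stand-in relevance attribution: the gradient of a linear model is its
# weight vector, so per-step relevance is simply |weights| here.
relevance = np.abs(weights)

# Occlusion-style faithfulness check: replace the top-k most relevant
# time steps with the series mean and measure the drop in model score.
# A faithful attribution should cause a large drop.
k = 5
top_idx = np.argsort(relevance)[::-1][:k]
x_occluded = x.copy()
x_occluded[top_idx] = x.mean()  # mean replacement shifts the distribution
drop = model_score(x) - model_score(x_occluded)
print(round(drop, 3))
```

In this toy setup the attribution is exact, so occluding the top-ranked steps removes most of the score; comparing such drops across attribution methods is one common way to rank their faithfulness quantitatively, without human judgement or retraining.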

Citation (ISO format)
TURBÉ, Hugues et al. Evaluation of post-hoc interpretability methods in time-series classification. In: Nature machine intelligence, 2023, vol. 5, n° 3, p. 250–260. doi: 10.1038/s42256-023-00620-w
Main files (1)
Article (Published version)
Identifiers
ISSN of the journal: 2522-5839
105 views
100 downloads

Technical information

Creation: 03/27/2023 2:56:24 PM
First validation: 05/08/2023 1:35:19 PM
Update time: 05/08/2023 1:35:19 PM
Status update: 05/08/2023 1:35:19 PM
Last indexation: 05/06/2024 3:50:07 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva