Scientific article
English

Evaluation of post-hoc interpretability methods in time-series classification

Published in: Nature Machine Intelligence, vol. 5, no. 3, p. 250–260
Publication date: 2023-03-13
First online date: 2023-03-13
Abstract

Post-hoc interpretability methods are critical tools for explaining neural-network results. Several post-hoc methods have emerged in recent years, but they produce different results when applied to a given task, raising the question of which method provides the most accurate interpretation. Quantitative evaluation of interpretability methods is therefore essential to understand the performance of each method; however, currently available frameworks have several drawbacks that hinder the adoption of post-hoc interpretability methods, especially in high-risk sectors. In this work we propose a framework with quantitative metrics to assess the performance of existing post-hoc interpretability methods, with a focus on time-series classification. The framework addresses several drawbacks identified in the literature, namely the dependence on human judgement, the need for retraining, and the shift in the data distribution caused by occluding samples. We also design a synthetic dataset with known discriminative features and tunable complexity. The proposed methodology and quantitative metrics can be used to assess the reliability of interpretability results obtained in practical applications. In turn, they can be embedded within operational workflows in critical fields that require accurate interpretability results, for example, to comply with regulatory policies.
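To make the occlusion-based evaluation concrete, the sketch below illustrates one way such a faithfulness metric can be computed for a time-series classifier: the cells a post-hoc method marks as most relevant are occluded with values drawn from the empirical data distribution (rather than a constant such as zero, which shifts the input distribution), and the resulting drop in accuracy is measured. This is a minimal illustrative Python sketch, not the authors' implementation; the names `evaluate_faithfulness`, `model`, and `relevance`, and the specific masking strategy, are assumptions.

```python
import numpy as np

def evaluate_faithfulness(model, X, y, relevance, frac=0.2, rng=None):
    """Occlude the most-relevant fraction of cells and measure the
    accuracy drop.

    model     : callable mapping an (n, T, d) array to (n,) predicted labels
    X         : (n, T, d) time series (n samples, T timesteps, d channels)
    y         : (n,) true labels
    relevance : (n, T, d) per-cell relevance scores from a post-hoc method
    frac      : fraction of (timestep, channel) cells to occlude
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n, T, d = X.shape
    k = int(frac * T * d)

    X_masked = X.copy()
    for i in range(n):
        # Indices of the k most relevant (timestep, channel) cells.
        flat = np.argsort(relevance[i].ravel())[::-1][:k]
        t_idx, c_idx = np.unravel_index(flat, (T, d))
        # Replace occluded cells with values observed at the same
        # (timestep, channel) position in other samples, so the masked
        # input stays close to the data distribution.
        X_masked[i, t_idx, c_idx] = X[rng.integers(0, n, k), t_idx, c_idx]

    acc_orig = np.mean(model(X) == y)
    acc_masked = np.mean(model(X_masked) == y)
    return acc_orig - acc_masked
```

Under this reading, a larger accuracy drop when occluding the cells a method flags as relevant, compared with occluding randomly chosen cells, indicates that the method highlights genuinely discriminative parts of the signal; on a synthetic dataset with known discriminative features, this can be checked against ground truth.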

Citation (ISO format)
TURBÉ, Hugues et al. Evaluation of post-hoc interpretability methods in time-series classification. In: Nature machine intelligence, 2023, vol. 5, n° 3, p. 250–260. doi: 10.1038/s42256-023-00620-w
Main files (1)
Article (Published version)
Identifiers
Additional URL for this publication: https://www.nature.com/articles/s42256-023-00620-w
Journal ISSN: 2522-5839
225 views
148 downloads

Technical information

Creation: 27/03/2023 16:56:24
First validation: 08/05/2023 15:35:19
Update time: 08/05/2023 15:35:19
Status update: 08/05/2023 15:35:19
Last indexation: 01/11/2024 05:58:24
All rights reserved by Archive ouverte UNIGE and the University of Geneva