Paper Title

Instance-based Counterfactual Explanations for Time Series Classification

Authors

Delaney, Eoin, Greene, Derek, Keane, Mark T.

Abstract

In recent years, there has been a rapidly expanding focus on explaining the predictions made by black-box AI systems that handle image and tabular data. However, considerably less attention has been paid to explaining the predictions of opaque AI systems handling time series data. In this paper, we advance a novel model-agnostic, case-based technique -- Native Guide -- that generates counterfactual explanations for time series classifiers. Given a query time series, $T_{q}$, for which a black-box classification system predicts class, $c$, a counterfactual time series explanation shows how $T_{q}$ could change, such that the system predicts an alternative class, $c'$. The proposed instance-based technique adapts existing counterfactual instances in the case-base by highlighting and modifying discriminative areas of the time series that underlie the classification. Quantitative and qualitative results from two comparative experiments indicate that Native Guide generates plausible, proximal, sparse and diverse explanations that are better than those produced by key benchmark counterfactual methods.
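The core idea in the abstract, retrieving an existing counterfactual instance from the case-base (a "nearest unlike neighbor") and adapting it by modifying a discriminative region of the query series, can be illustrated with a minimal sketch. This is not the paper's actual Native Guide implementation: the real method weights discriminative areas using feature-importance information (e.g., class activation maps), whereas this toy version substitutes a brute-force search over contiguous windows copied from the neighbor, and the classifier and data below are invented for illustration.

```python
import numpy as np

def nearest_unlike_neighbor(query, X, y, query_class):
    """Return the training series closest to the query (Euclidean) with a different class."""
    candidates = X[y != query_class]
    dists = np.linalg.norm(candidates - query, axis=1)
    return candidates[np.argmin(dists)]

def counterfactual(query, nun, classify, query_class):
    """Grow a contiguous window copied from the NUN into the query until the class flips.

    Searching smallest windows first keeps the explanation sparse and proximal.
    """
    n = len(query)
    for w in range(1, n + 1):
        for start in range(0, n - w + 1):
            cf = query.copy()
            cf[start:start + w] = nun[start:start + w]
            if classify(cf) != query_class:
                return cf
    return nun  # fallback: the NUN itself is always a valid counterfactual

# Toy stand-in for a black-box time series classifier:
# class 1 if the mean of the series exceeds 0.5, else class 0.
classify = lambda t: int(t.mean() > 0.5)

X = np.array([[0.0, 0.1, 0.2, 0.1],   # class 0
              [0.9, 1.0, 0.8, 0.9]])  # class 1
y = np.array([classify(x) for x in X])

query = np.array([0.1, 0.2, 0.1, 0.0])        # predicted class c = 0
c = classify(query)
nun = nearest_unlike_neighbor(query, X, y, c)
cf = counterfactual(query, nun, classify, c)  # flips the prediction to c' = 1
```

Here `cf` changes only a short prefix of the query, leaving the rest untouched, which mirrors the sparsity and proximity properties the paper evaluates.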
