Paper Title


Toward Interpretable Sleep Stage Classification Using Cross-Modal Transformers

Paper Authors

Jathurshan Pradeepkumar, Mithunjha Anandakumar, Vinith Kugathasan, Dhinesh Suntharalingham, Simon L. Kappel, Anjula C. De Silva, Chamira U. S. Edussooriya

Paper Abstract


Accurate sleep stage classification is essential for sleep health assessment. In recent years, several machine-learning based sleep staging algorithms have been developed, and in particular, deep-learning based algorithms have achieved performance on par with human annotation. Despite this improved performance, a limitation of most deep-learning based algorithms is their black-box behavior, which has limited their use in clinical settings. Here, we propose a cross-modal transformer, a transformer-based method for sleep stage classification. The proposed cross-modal transformer consists of a novel cross-modal transformer encoder architecture along with a multi-scale one-dimensional convolutional neural network for automatic representation learning. Our method outperforms state-of-the-art methods and eliminates the black-box behavior of deep-learning models by leveraging the interpretability of the attention modules. Furthermore, our method provides considerable reductions in the number of parameters and training time compared to state-of-the-art methods. Our code is available at https://github.com/Jathurshan0330/Cross-Modal-Transformer. A demo of our work can be found at https://bit.ly/Cross_modal_transformer_demo.
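The interpretability claim rests on the attention weights of the cross-modal encoder: queries from one modality (e.g. EEG epoch features) attend over keys/values from another (e.g. EOG), so the attention map shows which parts of the other signal influenced the prediction. As a rough, self-contained illustration of this mechanism (a minimal single-head sketch, not the authors' implementation; the projection matrices, dimensions, and variable names here are placeholders):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(q_feats, kv_feats, d_k=16, seed=0):
    """Single-head scaled dot-product attention where queries come from one
    modality and keys/values from another. Returns fused features and the
    attention map, which can be inspected for interpretability."""
    rng = np.random.default_rng(seed)
    d_in = q_feats.shape[-1]
    # Random projections stand in for learned weights in this sketch.
    Wq = rng.standard_normal((d_in, d_k)) / np.sqrt(d_in)
    Wk = rng.standard_normal((d_in, d_k)) / np.sqrt(d_in)
    Wv = rng.standard_normal((d_in, d_k)) / np.sqrt(d_in)
    Q, K, V = q_feats @ Wq, kv_feats @ Wk, kv_feats @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))  # (T_q, T_kv); each row sums to 1
    return attn @ V, attn

# Toy epoch: 10 EEG feature tokens attending over 10 EOG tokens, dim 32.
eeg = np.random.default_rng(1).standard_normal((10, 32))
eog = np.random.default_rng(2).standard_normal((10, 32))
fused, attn = cross_modal_attention(eeg, eog)
print(fused.shape, attn.shape)
```

Each row of `attn` is a distribution over the other modality's tokens; visualizing these rows per epoch is what lets a clinician see which signal segments drove a staging decision.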
