Paper Title

EEG-Based Epileptic Seizure Prediction Using Temporal Multi-Channel Transformers

Authors

Godoy, Ricardo V., Reis, Tharik J. S., Polegato, Paulo H., Lahr, Gustavo J. G., Saute, Ricardo L., Nakano, Frederico N., Machado, Helio R., Sakamoto, Americo C., Becker, Marcelo, Caurin, Glauco A. P.

Abstract

Epilepsy is one of the most common neurological diseases, characterized by transient and unprovoked events called epileptic seizures. Electroencephalogram (EEG) is an auxiliary method used to perform both the diagnosis and the monitoring of epilepsy. Given the unexpected nature of an epileptic seizure, its prediction would improve patient care, optimizing the quality of life and the treatment of epilepsy. Predicting an epileptic seizure implies the identification of two distinct states of EEG in a patient with epilepsy: the preictal and the interictal. In this paper, we developed two deep learning models, called Temporal Multi-Channel Transformer (TMC-T) and Vision Transformer (TMC-ViT), that adapt Transformer-based architectures to multi-channel temporal signals. Moreover, we assessed the impact of choosing different preictal durations, since this length is not a consensus among experts, and also evaluated how the sample size benefits each model. Our models are compared with fully connected, convolutional, and recurrent networks. The algorithms were trained in a patient-specific manner and evaluated on raw EEG signals from the CHB-MIT database. Experimental results and statistical validation demonstrated that our TMC-ViT model surpassed the CNN architecture, the state of the art in seizure prediction.
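The seizure-prediction task described above reduces to a binary classification of fixed-length EEG windows as preictal or interictal. As a minimal sketch of the data-preparation step, the snippet below slices a multi-channel recording (channels × samples) into labeled windows; the 2 s window length and the labeling scheme are illustrative assumptions, not the paper's exact configuration, though the 256 Hz sampling rate does match the CHB-MIT database.

```python
import numpy as np

def make_windows(eeg, fs, win_sec, preictal):
    """Slice a multi-channel EEG array (channels x samples) into
    non-overlapping fixed-length windows, labeled 1 (preictal) or
    0 (interictal). Window length is an illustrative choice."""
    win = int(fs * win_sec)
    n = eeg.shape[1] // win  # number of complete windows
    windows = [eeg[:, i * win:(i + 1) * win] for i in range(n)]
    labels = np.full(n, 1 if preictal else 0)
    return np.stack(windows), labels

# Example: 23 channels at 256 Hz (CHB-MIT sampling rate), 10 s of signal
fs = 256
eeg = np.random.randn(23, fs * 10)
X, y = make_windows(eeg, fs, win_sec=2.0, preictal=True)
print(X.shape)  # (5, 23, 512): five 2-second windows
```

Windows produced this way can then be fed to any of the compared classifiers (fully connected, CNN, RNN, or the Transformer variants), with the preictal duration controlling how far before seizure onset windows are labeled 1.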
