Paper Title
Analysis of Augmentations for Contrastive ECG Representation Learning
Paper Authors
Paper Abstract
This paper systematically investigates the effectiveness of various augmentations for contrastive self-supervised learning of electrocardiogram (ECG) signals and identifies the best parameters. The baseline of our proposed self-supervised framework consists of two main parts: contrastive learning and the downstream task. In the first stage, we train an encoder using a number of augmentations to extract generalizable ECG signal representations. We then freeze the encoder and fine-tune a few linear layers with different amounts of labelled data for downstream arrhythmia detection. Next, we experiment with various augmentation techniques and explore a range of parameters. Our experiments are done on PTB-XL, a large and publicly available 12-lead ECG dataset. The results show that applying augmentations within a specific range of complexities works better for self-supervised contrastive learning. For instance, when adding Gaussian noise, a sigma in the range of 0.1 to 0.2 achieves better results, while poor training occurs when the added noise is too small or too large (outside of the specified range). A similar trend is observed with other augmentations, demonstrating the importance of selecting the optimum level of difficulty: augmentations that are too simple do not result in effective training, while augmentations that are too difficult prevent the model from effectively learning generalized representations. Our work can influence future research on self-supervised contrastive learning on bio-signals and aid in selecting optimum parameters for different augmentations.
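As a concrete illustration of the Gaussian-noise augmentation discussed in the abstract, below is a minimal sketch in Python/NumPy. It assumes ECG segments are stored as (leads, samples) arrays and that sigma is expressed relative to the signal scale; the function name, array shapes, and usage are illustrative assumptions, not code from the paper.

```python
import numpy as np

def add_gaussian_noise(ecg: np.ndarray, sigma: float = 0.15) -> np.ndarray:
    """Return a noisy view of a (leads, samples) ECG segment.

    The abstract reports that sigma values in roughly 0.1-0.2 give the best
    contrastive pre-training results, while much smaller or larger values
    lead to poor training.
    """
    noise = np.random.normal(loc=0.0, scale=sigma, size=ecg.shape)
    return ecg + noise

# Hypothetical usage: create two augmented "views" of the same 12-lead window,
# as would be done when forming positive pairs for a contrastive objective.
segment = np.random.randn(12, 1000)   # placeholder 12-lead ECG window
view_a = add_gaussian_noise(segment, sigma=0.1)
view_b = add_gaussian_noise(segment, sigma=0.2)
```

In a full pipeline, such augmented views would be fed to the encoder during contrastive pre-training, after which the frozen encoder's representations are used for the downstream arrhythmia-detection task described above.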