Paper Title

Mixing Up Contrastive Learning: Self-Supervised Representation Learning for Time Series

Authors

Kristoffer Wickstrøm, Michael Kampffmeyer, Karl Øyvind Mikalsen, Robert Jenssen

Abstract

The lack of labeled data is a key challenge for learning useful representations from time series. However, an unsupervised representation learning framework capable of producing high-quality representations could be of great value: it is key to enabling transfer learning, which is especially beneficial for medical applications, where data is abundant but labeling is costly and time-consuming. We propose an unsupervised contrastive learning framework motivated from the perspective of label smoothing. The proposed approach uses a novel contrastive loss that naturally exploits a data augmentation scheme in which new samples are generated by mixing two data samples with a mixing component. The task in the proposed framework is to predict the mixing component, which is utilized as soft targets in the loss function. Experiments demonstrate the framework's superior performance compared to other representation learning approaches on both univariate and multivariate time series and illustrate its benefits for transfer learning with clinical time series.
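The core idea described in the abstract can be sketched in a few lines: mix two time-series samples with a coefficient drawn from a Beta distribution, then use the mixing coefficient as a soft target for a two-way contrastive prediction. This is a minimal numpy sketch for intuition only; the similarity function (cosine), the temperature, and the two-candidate softmax are assumptions here, and the paper's actual loss is defined over full batches of encoded representations rather than raw samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixup(x1, x2, alpha=0.2):
    """Mix two time-series samples with a Beta-distributed coefficient.
    The mixing component lam becomes the soft target to predict."""
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam

def soft_target_contrastive_loss(h_mix, h1, h2, lam, tau=0.5):
    """Cross-entropy between the soft targets (lam, 1 - lam) and a
    softmax over similarities of the mixed representation to its two
    sources. A sketch of the idea, not the paper's exact batch loss."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(h_mix, h1), cos(h_mix, h2)]) / tau
    log_probs = logits - np.log(np.sum(np.exp(logits)))  # log-softmax
    return -(lam * log_probs[0] + (1.0 - lam) * log_probs[1])

# Toy usage: random vectors stand in for encoder outputs.
x1, x2 = rng.normal(size=128), rng.normal(size=128)
x_mix, lam = mixup(x1, x2)
loss = soft_target_contrastive_loss(x_mix, x1, x2, lam)
print(float(loss))
```

Because the targets are the soft pair (lam, 1 - lam) rather than a hard positive/negative split, the objective has a built-in label-smoothing effect, which is the perspective the abstract highlights.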
