Paper Title


Residual Attention Net for Superior Cross-Domain Time Sequence Modeling

Paper Authors

Seth H. Huang, Lingjie Xu, Congwei Jiang

Paper Abstract


We present a novel architecture, the residual attention net (RAN), which merges a sequence architecture (the universal transformer) and a computer vision architecture (the residual network) with a highway architecture for cross-domain sequence modeling. The architecture aims to address the long-term dependency issue often faced by recurrent-neural-network-based structures. This paper serves as a proof of concept for the new architecture, with RAN aiming to provide the model a higher-level understanding of sequence patterns. To the best of our knowledge, we are the first to propose such an architecture. On the 85 standard UCR data sets, we have achieved 35 state-of-the-art results, with a further 10 results matching the current state of the art, without additional model fine-tuning. The results indicate that this architecture is promising for complex, long-sequence modeling and may have vast cross-domain applications.
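The abstract does not give implementation details, but the combination it describes (transformer-style self-attention, residual skip connections, and a highway gate) can be illustrated at a high level. The PyTorch sketch below is a hypothetical, minimal block in that spirit; the class name `RANBlock`, the layer sizes, and the gating arrangement are assumptions for illustration and are not the authors' implementation.

```python
import torch
import torch.nn as nn


class RANBlock(nn.Module):
    """Hypothetical block combining self-attention (transformer-style),
    residual connections (ResNet-style), and a highway gate.
    Illustrative sketch only; not the paper's actual architecture."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.ReLU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm2 = nn.LayerNorm(d_model)
        # Highway gate: learns how much of the transformed signal to keep
        # versus how much of the untransformed input to carry through.
        self.gate = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Self-attention with a residual (skip) connection.
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        # Feed-forward transform blended highway-style with the identity path.
        h = self.ff(x)
        g = torch.sigmoid(self.gate(x))
        return self.norm2(g * h + (1.0 - g) * x)


# Example: a batch of 8 sequences, length 128, feature dimension 64.
block = RANBlock(d_model=64)
out = block(torch.randn(8, 128, 64))
print(out.shape)  # torch.Size([8, 128, 64])
```

Stacking several such blocks would give the residual path a direct route for gradients across long sequences, which is the kind of long-term-dependency relief the abstract attributes to the design.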
