Title
A Formal Hierarchy of RNN Architectures
Authors
Abstract
We develop a formal hierarchy of the expressive capacity of RNN architectures. The hierarchy is based on two formal properties: space complexity, which measures the RNN's memory, and rational recurrence, defined as whether the recurrent update can be described by a weighted finite-state machine. We place several RNN variants within this hierarchy. For example, we prove the LSTM is not rational, which formally separates it from the related QRNN (Bradbury et al., 2016). We also show how these models' expressive capacity is expanded by stacking multiple layers or composing them with different pooling functions. Our results build on the theory of "saturated" RNNs (Merrill, 2019). While formally extending these findings to unsaturated RNNs is left to future work, we hypothesize that the practical learnable capacity of unsaturated RNNs obeys a similar hierarchy. Experimental findings from training unsaturated networks on formal languages support this conjecture.
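The notion of a "saturated" RNN referenced above (Merrill, 2019) can be illustrated with a minimal sketch: saturation takes the limit of a network's activations as its weights are scaled toward infinity, so a soft sigmoid gate becomes a hard 0/1 step function. The function names below are illustrative, not from the paper.

```python
import math

def sigmoid(x):
    """Standard soft gate used in LSTM/GRU-style recurrences."""
    return 1.0 / (1.0 + math.exp(-x))

def saturated_gate(x):
    """Limit of sigmoid(N * x) as the weight scale N -> infinity:
    the gate saturates to a discrete step function."""
    if x > 0:
        return 1.0
    if x < 0:
        return 0.0
    return 0.5

# As the weight scale N grows, the soft gate approaches the saturated one.
pre_activation = 0.3
for N in (1, 10, 1000):
    print(f"N={N:5d}: sigmoid(N*x) = {sigmoid(N * pre_activation):.4f}")
print(f"saturated:              {saturated_gate(pre_activation):.4f}")
```

Analyzing the saturated network replaces continuous gate values with discrete ones, which is what makes formal statements (e.g., about rational recurrence or space complexity) tractable.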