Paper Title

Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

Authors

Yibo Jiang, Cengiz Pehlevan

Abstract

Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterative maps, when the trained input-output Jacobian of the network has all of its eigenvalue norms strictly below one. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit for both training with a single example and multiple examples under certain conditions. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.
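The abstract's central mechanism can be illustrated numerically: train an overparameterized one-hidden-layer sigmoid autoencoder to reconstruct a stored example, check that the spectral radius of the input-output Jacobian at that example is below one, and then iterate the trained map from a corrupted input to recover the stored pattern. The following is a minimal sketch under assumptions not taken from the paper (toy dimensions, plain gradient descent, a single training example); it is an illustration of the iterated-map idea, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy setup (hypothetical sizes): one stored pattern, hidden width h >> input dim d
d, h = 4, 200
x_star = rng.uniform(0.2, 0.8, size=d)  # stored example in (0, 1)^d

W1 = rng.normal(0.0, 1.0 / np.sqrt(d), (h, d))
b1 = np.zeros(h)
W2 = rng.normal(0.0, 1.0 / np.sqrt(h), (d, h))
b2 = np.zeros(d)

def forward(x):
    a = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ a + b2)

# Train with plain gradient descent on squared reconstruction error
lr = 0.2
for _ in range(3000):
    a = sigmoid(W1 @ x_star + b1)
    y = sigmoid(W2 @ a + b2)
    dy = (y - x_star) * y * (1.0 - y)        # backprop through output sigmoid
    dW2, db2 = np.outer(dy, a), dy
    da = (W2.T @ dy) * a * (1.0 - a)         # backprop through hidden sigmoid
    dW1, db1 = np.outer(da, x_star), da
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Input-output Jacobian of the trained map at the stored point
a = sigmoid(W1 @ x_star + b1)
y = forward(x_star)
J = np.diag(y * (1.0 - y)) @ W2 @ np.diag(a * (1.0 - a)) @ W1
rho = max(abs(np.linalg.eigvals(J)))         # spectral radius; attractor if < 1

# Iterate the map from a corrupted input; it should fall back toward x_star
x = x_star + 0.2 * rng.standard_normal(d)
for _ in range(100):
    x = forward(x)

print(f"spectral radius of Jacobian: {rho:.3f}")
print(f"recovery error after iteration: {np.linalg.norm(x - x_star):.2e}")
```

When the largest Jacobian eigenvalue norm stays strictly below one, as the abstract describes, the stored example is a fixed-point attractor of the iterated map, and repeated application of the network denoises the corrupted input back to the memorized pattern.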
