Paper Title
Developing Constrained Neural Units Over Time
Paper Authors
Paper Abstract
In this paper we present a foundational study on a constrained method that defines learning problems with Neural Networks in the context of the principle of least cognitive action, which closely resembles the principle of least action in mechanics. Starting from a general approach for enforcing constraints on the dynamical laws of learning, this work focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches. In particular, the structure of the neural architecture is defined by means of a special class of constraints, which are also extended to the interaction with data, leading to "architectural" and "input-related" constraints, respectively. The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner; this makes the study an important step toward alternative ways of processing continuous streams of data with Neural Networks. The connection with the classic Backpropagation-based update rule for the network weights is discussed, showing that there are conditions under which our approach degenerates to Backpropagation. Moreover, the theory is experimentally evaluated on a simple problem that allows us to study several aspects of the theory in depth and to show the soundness of the model.
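To make the variational framing concrete, here is a minimal illustrative sketch written by analogy with the mechanical action; the functional \mathcal{A}, the Lagrangian L, the multipliers \lambda_i, and the exact constraint forms below are assumptions for illustration, not the paper's own formulation. Learning is posed as finding stationary trajectories of the weights w(t) of a cognitive action functional,

\[
  \mathcal{A}[w] = \int_{0}^{T} L\big(t, w(t), \dot{w}(t)\big)\, dt,
\]

subject to "architectural" constraints that encode the network structure at every hidden or output unit i, and "input-related" constraints that tie each input unit k to the data stream u(t):

\[
  x_i(t) = \sigma\Big(\sum_{j} w_{ij}(t)\, x_j(t)\Big), \qquad x_k(t) = u_k(t).
\]

Adjoining the constraints with Lagrange multipliers \lambda_i(t) gives the augmented functional

\[
  \mathcal{A}^{\star} = \int_{0}^{T} \Big[\, L + \sum_{i} \lambda_i(t)\, \Big( x_i(t) - \sigma\Big(\sum_{j} w_{ij}(t)\, x_j(t)\Big) \Big) \Big]\, dt,
\]

whose stationarity conditions are Euler-Lagrange equations in w, x, and \lambda, i.e. differential update laws for the weights which, under suitable conditions, can reduce to gradient-based rules such as Backpropagation.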