Paper title
An unfeasibility view of neural network learning
Paper authors
Paper abstract
We define the notion of a continuously differentiable perfect learning algorithm for multilayer neural network architectures and show that such algorithms do not exist, provided that the length of the data set exceeds the number of involved parameters and the activation functions are the logistic function, tanh, or sin.
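To make the statement concrete, the following is a schematic formalization; the notation (Phi_theta for the network function with parameter vector theta, p for the number of parameters, N for the length of the data set, n for the input dimension) is assumed for this sketch and is not fixed by the abstract itself. A perfect learning algorithm is read here as a map A sending each data set D to a parameter vector whose network interpolates D exactly, and the stated result is that no continuously differentiable such map exists once N exceeds p:

\[
  \nexists\, A \in C^{1}\bigl((\mathbb{R}^{n} \times \mathbb{R})^{N},\, \mathbb{R}^{p}\bigr)
  \quad \text{such that} \quad
  \Phi_{A(D)}(x_{i}) = y_{i}
  \quad \text{for all } D = \bigl((x_{1}, y_{1}), \dots, (x_{N}, y_{N})\bigr),\ 1 \le i \le N,
\]

whenever N > p and the activation functions are the logistic function, tanh, or sin.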