Paper Title

Growing dendrites enhance a neuron's computational power and memory capacity

Authors

Levy, William B.; Baxter, Robert A.

Abstract

Neocortical pyramidal neurons have many dendrites, and such dendrites are capable, each in isolation from the others, of generating a neuronal spike. It is also now understood that there is a large amount of dendritic growth during the first years of a human's life, arguably a period of prodigious learning. These observations inspire the construction of a local, stochastic algorithm based on an earlier stochastic, Hebbian developmental theory. Here we investigate the neuro-computational advantages and limits of this novel algorithm, which combines dendritogenesis with supervised adaptive synaptogenesis. Neurons created with this algorithm have enhanced memory capacity, can avoid catastrophic interference (forgetting), and have the ability to unmix mixture distributions. In particular, individual dendrites develop within each class, in an unsupervised manner, to become feature-clusters that correspond to the mixing elements of the class-conditional mixture distribution. Although discriminative problems are used to understand the capabilities of the stochastic algorithm and the neuronal connectivity it produces, the algorithm is in the generative class; it thus seems ideal for decisions that require generalization, i.e., extrapolation beyond previous learning.
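The abstract does not specify the algorithm's details, so the following is only a toy sketch, under assumed parameters, of the general idea it describes: a local, stochastic, Hebbian synaptogenesis rule acting while target-class patterns are presented (the supervision), under which individual dendrites come to specialize on the separate mixing elements of a within-class mixture distribution. The input encoding, cluster definitions, threshold `theta`, and growth probability `p_add` are all illustrative assumptions, not the authors' method.

```python
import random

random.seed(0)

# Toy within-class mixture: target-class patterns are drawn from one of two
# "mixing elements", each activating a disjoint block of binary input lines.
# (These cluster definitions are illustrative assumptions, not from the paper.)
N_INPUTS = 20
CLUSTERS = [set(range(0, 8)), set(range(10, 18))]


def sample_pattern():
    """Draw a target-class pattern: pick a mixing element at random and
    activate ~90% of its input lines."""
    k = random.randrange(len(CLUSTERS))
    return {i for i in CLUSTERS[k] if random.random() < 0.9}


# Each dendrite starts from a single seed synapse (hypothetical choice:
# one seed line drawn from each cluster's block).
dendrites = [{0}, {10}]


def grow(steps=2000, p_add=0.2, theta=0.5):
    """Supervised, Hebbian, local synaptogenesis: growth occurs only while
    target-class patterns are presented (the supervised part), and a dendrite
    recruits a new input line only when the current pattern strongly
    co-activates its existing synapses (the Hebbian, local part)."""
    for _ in range(steps):
        active = sample_pattern()
        for d in dendrites:
            # Fraction of this dendrite's synapses driven by the pattern.
            if len(d & active) / len(d) >= theta and random.random() < p_add:
                candidates = list(active - d)
                if candidates:
                    d.add(random.choice(candidates))


grow()
# Each dendrite's synapses remain confined to one mixing element, so the
# dendrites "unmix" the class-conditional mixture without per-cluster labels.
print([sorted(d) for d in dendrites])
```

Because a dendrite seeded inside one cluster never reaches the co-activation threshold on patterns from the other cluster, its synapses stay confined to its own mixing element; this is the sense in which unsupervised within-class specialization falls out of a purely local rule.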
