Author Summary | Such a basic geometric requirement, which was explicitly recognized in Donald Hebb’s original formulation of synaptic plasticity, is not usually accounted for in neural network learning rules.
BIG Learning in Small-World Graphs: Ability to Differentiate Real from Spurious Associations | Hebbian models form both associations, relying on later experience to reinforce those that reoccur and to eliminate the others [12].
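The reinforce-or-eliminate dynamic described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the weight gain, decay factor, and pruning threshold are assumptions chosen only to show how reoccurring associations survive while spurious ones fade.

```python
def update_associations(weights, coactive_pairs, gain=1.0, decay=0.5, prune_below=0.25):
    """Illustrative sketch: reinforce associations that reoccur, let the rest decay.

    weights: dict mapping (pre, post) pairs to association strength (mutated in place)
    coactive_pairs: set of pairs whose neurons fired together in this experience
    gain, decay, prune_below: hypothetical parameters, not taken from the paper
    """
    for pair in list(weights):
        if pair in coactive_pairs:
            weights[pair] += gain        # association reoccurred: reinforce it
        else:
            weights[pair] *= decay       # no reoccurrence: weaken it
            if weights[pair] < prune_below:
                del weights[pair]        # spurious association is eliminated
    return weights
```

For example, starting from two equally strong associations, repeated experiences that reactivate only one of them strengthen that one and eventually prune the other.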
Discussion | Such a basic geometric requirement was explicitly recognized in Hebb’s original formulation of synaptic plasticity, yet it is not usually accounted for in neural network learning rules.
Introduction | In order to establish a synapse, according to Hebbian theory, the axon and dendrites of the two co-activated neurons must be juxtaposed [7]. |
Neural Network Model and the BIG ADO Learning Rule | Activity-dependent plasticity is traditionally framed in terms of the Hebbian rule: “When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased” [7].
Neural Network Model and the BIG ADO Learning Rule | Many variants of Hebbian synaptic modification exist [12], often summarized as ‘neurons that fire together wire together’. |
Neural Network Model and the BIG ADO Learning Rule | This popular quip, however, misses the essential requirement, clearly stressed in Hebb’s original formulation, that the axon of the pre-synaptic neuron must be sufficiently close to its post-synaptic target for plasticity to take place. |
Robustness Analysis and Optimal Conditions | This is the key parameter distinguishing BIG ADO from traditional Hebbian learning: a new synapse is formed between two neurons only when they fire together and a potential synapse is already present.
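The distinction stated above can be made concrete with a short sketch. The data structures and names below are illustrative assumptions, not the paper's code: the point is only that, unlike a plain Hebbian rule, BIG ADO gates synapse formation on a pre-existing "potential synapse" (an axon-dendrite apposition) between the co-firing pair.

```python
def big_ado_step(active, potential, actual):
    """Illustrative BIG ADO-style update (hypothetical interface).

    active: set of neuron ids that fired together in this event
    potential: set of directed (pre, post) pairs with an axon-dendrite apposition
    actual: set of realized (pre, post) synapses (mutated in place)
    """
    for pre in active:
        for post in active:
            if pre == post:
                continue
            # Co-firing alone is not enough: a potential synapse must exist.
            if (pre, post) in potential:
                actual.add((pre, post))
    return actual
```

A plain Hebbian rule would add a synapse for every co-firing pair; here, co-firing pairs without a potential synapse are simply ignored, which is what the geometric requirement enforces.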