Index of papers in March 2015 that mention
  • synaptic weights
Maxim Volgushev, Vladimir Ilin, Ian H. Stevenson
Abstract
In a fully-defined input paradigm, we then control the synaptic weights and timing of many simulated presynaptic neurons.
Author Summary
Synapses play a central role in neural information processing — weighting individual inputs in different ways allows neurons to perform a range of computations, and the changing of synaptic weights over time allows learning and recovery from injury.
Discussion
4) Detectability of changes in synaptic weights follows the same rules and has the same limitations as the detection of individual synaptic connections.
Introduction
Tools for identifying synaptic weights and tracking their changes, thus, play a key role in understanding neural information processing.
Introduction
Traditionally, synaptic integration and plasticity are studied using intracellular recordings in vitro, where synaptic weights can be directly measured as the amplitude of postsynaptic potentials or currents.
Introduction
We ask how well synaptic inputs of different amplitudes can be detected, how much data is necessary to reconstruct the amplitudes of excitatory and inhibitory synaptic inputs, and how precisely synaptic weights can be estimated from spikes alone.
As in the previous analysis, we use the likelihood ratio to determine whether the synaptic weight has changed.
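A minimal sketch of such a likelihood-ratio comparison (hypothetical helper names, a simplified Bernoulli response model, and a known candidate change point; not the authors' model or code): fit one response probability to the whole recording and separate probabilities before and after the candidate change, then compare the log-likelihoods.

import numpy as np
from scipy.stats import chi2

def bernoulli_ll(responses, p):
    # Log-likelihood of 0/1 responses under a single response probability p.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

def weight_change_lr_test(responses, change_idx):
    # responses: 0/1 postsynaptic spiking in a short window after each presynaptic spike.
    # H0: one response probability for the whole recording.
    # H1: different probabilities before and after the candidate change point.
    full = bernoulli_ll(responses, responses.mean())
    before, after = responses[:change_idx], responses[change_idx:]
    split = bernoulli_ll(before, before.mean()) + bernoulli_ll(after, after.mean())
    lr = 2 * (split - full)              # likelihood-ratio statistic
    return lr, chi2.sf(lr, df=1)         # p-value; one extra parameter under H1

# Example: a simulated doubling of the response probability halfway through.
rng = np.random.default_rng(0)
responses = np.concatenate([rng.random(500) < 0.1, rng.random(500) < 0.2]).astype(float)
print(weight_change_lr_test(responses, 500))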
Detectability of synaptic weight changes in long recordings is comparable to the detectability of connections of constant strength.
Fully-defined input experiments
In the fully-defined input setting, we can examine, in a single cell, how accurately model estimates of synaptic weights (of different amplitude and sign) capture the actual values.
Fully-defined input experiments
The increased accuracy in estimating synaptic weights using models of all inputs suggests that postsynaptic spikes might be more readily associated with or disassociated from the spiking of individual presynaptic inputs.
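As a rough illustration of what model-based weight estimation from spikes alone can look like (a sketch assuming a simple logistic spiking model fit with scikit-learn, with hypothetical names and simulated data; not the specific model used in the paper), the postsynaptic spike train is regressed on the lagged spike trains of all presynaptic inputs, and the fitted coefficients are read out as estimated weights.

import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_weights(pre_spikes, post_spikes, lag=1):
    # pre_spikes: (n_inputs, n_bins) binary presynaptic spike trains.
    # post_spikes: (n_bins,) binary postsynaptic spike train.
    # Regress postsynaptic spiking on presynaptic activity 'lag' bins earlier
    # (ridge-regularized by scikit-learn's default penalty).
    X = pre_spikes[:, :-lag].T
    y = post_spikes[lag:]
    return LogisticRegression().fit(X, y).coef_.ravel()

# Example: two excitatory and one inhibitory simulated input.
rng = np.random.default_rng(1)
pre = (rng.random((3, 20000)) < 0.05).astype(float)
true_w = np.array([2.0, 0.5, -1.5])
drive = true_w @ pre - 3.0                           # summed input driving the next bin
post = np.zeros(20000)
post[1:] = rng.random(19999) < 1.0 / (1.0 + np.exp(-drive[:-1]))
print(estimate_weights(pre, post, lag=1))            # roughly recovers true_w

Fitting all inputs jointly, as in the excerpt above, lets the model attribute each postsynaptic spike to the inputs most likely to have caused it, rather than crediting it to whichever input happened to fire nearby.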
"synaptic weights" is mentioned in 19 sentences in this paper.
Jaldert O. Rombouts, Sander M. Bohte, Pieter R. Roelfsema
Comparison to previous modeling approaches
It thereby alleviates a limitation of many previous biologically plausible RL models, which can only train a single layer of modifiable synaptic weights and solve linear tasks [16,21,44,67,70,71,73,76] and binary decisions [21,44,67,70].
Probabilistic decision making task
Importantly, the synaptic weights from input neurons to memory cells depended on the true weights of the symbols after learning (Fig.
Results
As in other SARSA methods, the updating of synaptic weights is only performed for the transitions that the network actually experiences.
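A minimal sketch of this on-policy property (generic SARSA with linear weights and hypothetical names such as sarsa_update, beta, and gamma; not the AuGMEnT implementation): only the weights feeding the action value of the transition that was actually experienced receive the temporal-difference update.

import numpy as np

def sarsa_update(w, phi, a, r, phi_next, a_next, beta=0.1, gamma=0.9):
    # w: (n_actions, n_features) synaptic weights onto action-value units.
    # phi, phi_next: feature vectors of the current and next state.
    # a, a_next: actions actually taken (on-policy); r: reward received.
    delta = r + gamma * w[a_next] @ phi_next - w[a] @ phi   # TD error of the experienced transition
    w[a] += beta * delta * phi                              # only the visited action's weights change
    return w, delta

Weights onto actions that were not taken on this transition are left untouched, which is exactly the property the excerpt refers to.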
Results
We will first establish the equivalence of the online gradient descent defined in Equation (19) and the AuGMEnT learning rule for the synaptic weights from the regular units onto the Q-value units: the weight w_ja onto the action a selected at time step t-1 should change according to the learning rule, leaving the other weights w_jk (k ≠ a) unchanged.
Results
We conclude that AuGMEnT causes an online gradient descent on all synaptic weights to minimize the temporal difference error if α = 1.
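A one-line check of this equivalence in the generic linear case (a sketch, not the paper's full derivation with memory units and synaptic tags): with Q(s,a) = w_a · φ(s) and TD error δ = r + γ·Q(s',a') − Q(s,a), treating the bootstrapped target r + γ·Q(s',a') as a constant gives

    −½ ∂(δ²)/∂w_a = δ · ∂Q(s,a)/∂w_a = δ · φ(s),

so the update Δw_a = β·δ·φ(s) used in the SARSA sketch above is an online gradient-descent step on the squared temporal-difference error.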
"synaptic weights" is mentioned in 5 sentences in this paper.