Index of papers in April 2015 that mention
  • mutual information
Ross S. Williamson, Maneesh Sahani, Jonathan W. Pillow
D_KL(p(x | r = 0) || p(x)) is the information (per spike) carried by silences, and
Thus, for example, if 20% of the bins in a binary spike train contain a spike, the standard MID estimator will necessarily neglect at least 10% of the total mutual information.
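Both statements above follow from the standard two-term decomposition of the mutual information for a binary response; a minimal sketch, where x denotes the stimulus and r ∈ {0, 1} the binned spike count (this notation is my assumption, not the paper's):

```latex
I(x;\, r) \;=\; P(r{=}1)\, D_{\mathrm{KL}}\!\left(p(x \mid r{=}1) \,\middle\|\, p(x)\right)
\;+\; P(r{=}0)\, D_{\mathrm{KL}}\!\left(p(x \mid r{=}0) \,\middle\|\, p(x)\right)
```

An estimator built only from the spike-triggered ensemble targets the first term, so the silence term sets a floor on how much of the total it can neglect, consistent with the 20%/10% example above.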
Distributional assumptions implicit in MID
MID does not maximize the mutual information between the projected stimulus and the spike response when the distribution of spikes conditioned on stimuli is not Poisson; it is an inconsistent estimator for the relevant stimulus subspace in such cases.
Generalizations
For both models, we obtained an equivalent relationship between log-likelihood and an estimate of mutual information between stimulus and response.
Methods
In the limit of small p (i.e., the Poisson limit), the mutual information is dominated by the divergence between Q0 and Q1.
Models with Bernoulli spiking
To show this, we derive the mutual information between the stimulus and a Bernoulli-distributed spike count, and show that this quantity is closely related to the log-likelihood under a linear-nonlinear-Bernoulli encoding model.
Models with Bernoulli spiking
The mutual information between the projected stimulus
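The equation following this fragment did not survive extraction; a sketch of the standard identity it introduces, writing x = K^T s for the projected stimulus (the symbol f for the model nonlinearity is mine):

```latex
I(x;\, r) \;=\; H(r) - H(r \mid x),
\qquad
\max_{f}\; \mathbb{E}\!\left[\log p_f(r \mid x)\right] \;=\; -\,H(r \mid x).
```

Since H(r) does not depend on the projection, maximizing the expected Bernoulli log-likelihood over the projection and a flexible nonlinearity selects the same subspace as maximizing I(x; r).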
Models with arbitrary spike count distributions
As we will see, maximizing the mutual information based on histogram estimators is once again equivalent to maximizing the likelihood of an LN model with piecewise-constant mappings from the linear stimulus projection to count probabilities.
Minimum information loss for binary spiking
If the binned spike counts r_t measured in response to stimuli s_t are not Poisson distributed, the projection matrix K which maximizes the mutual information between K^T s and r can be found as follows.
Minimum information loss for binary spiking
To ease comparison with the single-spike information, which is measured in bits per spike, we normalize the mutual information by the mean spike count to obtain the normalized information carried by the j-spike responses.
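The formula itself was lost in extraction; a plausible reconstruction consistent with the sentence and with the decomposition quoted earlier (the symbol choices are mine):

```latex
\tilde{I}_j \;=\; \frac{P(r{=}j)\; D_{\mathrm{KL}}\!\left(p(x \mid r{=}j) \,\middle\|\, p(x)\right)}{\langle r \rangle}
```

Summing over j recovers the total mutual information per spike, in the same bits-per-spike units as the single-spike information.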
Minimum information loss for binary spiking
We can estimate the mutual information from data using a histogram-based plug-in estimator:
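The estimator's formula was also lost; a minimal runnable sketch of a generic histogram-based plug-in estimate for a 1-D projected stimulus and integer spike counts (all names here are illustrative, not the paper's code):

```python
import numpy as np

def plugin_mutual_info(proj_stim, counts, n_bins=25):
    """Histogram plug-in estimate of I(x; r) in bits, where
    x = k^T s is the projected stimulus and r the spike count."""
    # Discretize the projected stimulus into equal-width bins.
    edges = np.histogram_bin_edges(proj_stim, bins=n_bins)
    x_bins = np.digitize(proj_stim, edges)          # values in 0..n_bins+1
    # Empirical joint distribution over (stimulus bin, spike count).
    joint = np.zeros((n_bins + 2, counts.max() + 1))
    for xb, r in zip(x_bins, counts):
        joint[xb, r] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)           # stimulus-bin marginal
    pr = joint.sum(axis=0, keepdims=True)           # spike-count marginal
    mask = joint > 0                                # skip log(0) terms
    return np.sum(joint[mask] * np.log2(joint[mask] / (px @ pr)[mask]))

# Example: Bernoulli spikes driven by a 1-D projection of Gaussian stimuli.
rng = np.random.default_rng(0)
s = rng.standard_normal(10_000)
r = (rng.random(10_000) < 1 / (1 + np.exp(-2 * s))).astype(int)
print(plugin_mutual_info(s, r))
```

Dividing the result by counts.mean() gives the per-spike normalization discussed above; note that plug-in estimates of this kind are biased upward for small samples.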
mutual information is mentioned in 14 sentences in this paper.
Topics mentioned in this paper:
William F. Flynn, Max W. Chang, Zhiqiang Tan, Glenn Oliveira, Jinyun Yuan, Jason F. Okulicz, Bruce E. Torbett, Ronald M. Levy
Abstract
To examine covariation of mutations between two different sites using deep sequencing data, we developed an approach to estimate tight bounds on the two-site bivariate probabilities in each viral sample, and to estimate the mutual information between pairs of positions based on these bounds.
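The paper's bounding procedure uses read-level information and is more involved, but the idea can be illustrated with the classical marginal-only (Fréchet–Hoeffding) bounds on a 2x2 table of binary mutation indicators; a hedged sketch (function and variable names are mine):

```python
import numpy as np

def mi_range(p_i, p_j, n_grid=1001):
    """Scan MI (bits) over every 2x2 joint table consistent with
    marginal mutation frequencies p_i, p_j at two sites."""
    # Frechet-Hoeffding bounds on the double-mutant probability.
    lo, hi = max(0.0, p_i + p_j - 1.0), min(p_i, p_j)
    mis = []
    for p11 in np.linspace(lo, hi, n_grid):
        # The two marginals pin down the remaining three cells.
        table = np.array([[1 - p_i - p_j + p11, p_j - p11],
                          [p_i - p11,           p11      ]])
        px = table.sum(axis=1, keepdims=True)
        py = table.sum(axis=0, keepdims=True)
        mask = table > 0
        mis.append(np.sum(table[mask] * np.log2(table[mask] / (px @ py)[mask])))
    return min(mis), max(mis)

# Min MI is ~0 because the independent table is always feasible;
# the max shows how much covariation the marginals alone permit.
print(mi_range(0.30, 0.25))
```

Tighter sample-specific bounds, as in the paper, shrink the feasible interval for p11 and hence the range of admissible MI values.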
Correlation analysis using bound estimates in protease captures known pair correlations
Shown is a plot of the precision (also known as positive predictive value, PPV) for the top 40% of correlated PR-PR pairs ranked by mutual information.
Correlation analysis using bound estimates in protease captures known pair correlations
True positives were determined through a mutual information calculation similar to the calculations in [3].
Covariation of mutations in Gag-protease proteins
In the second step, we use these joint probabilities to assess the correlation between two positions in the viral genome using mutual information (MI).
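Concretely, with p_ij the bivariate marginal for positions i and j and p_i, p_j the single-site marginals over residues a and b (the indexing convention here is an assumption):

```latex
MI_{ij} \;=\; \sum_{a,\, b}\, p_{ij}(a, b)\, \log \frac{p_{ij}(a, b)}{p_i(a)\, p_j(b)}
```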
Covariation of mutations in Gag-protease proteins
We quantify this deviation by the mutual information (MI); pairs with the largest mutual information have the strongest covariation.
Discussion
The strongest 50 correlations as measured by mutual information (MI) from the following regions are shown: Gag-Gag (blue), Gag-PR (red).
Discussion
There exist several methods, some based on mutual information, which have been developed to extract direct structural contacts (typically <8Å) from multiple sequence alignments [37,59–63]; it is possible to adapt these methods to detect direct structural propensities using covariation extracted from deep sequencing.
Introduction
This then allowed us to use mutual information (MI) to calculate correlations between pairs of positions in gag.
Strongest correlations in Gag indicate functional and structural patterns
While Gag is not the primary target of protease inhibitors, we observe that the correlations within Gag proteins, as measured by mutual information, are of similar magnitude to those in protease (Tables 2, 3, 4).
Strongest correlations in Gag indicate functional and structural patterns
We find that the pair of positions with the largest mutual information identified in this study, Gag M228-G248, is within 6Å in all CA structures.
Validation of bivariate marginal estimation
The mutual information (MI) computed for each pair using the bounding procedure is in good agreement with the MI computed using the known bivariate marginal probabilities, as shown in Fig 4.
mutual information is mentioned in 18 sentences in this paper.
Topics mentioned in this paper:
Naoki Hiratani, Tomoki Fukai
Evaluation of the performance
Mutual information.
Evaluation of the performance
Therefore, mutual information can be defined as
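The equation itself was lost in extraction; the generic definition, with X and Y as placeholder variables for the paper's input signal and output spike response (the exact notation is unknown):

```latex
I(X; Y) \;=\; \sum_{x,\, y} P(x, y)\, \log_2 \frac{P(x, y)}{P(x)\, P(y)} \;=\; H(Y) - H(Y \mid X)
```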
Lateral inhibition enhances minor source detection by STDP
The same argument holds if mutual information is used for performance evaluation (green lines in Fig 2D and 2E).
Model
Both cross-correlation and mutual information behave as they do in the Poisson model, but the performance is slightly better, possibly because the dynamics are deterministic (Fig 1D and 1E, S1B and S1C Fig); however, membrane potentials show different responses for correlation events (S1D Fig) because output neurons are constantly in high-conductance states, so that correlation events immediately cause spikes.
Supporting Information
(B) Cross-correlation and mutual information calculated for various delays.
mutual information is mentioned in 5 sentences in this paper.
Topics mentioned in this paper: