Bayesian Spatial Point Process Classification Model | The BSPP model estimates the posterior distribution for reported foci across studies. |
Emotional Signatures Across Networks and Regions of Interest | In addition, the matrix M contains samples from the joint posterior distribution of regional intensity values that can be used for visualization and statistical inference. |
Emotional Signatures Across Networks and Regions of Interest | As the elements of M are samples from the joint posterior distribution of intensity values, statistical inference on a difference in intensity values depends on its statistical distance from the origin, which is assessed by examining the proportion P of the samples that lie on the opposite side of the origin from the posterior mean of the difference, adjusting for the fact that the mean
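The tail-proportion test described above can be sketched directly from posterior samples. This is a minimal illustration, not the authors' code: the samples here are synthetic stand-ins for one difference column of the sample matrix M.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior samples of a regional intensity difference
# (stand-ins for a column of the sample matrix M described in the text).
diff_samples = rng.normal(loc=0.8, scale=0.5, size=10_000)

# Proportion P of samples that lie on the opposite side of the origin
# from the posterior mean of the difference.
mean_diff = diff_samples.mean()
p = np.mean(np.sign(diff_samples) != np.sign(mean_diff))

# A small P indicates the difference is credibly far from the origin.
print(round(p, 3))
```

A small P plays a role analogous to a (Bayesian) two-sided tail probability for the difference.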
The Bayesian Spatial Point Process (BSPP) Model | To develop a model for emotion categories and test its accuracy in diagnosing the emotions being cultivated in specific studies, we constructed a generative, Bayesian Spatial Point Process (BSPP) model of the joint posterior distribution of peak activation locations over the brain for each emotion category (see Methods and [38]). |
The Bayesian Spatial Point Process (BSPP) Model | The MCMC procedure draws samples from the joint posterior distribution of the number and locations of peak activations in the brain given an emotion category. |
The Bayesian Spatial Point Process (BSPP) Model | The posterior distribution is summarized in part by the intensity function map, representing the spatial posterior expected number of activation or population centers in each area across the brain given the emotion category; this can be used to interpret the activation pattern characteristic of an emotion category (Fig.
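The intensity summary amounts to averaging region-wise activation-center counts over posterior draws. The sketch below assumes hypothetical counts rather than the actual BSPP sampler output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 4 brain regions, 2000 posterior draws; each draw
# records how many activation centers the model places in each region.
n_samples, n_regions = 2000, 4
counts = rng.poisson(lam=[0.2, 1.5, 0.7, 3.0], size=(n_samples, n_regions))

# The intensity map summarizes the posterior by the expected number of
# activation centers per region, estimated as the mean over draws.
intensity_map = counts.mean(axis=0)
print(intensity_map)
```

Regions with high expected counts are the ones interpreted as characteristic of the emotion category.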
Bayesian parameter estimation and model comparison | The BMC approach uses the table of model evidence values (subjects x models; see S2 Table) to estimate a posterior distribution over ri. |
Bayesian parameter estimation and model comparison | The mean of this posterior distribution, mp(i), is our best estimate of ri. |
Bayesian parameter estimation and model comparison | Given a table of model evidence values (see S2 Table), an algorithm [101,102] can be derived for computing a posterior distribution over ri, from which subsequent inferences can be made. |
Exerted force (% MVC) n | The resulting exceedance probability (xp) indicates the likelihood that each model is the most frequently occurring model in the population, and the mean of the posterior distribution (mp) provides an estimate of the frequency with which the model appears in the population. |
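In random-effects Bayesian model comparison, the posterior over model frequencies is Dirichlet, so mp and xp can be obtained from its parameters. The sketch below assumes an illustrative Dirichlet parameter vector alpha; in practice it would be fitted from the subjects-by-models evidence table:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Dirichlet posterior over model frequencies ri; these
# pseudo-counts are illustrative, not fitted from real evidence values.
alpha = np.array([12.0, 5.0, 3.0])

# Posterior mean frequency of each model (mp).
mp = alpha / alpha.sum()

# Exceedance probability (xp): the probability that each model is the
# most frequent one in the population, estimated by Monte Carlo.
draws = rng.dirichlet(alpha, size=100_000)
xp = np.bincount(draws.argmax(axis=1), minlength=len(alpha)) / len(draws)
print(mp, xp)
```

Note that mp is available in closed form, while xp is typically computed by sampling the Dirichlet posterior as above.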
Interface Identification and Clustering | We defined w_C = exp(a_C), used noninformative normal priors (with zero mean and large standard deviation) for the parameters a_C, and employed Gibbs sampling to obtain the posterior distributions of p_C, for which we report the average as well as (2.5%, 97.5%) confidence intervals. |
Interface Identification and Clustering | Again, we defined the parameters b_Ci so that the Poisson intensity is λ(t; k_on, c_i) = exp(b_Ci), where the b_Ci have non-informative normal priors, and sampled the posterior distribution of k_on using Gibbs sampling. |
Interface Identification and Clustering | For all estimates, 10^4 samples from the posterior distributions were obtained after a 5×10^3 burn-in phase using Markov chain Monte Carlo techniques [44]. |
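The workflow above (exponentiated parameter, vague normal prior, MCMC with burn-in, posterior mean and 2.5%/97.5% interval) can be sketched for a single Poisson rate. This is an assumption-laden stand-in: a random-walk Metropolis sampler replaces the Gibbs sampler, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: event counts assumed Poisson with rate exp(a),
# where a has a vague normal prior (zero mean, large std), as in the text.
counts = rng.poisson(2.0, size=50)

def log_post(a):
    # Poisson log-likelihood plus vague N(0, 10^2) log-prior (up to consts).
    return counts.sum() * a - counts.size * np.exp(a) - a**2 / (2 * 10.0**2)

burn_in, n_keep = 5_000, 10_000
a, chain = 0.0, []
for step in range(burn_in + n_keep):
    prop = a + rng.normal(0.0, 0.2)           # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop
    if step >= burn_in:
        chain.append(np.exp(a))                # posterior samples of the rate

chain = np.array(chain)
post_mean = chain.mean()
lo, hi = np.percentile(chain, [2.5, 97.5])
print(round(post_mean, 2), round(lo, 2), round(hi, 2))
```

The reported summary is then the chain mean together with the (2.5%, 97.5%) percentile interval.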
Position-Dependent Diffusion Coefficients | The diffusion coefficient D was calculated by sampling the rate matrix from the posterior distribution p(K|X) ∝ p(X|K) p(K), assuming a uniform prior p(K), and the likelihood
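The same posterior-sampling idea can be shown in a reduced setting. In this sketch the rate-matrix likelihood p(X|K) is replaced by a simpler Gaussian displacement model for a 1-D Brownian particle, dx ~ N(0, 2·D·dt), with a uniform prior on D > 0; all values are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 1-D trajectory: displacements of a Brownian particle with
# true diffusion coefficient D = 0.5 and time step dt.
D_true, dt, n = 0.5, 0.1, 400
dx = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=n)

def log_like(D):
    # Gaussian log-likelihood of the displacements: dx ~ N(0, 2 D dt).
    var = 2 * D * dt
    return -0.5 * np.sum(dx**2) / var - 0.5 * n * np.log(var)

# Uniform prior p(D) on D > 0, so the posterior is proportional to the
# likelihood; sample it with random-walk Metropolis.
D, samples = 1.0, []
for _ in range(20_000):
    prop = D + rng.normal(0.0, 0.05)
    if prop > 0 and np.log(rng.uniform()) < log_like(prop) - log_like(D):
        D = prop
    samples.append(D)

posterior = np.array(samples[5_000:])  # discard burn-in
print(round(posterior.mean(), 2))
```

The posterior mean of D then serves as the point estimate, with spread of the samples quantifying its uncertainty.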