Index of papers in Proc. ACL 2010 that mention
  • Gibbs sampling
Ritter, Alan and Mausam and Etzioni, Oren
Experiments
Next we used collapsed Gibbs sampling to infer a distribution over topics, θ, for each of the relations in the primary corpus (based solely on tuples in the training set) using the topics from the generalization corpus.
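For orientation, in Griffiths-and-Steyvers-style collapsed Gibbs sampling the inferred distribution over topics is usually read off the final assignment counts. A standard estimator, written here in generic LDA notation (n_{r,k} for the number of tokens of relation r assigned to topic k, symmetric Dirichlet prior α, K topics; these symbols are illustrative rather than quoted from the paper) is:

```latex
\hat{\theta}_{r,k} = \frac{n_{r,k} + \alpha}{\sum_{k'=1}^{K} n_{r,k'} + K\alpha}
```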
Experiments
To evaluate how well our topic-class associations carry over to unseen relations we used the same random sample of 100 relations from the pseudo-disambiguation experiment. For each argument of each relation we picked the top two topics according to frequency in the 5 Gibbs samples.
Previous Work
Additionally, we perform full Bayesian inference using collapsed Gibbs sampling, in which parameters are integrated out (Griffiths and Steyvers, 2004).
Topic Models for Selectional Prefs.
For all the models we use collapsed Gibbs sampling for inference, in which each of the hidden variables (e.g., the per-argument topic assignments z1 and z2 in LinkLDA) is sampled sequentially conditioned on a full assignment to all others, integrating out the parameters (Griffiths and Steyvers, 2004).
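The excerpt above is essentially the full recipe for collapsed Gibbs sampling in vanilla LDA; a minimal runnable sketch of one sweep follows, assuming standard count-table bookkeeping. All names here (gibbs_sweep, n_dk, n_kw, n_k) are illustrative rather than from the paper, and LinkLDA itself keeps linked assignments per tuple argument rather than one per token.

```python
import numpy as np

# Minimal sketch of one collapsed Gibbs sweep for vanilla LDA, in the
# style of Griffiths and Steyvers (2004). Parameters theta and phi are
# integrated out; only the topic assignments z and count tables remain.
def gibbs_sweep(docs, z, n_dk, n_kw, n_k, alpha, beta, rng):
    K, V = n_kw.shape
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            # Remove the current assignment from the count tables.
            n_dk[d, k] -= 1
            n_kw[k, w] -= 1
            n_k[k] -= 1
            # Full conditional with parameters integrated out:
            # p(z = k | rest) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            k = rng.choice(K, p=p / p.sum())
            # Record the new assignment and restore the counts.
            z[d][i] = k
            n_dk[d, k] += 1
            n_kw[k, w] += 1
            n_k[k] += 1
```

In practice one initializes the assignments at random, runs many such sweeps, and keeps a handful of well-spaced samples after burn-in, which is presumably what the "5 Gibbs samples" in the earlier excerpt refers to.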
Topic Models for Selectional Prefs.
In addition, there are several scalability enhancements such as SparseLDA (Yao et al., 2009), and an approximation of the Gibbs sampling procedure can be efficiently parallelized (Newman et al., 2009).
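As a rough sketch of the Newman et al. (2009) idea (AD-LDA), simulated serially: each "worker" sweeps its own document shard against a stale copy of the global topic-word counts, and the copies' deltas are merged back afterwards. This reuses the hypothetical gibbs_sweep above; none of these names come from the cited papers.

```python
# Serial simulation of approximate distributed Gibbs sampling (AD-LDA).
# Each shard is a (docs, z, n_dk) triple for one worker's documents.
def parallel_epoch(shards, n_kw, n_k, alpha, beta, rng):
    snap_kw, snap_k = n_kw.copy(), n_k.copy()
    results = []
    for docs, z, n_dk in shards:                # one shard per "worker"
        kw, k = snap_kw.copy(), snap_k.copy()   # stale global counts
        gibbs_sweep(docs, z, n_dk, kw, k, alpha, beta, rng)
        results.append((kw, k))
    for kw, k in results:                       # reconcile the deltas
        n_kw += kw - snap_kw
        n_k += k - snap_k
```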
Gibbs sampling is mentioned in 5 sentences in this paper.
Yamangil, Elif and Shieber, Stuart M.
Abstract
We formalize nonparametric Bayesian STSG with epsilon alignment in full generality, and provide a Gibbs sampling algorithm for posterior inference tailored to the task of extractive sentence compression.
Evaluation
We compared the Gibbs sampling compressor (GS) against a version of maximum a posteriori EM (with Dirichlet parameter greater than 1) and a discriminative STSG based on SVM training (Cohn and Lapata, 2008) (SVM).
The STSG Model
3.2 Posterior inference via Gibbs sampling
The STSG Model
We use Gibbs sampling (Geman and Geman, 1984), a Markov chain Monte Carlo (MCMC) method, to sample from the posterior (3).
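As a self-contained illustration of Gibbs sampling in the Geman and Geman (1984) sense, the toy sampler below alternates draws from the two full conditionals of a bivariate normal with correlation rho; the target is a stand-in, not the paper's STSG posterior.

```python
import numpy as np

# Toy Gibbs sampler: alternately draw each variable from its full
# conditional. For a standard bivariate normal with correlation rho,
# x | y ~ N(rho*y, 1 - rho^2) and symmetrically for y | x.
def gibbs_bivariate_normal(rho, n_samples, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    sd = np.sqrt(1.0 - rho ** 2)       # conditional standard deviation
    x = y = 0.0
    out = np.empty((n_samples, 2))
    for t in range(n_samples):
        x = rng.normal(rho * y, sd)
        y = rng.normal(rho * x, sd)
        out[t] = x, y
    return out

samples = gibbs_bivariate_normal(rho=0.9, n_samples=5000)
print(samples[1000:].std(axis=0))      # after burn-in, marginals ≈ N(0, 1)
```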
Gibbs sampling is mentioned in 4 sentences in this paper.
Ó Séaghdha, Diarmuid
Related work
The combination of a well-defined probabilistic model and a Gibbs sampling procedure for estimation guarantees (eventual) convergence and the avoidance of degenerate solutions.
Three selectional preference models
Following Griffiths and Steyvers (2004), we estimate the model by Gibbs sampling.
Three selectional preference models
As suggested by the similarity between (4) and (2), the ROOTH-LDA model can be estimated by an LDA-like Gibbs sampling procedure.
Gibbs sampling is mentioned in 3 sentences in this paper.