Index of papers in Proc. ACL 2012 that mention
  • Gibbs sampling
Yogatama, Dani and Sim, Yanchuan and Smith, Noah A.
Learning and Inference
In the E-step, we perform collapsed Gibbs sampling to obtain distributions over row and column indices for every mention, given the current value of the hyperparameters.
Learning and Inference
Also, our model has interdependencies among the column indices of a mention. A standard Gibbs sampling procedure breaks down these dependencies.
Learning and Inference
This kind of blocked Gibbs sampling was proposed by Jensen et al.
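The excerpts above describe replacing one-variable-at-a-time Gibbs updates with a blocked update that resamples a mention's interdependent column indices jointly. A minimal, generic sketch of such a blocked step in Python (the variable names and the joint_conditional function are placeholders, not the paper's actual model):

    # Generic sketch (not the paper's model): jointly resample a small block of
    # interdependent discrete variables, e.g. the column indices of one mention.
    import itertools
    import random

    def blocked_gibbs_step(block_vars, domains, joint_conditional):
        """Draw a joint configuration for `block_vars` from its conditional."""
        # All joint configurations of the block (feasible when the block is small).
        configs = list(itertools.product(*(domains[v] for v in block_vars)))
        weights = [joint_conditional(dict(zip(block_vars, c))) for c in configs]
        # Sample one configuration in proportion to its unnormalized probability.
        r, acc = random.uniform(0, sum(weights)), 0.0
        for config, weight in zip(configs, weights):
            acc += weight
            if acc >= r:
                return dict(zip(block_vars, config))
        return dict(zip(block_vars, configs[-1]))

Because the whole block is drawn at once, the dependencies among the indices are respected rather than broken by single-variable updates.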
Gibbs sampling is mentioned in 6 sentences in this paper.
Lee, Chia-ying and Glass, James
Inference
We employ Gibbs sampling (Gelman et al., 2004) to approximate the posterior distribution of the hidden variables in our model.
Inference
To apply Gibbs sampling to our problem, we need to derive the conditional posterior distributions of each hidden variable of the model.
Inference
From Eq. (2), the Gibbs sampler can draw a new value for CM by sampling from the normalized distribution.
Introduction
We implement the inference process using Gibbs sampling.
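The excerpts above outline the usual recipe: derive each hidden variable's conditional posterior, normalize it, and draw a new value. A generic single-site Gibbs sweep in Python, with cond_posterior standing in for the model-specific unnormalized conditionals, which are not given here:

    # Generic sketch: one Gibbs sweep over discrete hidden variables.
    # `domains` maps each variable to a list of candidate values;
    # `cond_posterior` is the model-specific unnormalized conditional (a placeholder).
    import numpy as np

    def gibbs_sweep(state, domains, cond_posterior, rng):
        for var, values in domains.items():
            weights = np.array([cond_posterior(var, v, state) for v in values], float)
            probs = weights / weights.sum()          # normalize before drawing
            state[var] = values[rng.choice(len(values), p=probs)]
        return state

    # Usage: rng = np.random.default_rng(0); repeat sweeps to collect samples.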
Gibbs sampling is mentioned in 5 sentences in this paper.
Shindo, Hiroyuki and Miyao, Yusuke and Fujino, Akinori and Nagata, Masaaki
Experiment
After that, to infer the substitution sites, we initialized the model with the final sample from a run on the small training set, and used the Gibbs sampler for 2000 iterations.
Inference
In each splitting step, we use two types of blocked MCMC algorithms: the sentence-level blocked Metropolis-Hastings (MH) sampler and the tree-level blocked Gibbs sampler, while Petrov et al. (2006) use a different MLE-based model and the EM algorithm.
Inference
The tree-level blocked Gibbs sampler focuses on the type of SR-TSG rules and simultaneously updates
Inference
After the inference of symbol subcategories, we use Gibbs sampling to infer the substitution sites of parse trees as described in (Cohn and Lapata, 2009; Post and Gildea, 2009).
Gibbs sampling is mentioned in 5 sentences in this paper.
Celikyilmaz, Asli and Hakkani-Tur, Dilek
Experiments
For Base-MCM and WebPrior-MCM, we run the Gibbs sampler for 2000 iterations, with the first 500 samples as burn-in.
MultiLayer Context Model - MCM
Thus, we use a Markov chain Monte Carlo (MCMC) method, specifically Gibbs sampling, to model the posterior distribution over (D_u, A_ud, S_ujd), conditioned on the observations and the model hyperparameters, by obtaining samples (D_u, A_ud, S_ujd) drawn from this distribution.
MultiLayer Context Model - MCM
During Gibbs sampling, we keep track of the frequency of draws of domain-, dialog-act-, and slot-indicating n-grams w_j in the M^D, M^A, and M^S matrices, respectively.
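The excerpts above describe a standard MCMC schedule: 2000 Gibbs iterations with the first 500 discarded as burn-in, and post-burn-in draws tallied in count matrices. A minimal sketch of that bookkeeping (the step function and variable names are placeholders; the paper's M^D, M^A, M^S matrices are not reproduced):

    # Generic sketch of the schedule above: run the chain, discard burn-in,
    # and count post-burn-in draws (a stand-in for the paper's count matrices).
    from collections import Counter

    def run_chain(step, init_state, n_iters=2000, burn_in=500):
        state, samples = dict(init_state), []
        for it in range(n_iters):
            state = step(state)                # one full Gibbs sweep (model-specific)
            if it >= burn_in:                  # keep only post-burn-in draws
                samples.append(dict(state))
        return samples

    def draw_frequencies(samples, var):
        """How often each value of `var` was drawn after burn-in."""
        return Counter(s[var] for s in samples)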
Gibbs sampling is mentioned in 3 sentences in this paper.
Diao, Qiming and Jiang, Jing and Zhu, Feida and Lim, Ee-Peng
Experiments
Each model was run for 500 iterations of Gibbs sampling.
Method
We use collapsed Gibbs sampling to obtain samples of the hidden variable assignment and to estimate the model parameters from these samples.
Method
Due to space limitations, we show only the derived Gibbs sampling formulas, as follows.
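The formulas themselves are not reproduced in the excerpts. As a generic point of reference, the textbook collapsed Gibbs update for an LDA-style topic assignment (Griffiths and Steyvers, 2004), which integrates out the multinomial parameters and samples each token's topic from count statistics, looks roughly as follows; this is a sketch of the standard technique, not the paper's exact derivation:

    # Textbook collapsed Gibbs update for an LDA-style topic assignment
    # (Griffiths and Steyvers, 2004); not the paper's exact formulas.
    import numpy as np

    def resample_topic(d, w, z_old, n_dk, n_kw, n_k, alpha, beta, V, rng):
        # Remove the token's current assignment from the count matrices.
        n_dk[d, z_old] -= 1; n_kw[z_old, w] -= 1; n_k[z_old] -= 1
        # p(z = k | rest) is proportional to
        #   (n_dk[d,k] + alpha) * (n_kw[k,w] + beta) / (n_k[k] + V*beta)
        weights = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
        z_new = rng.choice(len(n_k), p=weights / weights.sum())
        # Add the token back under the newly sampled topic.
        n_dk[d, z_new] += 1; n_kw[z_new, w] += 1; n_k[z_new] += 1
        return z_new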
Gibbs sampling is mentioned in 3 sentences in this paper.
Mukherjee, Arjun and Liu, Bing
Experiments
The variations in the results are due to the random initialization of the Gibbs sampler.
Proposed Seeded Models
We employ collapsed Gibbs sampling (Griffiths and Steyvers, 2004) for posterior inference.
Related Work
(2011) relied on user feedback during Gibbs sampling iterations.
Gibbs sampling is mentioned in 3 sentences in this paper.