Index of papers in Proc. ACL 2009 that mention
  • dynamic programming
Snyder, Benjamin and Naseem, Tahira and Barzilay, Regina
Abstract
We perform inference under this model using Markov Chain Monte Carlo and dynamic programming.
Introduction
Additionally, a computational advantage of this formalism is that the marginalized probability over all possible alignments for any two trees can be efficiently computed with a dynamic program in linear time.
Introduction
We sample pairs of trees and then compute marginalized probabilities over all possible alignments using dynamic programming.
Model
Fortunately, for any given pair of trees T1 and T2 this marginalization can be computed using a dynamic program in time O(|T1||T2|).
Model
A dynamic program builds this table from the bottom up: For each node pair η1, η2, we sum the probabilities of all local alignment configurations, each multiplied by the appropriate…
Model
This algorithm is an adaptation of the dynamic program presented in (Jiang et al., 1995) for finding minimum cost alignment trees (Fig. …).
dynamic programming is mentioned in 7 sentences in this paper.
Topics mentioned in this paper:
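The bottom-up marginalization over tree-pair alignments described in the snippets above can be sketched as a dynamic program over node pairs. The following toy version is an illustrative assumption, not the paper's model: `P_MATCH` and `P_DEL` are hypothetical local alignment weights, and the child lists are aligned monotonically with an edit-style inner DP. Each node pair is visited once, mirroring the O(|T1||T2|) table the snippet describes (up to the per-pair cost of the child alignment).

```python
# Toy sketch: sum the weights of all alignments of two ordered trees,
# building a table over node pairs from the bottom up.

class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = tuple(children)

P_MATCH, P_DEL = 0.8, 0.1  # hypothetical, unnormalized local weights

_cache = {}

def subtree_marginal(n1, n2):
    """Total weight of all alignments of the subtrees rooted at n1, n2."""
    key = (id(n1), id(n2))
    if key in _cache:
        return _cache[key]
    local = P_MATCH if n1.label == n2.label else P_DEL
    # Edit-style DP summing over all monotone alignments of the two
    # child sequences (each child may also be left unaligned).
    c1, c2 = n1.children, n2.children
    dp = [[0.0] * (len(c2) + 1) for _ in range(len(c1) + 1)]
    dp[0][0] = 1.0
    for i in range(len(c1) + 1):
        for j in range(len(c2) + 1):
            if i < len(c1):
                dp[i + 1][j] += dp[i][j] * P_DEL      # skip a child of n1
            if j < len(c2):
                dp[i][j + 1] += dp[i][j] * P_DEL      # skip a child of n2
            if i < len(c1) and j < len(c2):
                dp[i + 1][j + 1] += dp[i][j] * subtree_marginal(c1[i], c2[j])
    _cache[key] = local * dp[len(c1)][len(c2)]
    return _cache[key]
```

Memoizing on node pairs is what keeps the table quadratic in the tree sizes rather than exponential in the number of alignments.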
Mochihashi, Daichi and Yamada, Takeshi and Ueda, Naonori
Abstract
In this paper, we propose a new Bayesian model for fully unsupervised word segmentation and an efficient blocked Gibbs sampler combined with dynamic programming for inference.
Conclusion
With a combination of dynamic programming and an accurate spelling model from a Bayesian perspective, our model significantly outperforms previously reported results, and the inference is very efficient.
Inference
Instead, we propose a sentence-wise Gibbs sampler of word segmentation using efficient dynamic programming, as shown in Figure 3.
Introduction
Section 4 describes an efficient blocked Gibbs sampler that leverages dynamic programming for inference.
Nested Pitman-Yor Language Model
To segment a string into “words”, we used efficient dynamic programming combined with MCMC, as described in the next section.
dynamic programming is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
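The blocked, sentence-wise sampler the snippets describe can be sketched as forward filtering followed by backward sampling: a forward DP accumulates the total weight of all segmentations of each prefix, then a whole segmentation is drawn backward in one block. Here `word_prob` is a hypothetical stand-in for the paper's nested Pitman-Yor language model, and `max_len` is an illustrative cap on word length.

```python
# Toy sketch: blocked sampling of a word segmentation via dynamic
# programming (forward filtering, backward sampling).
import random

def word_prob(w):
    return 0.5 ** len(w)  # hypothetical: shorter "words" weigh more

def sample_segmentation(s, max_len=4, rng=random):
    n = len(s)
    alpha = [0.0] * (n + 1)  # alpha[t]: total weight of segmentations of s[:t]
    alpha[0] = 1.0
    for t in range(1, n + 1):
        for k in range(1, min(max_len, t) + 1):
            alpha[t] += alpha[t - k] * word_prob(s[t - k:t])
    # Backward sampling: draw the final word's length in proportion to
    # its contribution to alpha[t], then continue toward the start.
    words, t = [], n
    while t > 0:
        weights = [alpha[t - k] * word_prob(s[t - k:t])
                   for k in range(1, min(max_len, t) + 1)]
        r = rng.random() * sum(weights)
        for k, w in enumerate(weights, 1):
            r -= w
            if r <= 0:
                break
        words.append(s[t - k:t])
        t -= k
    return words[::-1]
```

Sampling the whole sentence's segmentation as one block, rather than flipping individual boundaries, is what makes such a Gibbs sampler mix efficiently.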
Das, Dipanjan and Smith, Noah A.
QG for Paraphrase Modeling
We next describe a dynamic programming solution for calculating p(τt | Gp(τs)).
QG for Paraphrase Modeling
3.3 Dynamic Programming
QG for Paraphrase Modeling
Thus every word generated under G0 aligns to null, and we can simplify the dynamic programming algorithm that scores a tree τs under G0:
dynamic programming is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
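The simplification in the last snippet, where every word aligns to null, can be illustrated with a toy recursion: the score of a tree under the base grammar reduces to a product of local factors accumulated over the tree. `edge_prob` and the dictionary tree encoding are illustrative assumptions, not the paper's actual model.

```python
# Toy sketch: scoring a tree as a product of local edge factors,
# computed recursively from the root down.

def edge_prob(parent, child):
    return 0.5  # hypothetical constant local probability

def tree_score(tree, node="ROOT"):
    """tree maps each node to its list of children; leaves may be absent."""
    score = 1.0
    for child in tree.get(node, []):
        score *= edge_prob(node, child) * tree_score(tree, child)
    return score
```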
Li, Zhifei and Eisner, Jason and Khudanpur, Sanjeev
Variational Approximate Decoding
The trick is to parameterize q as a factorized distribution such that the estimation of q* and decoding using q* are both tractable through efficient dynamic programs.
Variational Approximate Decoding
(2008) effectively do take n = ∞, by maintaining the whole translation string in the dynamic programming state.
Variational Approximate Decoding
Figure 4: Dynamic programming estimation of q*.
dynamic programming is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
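The factorized q* in the snippets above can be illustrated with a toy bigram version: fit conditional bigram weights from a weighted hypothesis list, then score strings under the resulting product model. The paper estimates q* by dynamic programming over the translation hypergraph; enumerating weighted strings here is an illustrative simplification, and the `<s>`/`</s>` boundary tokens are assumptions of this sketch.

```python
# Toy sketch: estimating a factorized (bigram) approximation q* from
# weighted hypotheses, then decoding strings under q*.
from collections import defaultdict

def estimate_q(hyps):
    """hyps: list of (token list, probability). Returns q*(w | prev)."""
    pair, ctx = defaultdict(float), defaultdict(float)
    for toks, p in hyps:
        seq = ["<s>"] + list(toks) + ["</s>"]
        for a, b in zip(seq, seq[1:]):
            pair[(a, b)] += p
            ctx[a] += p
    return {ab: c / ctx[ab[0]] for ab, c in pair.items()}

def q_score(q, toks):
    """Score a string as a product of bigram factors under q*."""
    seq = ["<s>"] + list(toks) + ["</s>"]
    score = 1.0
    for a, b in zip(seq, seq[1:]):
        score *= q.get((a, b), 0.0)
    return score
```

Because q factorizes over adjacent tokens, both estimation (collecting expected bigram counts) and decoding (maximizing a product of local factors) stay tractable, which is exactly the property the snippet highlights.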