Index of papers in Proc. ACL 2009 that mention
  • generative model
Ganchev, Kuzman and Gillenwater, Jennifer and Taskar, Ben
Experiments
Generative model
Experiments
We found that the generative model gets confused by punctuation and tends to predict that periods at the end of sentences are the parents of words in the sentence.
Experiments
We call the generic model described above “no-rules” to distinguish it from the language-specific constraints we introduce in the sequel.
Parsing Models
We explored two parsing models: a generative model used by several authors for unsupervised induction and a discriminative model used for fully supervised training.
Parsing Models
We also used a generative model based on the dependency model with valence (Klein and Manning, 2004).
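The dependency model with valence (DMV) cited above generates a tree head-outward: for each head and direction, it repeatedly decides whether to stop (conditioned on whether a dependent was already generated in that direction) before attaching a child. The sketch below illustrates that generative story only; all tag sets and probability values are made up for illustration and are not from the paper.

```python
import random

random.seed(0)

# Hypothetical toy parameters (NOT from the paper): stop probabilities
# keyed by (head tag, direction, adjacency/valence), and child-attachment
# distributions keyed by (head tag, direction).
STOP = {("V", "left", True): 0.3, ("V", "left", False): 0.8,
        ("V", "right", True): 0.4, ("V", "right", False): 0.8,
        ("N", "left", True): 0.5, ("N", "left", False): 0.9,
        ("N", "right", True): 0.9, ("N", "right", False): 0.95}
CHILD = {("V", "left"): {"N": 1.0}, ("V", "right"): {"N": 0.7, "V": 0.3},
         ("N", "left"): {"N": 1.0}, ("N", "right"): {"N": 1.0}}

def sample(dist):
    """Draw a tag from a discrete distribution {tag: prob}."""
    r, acc = random.random(), 0.0
    for tag, p in dist.items():
        acc += p
        if r < acc:
            return tag
    return tag  # fallback for rounding at the boundary

def generate(head, depth=0):
    """Head-outward generation: in each direction, decide stop vs. attach,
    with the stop probability conditioned on adjacency (valence)."""
    tree = {"head": head, "left": [], "right": []}
    if depth > 3:  # guard against unbounded recursion in this toy version
        return tree
    for direction in ("left", "right"):
        adjacent = True  # no dependent generated in this direction yet
        while random.random() > STOP[(head, direction, adjacent)]:
            child = sample(CHILD[(head, direction)])
            tree[direction].append(generate(child, depth + 1))
            adjacent = False
    return tree

tree = generate("V")  # sample a toy dependency tree rooted at a verb
print(tree["head"])
```

In the actual model the parameters are learned (e.g. with EM) rather than fixed; this sampler only shows where the valence conditioning enters the generative process.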
generative model is mentioned in 11 sentences in this paper.
Kim, Jungi and Li, Jin-Ji and Lee, Jong-Hyeok
Experiment
We observe that the features of our word generation model are more effective than those of the topic association model.
Experiment
Among the features of the word generation model, the most improvement was achieved with BM25, improving the MAP by 2.27%.
Experiment
Since BM25 performs the best among the word generation models , its combination with other features was investigated.
Term Weighting and Sentiment Analysis
3.2.3 Word Generation Model
Term Weighting and Sentiment Analysis
Our word generation model p(w | d) evaluates the prominence and the discriminativeness of a word
Term Weighting and Sentiment Analysis
Therefore, we estimate the word generation model with popular IR models’ relevance scores of a document d given w as a query.
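The excerpts above estimate p(w | d) with IR relevance scores, BM25 performing best. Below is a minimal sketch of BM25 scoring a single term (the word w treated as a query) against one document, using the common Robertson/Spärck Jones IDF and the usual k1/b defaults; the corpus statistics are toy values, and the paper's exact BM25 variant and parameters may differ.

```python
import math

def bm25(tf, doc_len, avg_doc_len, df, n_docs, k1=1.2, b=0.75):
    """BM25 score of one term in one document.

    tf: term frequency in the document; df: document frequency of the term;
    doc_len / avg_doc_len: length normalization; k1, b: standard defaults.
    """
    idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
    norm = tf * (k1 + 1) / (tf + k1 * (1 - b + b * doc_len / avg_doc_len))
    return idf * norm

# Toy statistics, assumed for illustration only:
score = bm25(tf=3, doc_len=120, avg_doc_len=100, df=40, n_docs=1000)
print(round(score, 3))
```

A term absent from the document (tf = 0) scores 0, and longer-than-average documents are penalized through the b-weighted length normalization.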
generative model is mentioned in 12 sentences in this paper.
Liang, Percy and Jordan, Michael and Klein, Dan
Abstract
To deal with the high degree of ambiguity present in this setting, we present a generative model that simultaneously segments the text into utterances and maps each utterance to a meaning representation grounded in the world state.
Conclusion
We have presented a generative model of correspondences between a world state and an unsegmented stream of text.
Generative Model
To learn the correspondence between a text w and a world state s, we propose a generative model p(w | s) with latent variables specifying this correspondence.
Generative Model
We used a simple generic model of rendering string fields: Let w be a word chosen uniformly from those in v.
Introduction
To cope with these challenges, we propose a probabilistic generative model that treats text segmentation, fact identification, and alignment in a single unified framework.
generative model is mentioned in 5 sentences in this paper.
Weerkamp, Wouter and Balog, Krisztian and de Rijke, Maarten
Abstract
We propose a generative model for expanding queries using external collections in which dependencies between queries, documents, and expansion documents are explicitly modeled.
Discussion
Theoretically, the main difference between these two instantiations of our general model is that EEM3 makes much stronger simplifying independence assumptions than EEM1.
Introduction
Our aim in this paper is to define and evaluate generative models for expanding queries using external collections.
Related Work
As will become clear in §4, Diaz and Metzler’s approach is an instantiation of our general model for external expansion.
Related Work
We are driven by the same motivation, but where they considered rank-based result combinations and simple mixtures of query models, we take a more principled and structured approach, and develop four versions of a generative model for query expansion using external collections.
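The "simple mixtures of query models" that the excerpt contrasts with can be sketched as a linear interpolation of term distributions estimated from the internal and an external collection. This is a generic illustration of that baseline idea, not the paper's EEM1–EEM4 instantiations; the mixture weight and the toy term distributions are assumed.

```python
def mix_models(p_internal, p_external, lam=0.6):
    """Linearly interpolate two term distributions over their joint vocabulary.

    lam weights the internal-collection model; (1 - lam) the external one.
    """
    vocab = set(p_internal) | set(p_external)
    return {t: lam * p_internal.get(t, 0.0) + (1 - lam) * p_external.get(t, 0.0)
            for t in vocab}

# Toy query models (assumed values): the external collection contributes
# an expansion term ("cat") the internal model lacks.
p_int = {"jaguar": 0.5, "car": 0.5}
p_ext = {"jaguar": 0.4, "cat": 0.6}
expanded = mix_models(p_int, p_ext, lam=0.6)
print(sorted(expanded.items()))
```

Because both inputs sum to one, the interpolated model is again a proper distribution; the paper's generative formulation instead models the dependencies between queries, documents, and expansion documents explicitly rather than fixing a single global mixture weight.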
generative model is mentioned in 5 sentences in this paper.