Index of papers in Proc. ACL 2013 that mention
  • probabilistic model
Cheung, Jackie Chi Kit and Penn, Gerald
Abstract
Generative probabilistic models have been used for content modelling and template induction, and are typically trained on small corpora in the target domain.
Conclusion
We have shown that contextualized distributional semantic vectors can be successfully integrated into a generative probabilistic model for domain modelling, as demonstrated by improvements in slot induction and multi-document summarization.
Introduction
Generative probabilistic models have been one popular approach to content modelling.
Introduction
In this paper, we propose to inject contextualized distributional semantic vectors into generative probabilistic models, in order to combine their complementary strengths for domain modelling.
Introduction
First, they provide domain-general representations of word meaning that cannot be reliably estimated from the small target-domain corpora on which probabilistic models are trained.
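A hedged sketch of how such an injection can work (not necessarily the authors' exact formulation): give each latent slot in the generative model a Gaussian emission distribution over pre-trained word vectors, so domain-general distributional knowledge enters the probabilistic machinery. The slot count, vector dimension, and parameters below are illustrative assumptions.

    # Illustrative sketch: latent slots emit pre-trained word vectors
    # through slot-specific Gaussians (structure and names assumed).
    import numpy as np
    from scipy.stats import multivariate_normal
    from scipy.special import logsumexp

    rng = np.random.default_rng(0)
    DIM, N_SLOTS = 50, 4                           # assumed sizes

    slot_means = rng.normal(size=(N_SLOTS, DIM))   # per-slot emission means
    cov = np.eye(DIM)                              # shared spherical covariance

    def slot_posterior(word_vec, log_prior):
        """p(slot | vector) from Gaussian emissions and a slot prior."""
        log_like = np.array([multivariate_normal.logpdf(word_vec, mean=m, cov=cov)
                             for m in slot_means])
        log_post = log_like + log_prior
        return np.exp(log_post - logsumexp(log_post))

    word_vec = rng.normal(size=DIM)                # stands in for a pre-trained vector
    print(slot_posterior(word_vec, np.log(np.full(N_SLOTS, 1 / N_SLOTS))))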
Related Work
Cheung et al. (2013) propose PROFINDER, a probabilistic model for frame induction inspired by content models.
Related Work
Our work is similar in that we assume much of the same structure within a domain and consequently in the model as well (Section 3), but whereas PROFINDER focuses on finding the “correct” number of frames, events, and slots with a nonparametric method, this work focuses on integrating global knowledge in the form of distributional semantics into a probabilistic model.
Related Work
Combining distributional information and probabilistic models has actually been explored in previous work.
probabilistic model is mentioned in 12 sentences in this paper.
Swanson, Ben and Yamangil, Elif and Charniak, Eugene and Shieber, Stuart
Abstract
We provide a transformation to context-free form, and a further reduction in probabilistic model size through factorization and pooling of parameters.
Abstract
We perform parsing experiments on the Penn Treebank and draw comparisons to Tree-Substitution Grammars and between different variations in probabilistic model design.
Applications
In this section we present a probabilistic model for an OSTAG grammar in PCFG form that can be used in such algorithms, and show that many parameters of this PCFG can be pooled or set equal to one and ignored.
Introduction
Using a context-free language model with proper phrase bracketing, the connection between the words pretzels and thirsty must be recorded with three separate patterns, which can lead to poor generalizability and unreliable sparse frequency estimates in probabilistic models.
Introduction
Using an automatically induced Tree-Substitution Grammar (TSG), we heuristically extract an OSTAG and estimate its parameters from data using various reduced probabilistic models of adjunction.
TAG and Variants
A simple probabilistic model for a TSG is a set of multinomials, one for each nonterminal in N corresponding to its possible substitutions in R. A more flexible model allows a potentially infinite number of substitution rules using a Dirichlet Process (Cohn et al., 2009; Cohn and Blunsom, 2010).
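As a minimal sketch of the multinomial model this excerpt describes (the elementary trees and counts below are invented for illustration), each nonterminal's substitution distribution is a relative-frequency estimate over its observed rules; the Dirichlet Process variant cited above replaces each fixed multinomial with a distribution over an unbounded rule set.

    # Sketch: one multinomial per nonterminal over its substitution rules.
    from collections import Counter, defaultdict

    rule_counts = defaultdict(Counter)     # nonterminal -> Counter of trees
    observed = [
        ("NP", "(NP (DT the) NN)"),
        ("NP", "(NP PRP)"),
        ("NP", "(NP (DT the) NN)"),
        ("VP", "(VP (VBZ is) ADJP)"),
    ]
    for lhs, tree in observed:
        rule_counts[lhs][tree] += 1

    def substitution_prob(lhs, tree):
        """Relative-frequency estimate of p(tree | lhs)."""
        total = sum(rule_counts[lhs].values())
        return rule_counts[lhs][tree] / total

    print(substitution_prob("NP", "(NP (DT the) NN)"))   # -> 2/3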
TAG and Variants
As such, probabilistic modeling for TAG in its original form is uncommon.
TAG and Variants
Several probabilistic models have been proposed for TIG.
Transformation to CFG
To avoid double-counting derivations, which can adversely affect probabilistic modeling, type (3) and type (4) rules in which the side with the unapplied symbol is a nonterminal leaf can be omitted.
probabilistic model is mentioned in 10 sentences in this paper.
Goldwasser, Dan and Roth, Dan
Experimental Settings
DOM-INIT W1: Noisy probabilistic model, described below.
Experimental Settings
We used the noisy Robocup dataset to initialize DOM-INIT, a noisy probabilistic model, constructed by taking statistics over the noisy Robocup data and computing p(y|X).
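A hedged sketch of such a count-based model (the event/label pairs below are invented): p(y|X) falls out of relative frequencies over the noisy data.

    # Sketch: estimate p(y | x) by counting (x, y) pairs in noisy data.
    from collections import Counter, defaultdict

    pair_counts = defaultdict(Counter)     # pair_counts[x][y] = count(x, y)
    noisy_pairs = [("kick", "pass"), ("kick", "pass"),
                   ("kick", "badPass"), ("goal", "goal")]
    for x, y in noisy_pairs:
        pair_counts[x][y] += 1

    def p_y_given_x(y, x):
        """Relative-frequency estimate of p(y | x) from the noisy counts."""
        total = sum(pair_counts[x].values())
        return pair_counts[x][y] / total if total else 0.0

    print(p_y_given_x("pass", "kick"))     # -> 2/3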
Knowledge Transfer Experiments
Domain-independent information is learned from the situated domain, and the available domain-specific (Robocup) information is the simple probabilistic model (DOM-INIT).
probabilistic model is mentioned in 3 sentences in this paper.
Nakashole, Ndapandula and Tylenda, Tomasz and Weikum, Gerhard
Abstract
Our method is based on a probabilistic model that feeds weights into integer linear programs that leverage type signatures of relational phrases and type correlation or disjointness constraints.
Evaluation
The second method is PEARL with no ILP (denoted No ILP), only using the probabilistic model.
Introduction
For cleaning out false hypotheses among the type candidates for a new entity, we devised probabilistic models and an integer linear program that considers incompatibilities and correlations among entity types.
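To make the interaction concrete, here is a hedged, brute-force stand-in for such an integer linear program (the types, weights, and constraints are invented): select the highest-weight set of candidate types for an entity that respects disjointness and correlation constraints. A real system would hand this to an ILP solver.

    # Brute-force stand-in for the type-selection ILP (illustrative data).
    from itertools import combinations

    weights = {"person": 0.9, "organization": 0.4, "politician": 0.7}
    disjoint = {("person", "organization")}     # types that cannot co-occur
    implies  = {("politician", "person")}       # correlation: t1 entails t2

    def consistent(types):
        ts = set(types)
        return (all(not (a in ts and b in ts) for a, b in disjoint) and
                all(b in ts for a, b in implies if a in ts))

    candidates = list(weights)
    best = max((s for r in range(len(candidates) + 1)
                for s in combinations(candidates, r) if consistent(s)),
               key=lambda s: sum(weights[t] for t in s))
    print(best)   # -> ('person', 'politician')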
probabilistic model is mentioned in 3 sentences in this paper.
Reichart, Roi and Korhonen, Anna
Abstract
We show how to utilize Determinantal Point Processes (DPPs), elegant probabilistic models that are defined over the possible subsets of a given dataset and give higher probability mass to high quality and diverse subsets, for clustering.
Introduction
Our framework is based on Determinantal Point Processes (DPPs, (Kulesza, 2012; Kulesza and Taskar, 2012c)), elegant probabilistic models that are defined over the possible subsets of a given dataset and give higher probability mass to high quality and diverse subsets.
The Unified Framework
Determinantal point processes (DPPs) are elegant probabilistic models of repulsion that offer efficient and exact algorithms for sampling, marginalization, conditioning, and other inference tasks.
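A minimal numeric illustration of that repulsion property (the kernel below is made up): the unnormalized probability of a subset Y is det(L_Y), so near-duplicate items receive far less mass than diverse ones.

    # DPP property: P(Y) is proportional to det(L_Y) for the subset Y.
    import numpy as np

    L = np.array([[1.0, 0.9, 0.1],
                  [0.9, 1.0, 0.1],
                  [0.1, 0.1, 1.0]])   # items 0 and 1 are near-duplicates

    def dpp_score(Y):
        """Unnormalized P(Y) = det(L_Y) for an index set Y."""
        idx = np.array(Y)
        return np.linalg.det(L[np.ix_(idx, idx)])

    print(dpp_score([0, 1]))   # redundant pair: 1 - 0.81 = 0.19
    print(dpp_score([0, 2]))   # diverse pair:   1 - 0.01 = 0.99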
probabilistic model is mentioned in 3 sentences in this paper.
Schoenemann, Thomas
Conclusion
… variants, an important aim of probabilistic modeling for word alignment.
The models IBM-3, IBM-4 and IBM-5
Here $n! = \prod_{k=1}^{n} k$ denotes the factorial of $n$. The main difference between IBM-3, IBM-4 and IBM-5 is the choice of probability model in step 3 b), called a distortion model.
The models IBM-3, IBM-4 and IBM-5
In total, the IBM-3 has the following probability model:
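The equation following this colon did not survive extraction. As a hedged reconstruction, the standard IBM-3 decomposition of Brown et al. (1993), which a sentence like this typically introduces, is

    \[
    p(f_1^J, a_1^J \mid e_1^I) \;=\;
    \binom{J-\Phi_0}{\Phi_0}\, p_0^{J-2\Phi_0}\, p_1^{\Phi_0}
    \prod_{i=1}^{I} \Phi_i!\; n(\Phi_i \mid e_i)
    \prod_{j=1}^{J} t(f_j \mid e_{a_j})
    \prod_{j:\, a_j \neq 0} d(j \mid a_j, I, J)
    \]

where $\Phi_i$ is the fertility of $e_i$, $n$, $t$ and $d$ are the fertility, translation and distortion models, and $p_0$, $p_1$ govern the empty word; the paper's exact notation may differ.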
probabilistic model is mentioned in 3 sentences in this paper.