Index of papers in Proc. ACL that mention
  • generative process
Özbal, Gözde and Pighin, Daniele and Strapparava, Carlo
Abstract
We present BRAINSUP, an extensible framework for the generation of creative sentences in which users are able to force several words to appear in the sentences and to control the generation process across several semantic dimensions, namely emotions, colors, domain relatedness and phonetic properties.
Architecture of BRAINSUP
More formally, the user input is a tuple U = (t, d, c, e, p, w), where t is the set of target words, d is a set of words defining the target domain, c and e are, respectively, the color and the emotion towards which the user wants to slant the sentence, p represents the desired phonetic features, and w is a set of weights that control the influence of each dimension on the generative process, as detailed in Section 3.3.
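As a reading aid, a minimal Python sketch of how that input tuple might be represented; every name, type, and default here is an illustrative assumption, not part of BRAINSUP:

```python
from dataclasses import dataclass, field

@dataclass
class UserInput:
    """Hypothetical container mirroring U = (t, d, c, e, p, w) from the paper."""
    target_words: set            # t: words forced to appear in the sentence
    domain_words: set            # d: words defining the target domain
    color: str                   # c: color towards which to slant the sentence
    emotion: str                 # e: emotion towards which to slant the sentence
    phonetic_features: dict      # p: desired phonetic properties
    weights: dict = field(default_factory=dict)  # w: per-dimension influence

# Example: force "coffee" and "morning" into a beverage-domain sentence.
u = UserInput({"coffee", "morning"}, {"beverage", "drink"}, "brown", "joy",
              {"alliteration": True}, {"emotion": 0.7, "color": 0.3})
```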
Architecture of BRAINSUP
The sentence generation process is based on morpho-syntactic patterns which we automatically discover from a corpus of dependency-parsed sentences P.
Architecture of BRAINSUP
Algorithm 1 provides a high-level description of the creative sentence generation process.
Evaluation
We did so to simulate the brainstorming phase behind the slogan generation process, where copywriters start with a set of relevant keywords to come up with a catchy slogan.
Evaluation
Nonetheless, it is very encouraging to observe that the generation process does not deteriorate the positive impact of the input keywords.
generative process is mentioned in 9 sentences in this paper.
Topics mentioned in this paper:
Chen, Harr and Benson, Edward and Naseem, Tahira and Barzilay, Regina
Introduction
First, the model’s generative process encourages coherence in the local features and placement of relation instances.
Model
This section describes the generative process, while Sections 4 and 5 discuss declarative constraints.
Model
3.2 Generative Process
Model
There are three steps to the generative process.
generative process is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Snyder, Benjamin and Naseem, Tahira and Barzilay, Regina
Model
Our model seeks to explain this observed data through a generative process whereby two aligned parse trees are produced jointly.
Model
In the next two sections, we describe our model in more formal detail by specifying the parameters and generative process by which sentences are formed.
Model
3.4 Generative Process
generative process is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Lee, Chia-ying and Glass, James
Model
Here, we describe the generative process our model uses to generate the observed utterances and present the corresponding graphical model.
Model
For clarity, we assume that the values of the boundary variables b_i are given in the generative process.
Model
The generative process indicates that our model ignores utterance boundaries and views the entire data as concatenated spoken segments.
Problem Formulation
In the next section, we show the generative process our model uses to generate the observed data.
generative process is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Mukherjee, Arjun and Liu, Bing
Model
This observation motivates the generative process of our model.
Model
Referring to the notations in Table 1, we explain the generative process of JTE-P.
Model
We now detail the generative process of JTE-P (plate notation in Figure 1) as follows:
Phrase Ranking based on Relevance
This thread of research models bigrams by encoding them into the generative process.
Related Work
The generative process of ASMs is, however, different from our model.
generative process is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Hu, Yuening and Boyd-Graber, Jordan and Satinoff, Brianna
Constraints Shape Topics
In LDA, a document’s token is produced in the generative process by choosing a topic z and sampling a word from the multinomial distribution φ_z of topic z.
Constraints Shape Topics
If that is an unconstrained word, the word is emitted and the generative process for that token is done.
Constraints Shape Topics
Then the generative process for constrained LDA is:
Putting Knowledge in Topic Models
… introduce ambiguity over the path associated with an observed token in the generative process.
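The excerpts above describe the token-level step concretely enough to sketch. Below is a hedged Python rendering: the unconstrained path is plain LDA (choose a topic z, sample a word from φ_z), and the constrained path redraws the word within its constraint group; the group/ψ bookkeeping is an assumed encoding, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_token(theta_d, phi, group_of, psi):
    """One token of constrained LDA -- an illustrative sketch, not the paper's code.

    theta_d  : topic proportions of the current document
    phi      : (num_topics, vocab_size) per-topic word distributions
    group_of : maps a constrained word id to its constraint-group id
    psi      : maps (topic, group) to (member word ids, probs) within the group
    """
    z = rng.choice(len(theta_d), p=theta_d)   # choose a topic z
    w = rng.choice(phi.shape[1], p=phi[z])    # sample a word from phi_z
    if w not in group_of:                     # unconstrained word: emit it, done
        return w
    members, probs = psi[(z, group_of[w])]    # constrained: redraw within the group
    return rng.choice(members, p=probs)
```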
generative process is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Lei, Tao and Long, Fan and Barzilay, Regina and Rinard, Martin
Introduction
We model our problem as a joint dependency parsing and role labeling task, assuming a Bayesian generative process.
Model
Modeling the Generative Process.
Model
The generative process is described formally as follows:
Related Work
While previous approaches rely on the feedback to train a discriminative prediction model, our approach models a generative process to guide structure predictions when the feedback is noisy or unavailable.
generative process is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Yang, Min and Zhu, Dingju and Chow, Kam-Pui
Algorithm
The generative process of word distributions for non-emotion topics follows the standard LDA definition with a scalar hyperparameter β(0).
Algorithm
We summarize the generative process of the EaLDA model as below:
Algorithm
As an alternative representation, the graphical model of the generative process is shown in Figure 1.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
McIntyre, Neil and Lapata, Mirella
Introduction
Although they have limited vocabulary and non-elaborate syntax, they nevertheless present challenges at almost all stages of the generation process.
The Story Generator
… the story generation process as a tree (see Figure 2) whose levels represent different story lengths.
The Story Generator
So, at each choice point in the generation process, e.g., when selecting a verb for an entity or a frame for a verb, we consider the N best alternatives assuming that these are most likely to appear in a good story.
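That choice-point strategy is essentially a beam over the story tree: at each level, keep only the N best continuations. A generic sketch under that reading, with placeholder expand and score functions rather than the paper's actual components:

```python
import heapq

def generate_story(root, expand, score, n_best=5, max_length=4):
    """Grow the story tree level by level, keeping the N best partial stories.

    Illustrative sketch; expand and score are placeholders, not the paper's
    components.
    expand(story) -> candidate continuations (e.g. a verb for an entity,
                     or a frame for a verb)
    score(story)  -> estimate of how likely the partial story is to be good
    """
    frontier = [root]
    for _ in range(max_length):                  # tree levels = story lengths
        candidates = [c for s in frontier for c in expand(s)]
        frontier = heapq.nlargest(n_best, candidates, key=score)
    return max(frontier, key=score)
```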
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
McKinley, Nathan and Ray, Soumya
Empirical Evaluation
The times reported are from the start of the generation process, eliminating variations due to interpreter startup, input parsing, etc.
Empirical Evaluation
Note that, as STRUCT is an anytime algorithm, valid sentences are available very early in the generation process, despite the size of the set of adjoining trees.
Sentence Tree Realization with UCT
If so, we store it, and continue the generation process.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Doyle, Gabriel and Bicknell, Klinton and Levy, Roger
The IBPOT Model
The IBPOT model defines a generative process for mappings between input and output forms based on three latent variables: the constraint violation matrices F (faithfulness) and M (markedness), and the weight vector w. The cells of the violation matrices correspond to the number of violations of a constraint by a given input-output mapping.
The IBPOT Model
Represented constraint sampling: We begin by resampling M_jl for all represented constraints M_·l, conditioned on the rest of the violations (M_−(jl), F) and the weights w. This is the sampling counterpart of drawing existing features in the IBP generative process.
The IBPOT Model
This is the sampling counterpart to the Poisson draw for new features in the IBP generative process.
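For orientation, the standard way such violation matrices and a weight vector combine in a MaxEnt-flavoured OT model: a candidate mapping's harmony is the negated weighted sum of its violations, and candidates compete through a softmax. The sketch below assumes that standard formulation; the paper's exact likelihood may differ.

```python
import numpy as np

# Assumes the standard MaxEnt-OT formulation, not necessarily the paper's.
def mapping_probs(F, M, w):
    """Candidate mapping probabilities from violations and constraint weights.

    F, M : (num_candidates, num_constraints) violation counts for the
           faithfulness and markedness constraints of each candidate mapping
    w    : nonnegative weights, one per constraint (faithfulness then markedness)
    """
    violations = np.hstack([F, M])       # all constraint violations side by side
    harmony = -violations @ w            # more weighted violations, lower harmony
    e = np.exp(harmony - harmony.max())  # numerically stable softmax
    return e / e.sum()

# Two candidate outputs, two faithfulness constraints and one markedness one.
F = np.array([[0.0, 1.0], [2.0, 0.0]])
M = np.array([[1.0], [0.0]])
print(mapping_probs(F, M, np.array([1.0, 0.5, 2.0])))
```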
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Andrews, Nicholas and Eisner, Jason and Dredze, Mark
Abstract
The generative process assumes that each entity mention arises from copying and optionally mutating an earlier name from a similar context.
Introduction
Our model is an evolutionary generative process based on the name variation model of Andrews et al.
Introduction
This can also relate seemingly dissimilar names via multiple steps in the generative process.
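A toy rendering of that copy-and-optionally-mutate step: each new mention either copies an earlier name verbatim or applies a small random edit to it. The single-character edit model and the uniform choice of source are stand-ins for the paper's name variation model and context matching.

```python
import random

random.seed(0)
LETTERS = "abcdefghijklmnopqrstuvwxyz"

def mutate(name):
    """One random character edit -- a toy stand-in for the name variation model."""
    i = random.randrange(len(name))
    op = random.choice(("sub", "ins", "del"))
    if op == "sub":
        return name[:i] + random.choice(LETTERS) + name[i + 1:]
    if op == "ins":
        return name[:i] + random.choice(LETTERS) + name[i:]
    return name[:i] + name[i + 1:] if len(name) > 1 else name   # del

def next_mention(earlier_names, p_mutate=0.3):
    """Copy an earlier name, optionally mutating it ('similar context' omitted)."""
    source = random.choice(earlier_names)
    return mutate(source) if random.random() < p_mutate else source

names = ["obama"]
for _ in range(6):
    names.append(next_mention(names))
print(names)   # repeated small mutations can relate seemingly dissimilar names
```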
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Ravi, Sujith
Decipherment Model for Machine Translation
So, instead we use a simplified generative process for the translation model as proposed by Ravi and Knight (2011b) and used by others (Nuhn et al., 2012) for this task:
Decipherment Model for Machine Translation
We now extend the generative process (described earlier) to more complex translation models.
Decipherment Model for Machine Translation
Nonlocal Reordering: The generative process described earlier limits reordering to local or adjacent word pairs in a source sentence.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Feng, Yang and Cohn, Trevor
Experiments
Decoding under our model would be straightforward in principle, as the generative process was designed to closely parallel the search procedure in the phrase-based model. Three data sets were used in the experiments: two Chinese-to-English data sets on small (IWSLT) and larger (FBIS) corpora, and Arabic …
Model
The generative process employs the following recursive procedure to construct the target sentence conditioned on the source:
Model
This generative process resembles the sequence of translation decisions considered by a standard MT decoder (Koehn et al., 2003), but note that our approach differs in that there is no constraint that all words are translated exactly once.
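One way to visualize the contrast drawn here: a toy generative loop that emits target words conditioned on the source, with no bookkeeping forcing each source word to be translated exactly once. Everything below (vocabulary, stopping rule, uniform choices) is invented for illustration and is much simpler than the paper's model.

```python
import random

random.seed(3)

# Toy phrase translations (made up); a real model would learn these.
TRANSLATIONS = {"wo": ["I"], "ai": ["love", "like"], "ni": ["you"]}

def generate_target(source_words, p_stop=0.25):
    """Build the target left-to-right, conditioned on the source.

    Unlike a standard decoder, nothing forces each source word to be
    translated exactly once: words may be reused or skipped entirely.
    """
    target = []
    while random.random() > p_stop:
        src = random.choice(source_words)          # attend to some source word
        target.append(random.choice(TRANSLATIONS[src]))
    return target

print(generate_target(["wo", "ai", "ni"]))   # e.g. ['I', 'love', 'you'] or ['you']
```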
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Cohn, Trevor and Haffari, Gholamreza
Model
The generative process of the model follows that of ITG with the following simple grammar:
Model
The generative process is that we draw a complete ITG tree, t ∼ P2, as follows:
Model
… depending on r. This generative process is mutually recursive: P2 makes draws from P1 and P1 makes draws from P2.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Börschinger, Benjamin and Johnson, Mark and Demuth, Katherine
Discussion
Using a more realistic generative process for the underlying forms, for example an Adaptor Grammar (Johnson et al., 2007), could address this shortcoming in future work without changing the overall architecture of the model, although novel inference algorithms might be required.
The computational model
This generative process is repeated for each utterance i, leading to multiple utterances of the form U_i.
The computational model
The generative process mimics the intuitively plausible idea of generating underlying forms from some kind of syntactic model (here, a Bigram language model) and then mapping the underlying form to an observed surface-form through the application of a phonological rule component, here represented by the collection of rule probabilities pc.
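A minimal end-to-end sketch of that two-stage idea: draw an underlying word sequence from a bigram model, then let a phonological rule component map it to a surface form. The bigram table is invented, and the single rule shown (probabilistic word-final /t/ deletion) is just one example of such a component.

```python
import random

random.seed(1)

# Toy bigram model over underlying word forms (all probabilities made up).
BIGRAMS = {
    "<s>":  [("i", 0.6), ("we", 0.4)],
    "i":    [("want", 0.7), ("left", 0.3)],
    "we":   [("want", 0.5), ("left", 0.5)],
    "want": [("</s>", 1.0)],
    "left": [("</s>", 1.0)],
}

def draw(options):
    r, acc = random.random(), 0.0
    for word, p in options:
        acc += p
        if r < acc:
            return word
    return options[-1][0]

def generate_utterance(p_del=0.4):
    # Stage 1: draw underlying forms from the bigram language model.
    underlying, w = [], "<s>"
    while True:
        w = draw(BIGRAMS[w])
        if w == "</s>":
            break
        underlying.append(w)
    # Stage 2: the phonological rule component maps underlying to surface forms,
    # here one rule (word-final /t/ deletion) applied with probability p_del.
    surface = [u[:-1] if u.endswith("t") and random.random() < p_del else u
               for u in underlying]
    return underlying, surface

print(generate_utterance())   # e.g. (['we', 'want'], ['we', 'wan'])
```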
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Kuznetsova, Polina and Ordonez, Vicente and Berg, Alexander and Berg, Tamara and Choi, Yejin
Abstract
We cast the generation process as constraint optimization problems, collectively incorporating multiple interconnected aspects of language composition for content planning, surface realization and discourse structure.
Introduction
Because the generation process sticks relatively closely to the recognized content, the resulting descriptions often lack the kind of coverage, creativity, and complexity typically found in human-written text.
Introduction
Our ILP formulation encodes a rich set of linguistically motivated constraints and weights that incorporate multiple aspects of the generation process.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Diao, Qiming and Jiang, Jing and Zhu, Feida and Lim, Ee-Peng
Method
We assume the following generation process for all the posts in the stream.
Method
Figure 2: The generation process for all posts.
Method
Formally, the generation process is summarized in Figure 2.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Celikyilmaz, Asli and Hakkani-Tur, Dilek
Abstract
We inject information extracted from unstructured web search query logs as prior information to enhance the generative process of the natural language utterance understanding model.
Experiments
This is because we utilize domain priors obtained from the web sources as supervision during the generative process, as well as unlabeled utterances that enable handling language variability.
MultiLayer Context Model - MCM
The generative process of our multilayer context model (MCM) (Fig. …
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Celikyilmaz, Asli and Hakkani-Tur, Dilek
Enriched Two-Tiered Topic Model
Thus, we present the enriched TTM (ETTM) generative process (Fig. 3), which samples words not only from low-level topics but also from high-level topics.
Enriched Two-Tiered Topic Model
Similar to TTM’s generative process, if w_id is related to a given query, then x = 1 is deterministic; otherwise x ∈ {0, 1} is stochastically determined depending on whether w_id should be sampled as a background word or through the hierarchical path, i.e., HT pairs.
Topic Coherence for Summarization
We identify sentences as meta-variables of document clusters, so that the generative process models both sentences and documents using tiered topics.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Spiegler, Sebastian and Flach, Peter A.
Introduction
In Section 2 we introduce the probabilistic generative process and show in Sections 2.1 and 2.2 how we incorporate this process in PROMODES and PROMODES-H. We start our experiments by examining the learning behaviour of the algorithms in Section 3.1.
Probabilistic generative model
… a probabilistic generative process consisting of words as observed variables X and their hidden segmentation as latent variables Y.
Related work
… (2002); however, they were interested in finding paradigms as sets of mutually exclusive operations on a word form, whereas we are describing a generative process using morpheme boundaries and resulting letter transitions.
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Snyder, Benjamin and Barzilay, Regina and Knight, Kevin
Model
There are four basic layers in the generative process:
Model
Structural Sparsity: The first step of the generative process provides a control on the sparsity of edit-operation probabilities, encoding the linguistic intuition that the correct character-level mappings should be sparse.
Model
Character-edit Distribution: The next step in the generative process is drawing a base distribution G0 over character edit sequences (each of which yields a bilingual pair of morphemes).
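As a toy illustration of that step, the sketch below draws a character-edit sequence (substitute, insert, or delete per source character) that turns a morpheme in one language into a counterpart in the other; the operation probabilities and target alphabet are invented stand-ins for the paper's base distribution G0.

```python
import random

random.seed(2)
TARGET_LETTERS = "abgdhwzXtyklmns"   # toy alphabet for the second language

# Illustrative edit-operation probabilities (a stand-in for the paper's G0);
# sparsity intuition: most mass on one-to-one substitutions.
EDIT_OPS = [("sub", 0.8), ("ins", 0.1), ("del", 0.1)]

def draw_op():
    r, acc = random.random(), 0.0
    for op, p in EDIT_OPS:
        acc += p
        if r < acc:
            return op
    return EDIT_OPS[-1][0]

def sample_morpheme_pair(source):
    """Draw a character-edit sequence, yielding a bilingual morpheme pair."""
    target = []
    for _ in source:                              # one edit per source character
        op = draw_op()
        if op in ("sub", "ins"):
            target.append(random.choice(TARGET_LETTERS))
        if op == "ins":                           # insert adds an extra character
            target.append(random.choice(TARGET_LETTERS))
        # "del": the source character maps to nothing on the target side
    return source, "".join(target)

print(sample_morpheme_pair("ktb"))   # e.g. ('ktb', 'wgs')
```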
generative process is mentioned in 3 sentences in this paper.
Topics mentioned in this paper: