Index of papers in Proc. ACL 2013 that mention
  • generative process
Özbal, Gözde and Pighin, Daniele and Strapparava, Carlo
Abstract
We present BRAINSUP, an extensible framework for the generation of creative sentences in which users are able to force several words to appear in the sentences and to control the generation process across several semantic dimensions, namely emotions, colors, domain relatedness and phonetic properties.
Architecture of BRAINSUP
More formally, the user input is a tuple U = ⟨t, d, c, e, p, w⟩, where t is the set of target words, d is a set of words defining the target domain, c and e are, respectively, the color and the emotion towards which the user wants to slant the sentence, p represents the desired phonetic features, and w is a set of weights that control the influence of each dimension on the generative process, as detailed in Section 3.3.
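As an illustrative aside (not from the paper), the quoted tuple maps naturally onto a small record type; all field names below are hypothetical, since the paper does not publish an API:

from dataclasses import dataclass

# Hypothetical sketch of the BRAINSUP user input U = <t, d, c, e, p, w>.
@dataclass
class UserInput:
    target_words: set        # t: words forced to appear in the output
    domain_words: set        # d: words defining the target domain
    color: str               # c: color to slant the sentence towards
    emotion: str             # e: emotion to slant the sentence towards
    phonetic_features: set   # p: desired phonetic properties
    weights: dict            # w: per-dimension influence on generation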
Architecture of BRAINSUP
The sentence generation process is based on morpho-syntactic patterns which we automatically discover from a corpus of dependency-parsed sentences, P.
Architecture of BRAINSUP
Algorithm 1 provides a high-level description of the creative sentence generation process.
Evaluation
We did so to simulate the brainstorming phase behind the slogan generation process, where copywriters start with a set of relevant keywords to come up with a catchy slogan.
Evaluation
Nonetheless, it is very encouraging to observe that the generation process does not deteriorate the positive impact of the input keywords.
generative process is mentioned in 9 sentences in this paper.
Mukherjee, Arjun and Liu, Bing
Model
This observation motivates the generative process of our model.
Model
Referring to the notations in Table 1, we explain the generative process of JTE-P.
Model
We now detail the generative process of JTE-P (plate notation in Figure 1) as follows:
Phrase Ranking based on Relevance
This thread of research models bigrams by encoding them into the generative process.
Related Work
The generative process of ASMs is, however, different from our model.
generative process is mentioned in 5 sentences in this paper.
Lei, Tao and Long, Fan and Barzilay, Regina and Rinard, Martin
Introduction
We model our problem as a joint dependency parsing and role labeling task, assuming a Bayesian generative process .
Model
Modeling the Generative Process.
Model
The generative process is described formally as follows:
Related Work
While previous approaches rely on the feedback to train a discriminative prediction model, our approach models a generative process to guide structure predictions when the feedback is noisy or unavailable.
generative process is mentioned in 4 sentences in this paper.
Börschinger, Benjamin and Johnson, Mark and Demuth, Katherine
Discussion
Using a more realistic generative process for the underlying forms, for example an Adaptor Grammar (Johnson et al., 2007), could address this shortcoming in future work without changing the overall architecture of the model, although novel inference algorithms might be required.
The computational model
This generative process is repeated for each utterance i, leading to multiple utterances of the form U_i.
The computational model
The generative process mimics the intuitively plausible idea of generating underlying forms from some kind of syntactic model (here, a Bigram language model) and then mapping the underlying form to an observed surface-form through the application of a phonological rule component, here represented by the collection of rule probabilities ρ_c.
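A minimal sketch of this two-stage story, with made-up probabilities and a single word-final /t/-deletion rule standing in for the rule component (assumptions mine, not the paper's model):

import random

def sample_underlying(bigrams, start="<s>", stop="</s>"):
    # Stage 1: draw an underlying word sequence from a bigram language model.
    word, form = start, []
    while True:
        candidates, probs = zip(*bigrams[word].items())
        word = random.choices(candidates, weights=probs)[0]
        if word == stop:
            return form
        form.append(word)

def apply_phonology(form, rho=0.3):
    # Stage 2: map underlying to surface forms; here a single rule that
    # deletes a word-final "t" with probability rho, independently per word.
    return [w[:-1] if w.endswith("t") and random.random() < rho else w
            for w in form]

bigrams = {"<s>": {"want": 0.5, "eat": 0.5},
           "want": {"it": 1.0}, "eat": {"it": 1.0},
           "it": {"</s>": 1.0}}
print(apply_phonology(sample_underlying(bigrams)))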
generative process is mentioned in 3 sentences in this paper.
Cohn, Trevor and Haffari, Gholamreza
Model
The generative process of the model follows that of ITG with the following simple grammar
Model
The generative process is that we draw a complete ITG tree, t ∼ P2, as follows:
Model
depending on r. This generative process is mutually recursive: P2 makes draws from P1 and P1 makes draws from P2.
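A toy sketch of such mutual recursion between two distributions (the probabilities and phrase pairs are invented for illustration; this is not the paper's actual parameterization):

import random

def draw_p2(depth=0):
    # P2: expand a node with the simple ITG grammar
    # X -> [X X] (monotone) | <X X> (inverted) | phrase pair
    r = random.random()
    if depth < 3 and r < 0.3:
        return ("[]", draw_p2(depth + 1), draw_p2(depth + 1))
    if depth < 3 and r < 0.5:
        return ("<>", draw_p2(depth + 1), draw_p2(depth + 1))
    return draw_p1(depth)

def draw_p1(depth):
    # P1: base distribution over aligned phrase pairs; with small
    # probability it recurses back into P2, hence the mutual recursion.
    if random.random() < 0.1:
        return draw_p2(depth + 1)
    return (random.choice(["casa", "perro"]),
            random.choice(["house", "dog"]))

print(draw_p2())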
generative process is mentioned in 3 sentences in this paper.
Feng, Yang and Cohn, Trevor
Experiments
Decoding under our model would be straightforward in principle, as the generative process was designed to closely parallel the search procedure in the phrase-based model. Three data sets were used in the experiments: two Chinese-to-English data sets on small (IWSLT) and larger (FBIS) corpora, and Arabic
Model
The generative process employs the following recursive procedure to construct the target sentence conditioned on the source:
Model
This generative process resembles the sequence of translation decisions considered by a standard MT decoder (Koehn et al., 2003), but note that our approach differs in that there is no constraint that all words are translated exactly once.
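A hedged sketch of such a recursive construction without the translate-exactly-once constraint (the translation table and stopping probability are invented; this is not the authors' actual procedure):

import random

def generate_target(source, table, p_stop=0.3):
    # Pick any source word (it may be reused or never chosen at all,
    # unlike standard phrase-based decoding), emit a translation for it,
    # then recursively continue or stop.
    src = random.choice(source)
    out = [random.choice(table[src])]
    if random.random() < p_stop:
        return out
    return out + generate_target(source, table, p_stop)

table = {"hola": ["hello", "hi"], "mundo": ["world"]}
print(generate_target(["hola", "mundo"], table))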
generative process is mentioned in 3 sentences in this paper.
Ravi, Sujith
Decipherment Model for Machine Translation
So, instead we use a simplified generative process for the translation model as proposed by Ravi and Knight (2011b) and used by others (Nuhn et al., 2012) for this task:
Decipherment Model for Machine Translation
We now extend the generative process (described earlier) to more complex translation models.
Decipherment Model for Machine Translation
Nonlocal Reordering: The generative process described earlier limits reordering to local or adjacent word pairs in a source sentence.
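Putting the quoted pieces together, a toy version of such a decipherment-style generative story with only adjacent-pair reordering might look like this (all tables and probabilities invented; a sketch, not the paper's model):

import random

def generate(bigrams, table, p_swap=0.2, start="<s>", stop="</s>"):
    # Draw a source sentence from a bigram language model ...
    src, w = [], start
    while True:
        cands, probs = zip(*bigrams[w].items())
        w = random.choices(cands, weights=probs)[0]
        if w == stop:
            break
        src.append(w)
    # ... translate each word independently ...
    tgt = [random.choice(table[w]) for w in src]
    # ... and reorder only locally, by swapping adjacent word pairs.
    i = 0
    while i + 1 < len(tgt):
        if random.random() < p_swap:
            tgt[i], tgt[i + 1] = tgt[i + 1], tgt[i]
        i += 2
    return tgt

bigrams = {"<s>": {"hello": 1.0}, "hello": {"world": 1.0},
           "world": {"</s>": 1.0}}
table = {"hello": ["hola"], "world": ["mundo"]}
print(generate(bigrams, table))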
generative process is mentioned in 3 sentences in this paper.