Index of papers in Proc. ACL 2013 that mention
  • recursive
Socher, Richard and Bauer, John and Manning, Christopher D. and Ng, Andrew Y.
Abstract
Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations.
Introduction
The vectors for nonterminals are computed via a new type of recursive neural network which is conditioned on syntactic categories from a PCFG.
Introduction
CVGs combine the advantages of standard probabilistic context-free grammars (PCFGs) with those of recursive neural networks (RNNs).
Introduction
sets of discrete states and recursive deep learning models that jointly learn classifiers and continuous feature representations for variable-sized inputs.
recursive is mentioned in 11 sentences in this paper.
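The excerpts above describe a recursive neural network whose composition is conditioned on syntactic categories from a PCFG ("syntactically untied"). A minimal sketch of that idea, with illustrative categories and dimensions that are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4

# One composition matrix per (left, right) child-category pair: the
# "syntactically untied" idea is that the weights used to merge two
# children depend on their PCFG categories. Categories here are
# illustrative, and the weights are random rather than trained.
categories = ["NP", "VP", "DT", "NN", "V"]
W = {(a, b): rng.normal(scale=0.1, size=(DIM, 2 * DIM))
     for a in categories for b in categories}
b = {key: np.zeros(DIM) for key in W}

def compose(left_vec, left_cat, right_vec, right_cat):
    """Merge two child vectors into a parent vector, choosing the
    weight matrix by the children's syntactic categories."""
    key = (left_cat, right_cat)
    child = np.concatenate([left_vec, right_vec])
    return np.tanh(W[key] @ child + b[key])

# A determiner and a noun compose into an NP vector.
the = rng.normal(size=DIM)
cat = rng.normal(size=DIM)
np_vec = compose(the, "DT", cat, "NN")
```

The same `compose` step is applied bottom-up over a parse tree, so every nonterminal receives a vector of the same fixed dimension.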
Hermann, Karl Moritz and Blunsom, Phil
Background
Another set of models that have very successfully been applied in this area are recursive autoencoders (Socher et al., 2011a; Socher et al., 2011b), which are discussed in the next section.
Background
2.3 Recursive Autoencoders
Background
Extending this idea, recursive autoencoders (RAE) allow the modelling of data of variable size.
Experiments
In this paper we have brought a more formal notion of semantic compositionality to vector space models based on recursive autoencoders.
Introduction
We present a novel class of recursive models, the Combinatory Categorial Autoencoders (CCAE), which marry a semantic process provided by a recursive autoencoder with the syntactic representations of the CCG formalism.
Introduction
questions: Can recursive vector space models be reconciled with a more formal notion of compositionality; and is there a role for syntax in guiding semantics in these types of models?
Introduction
In terms of learning complexity and space requirements, our models strike a balance between simpler greedy approaches (Socher et al., 2011b) and the larger recursive vector-matrix models (Socher et al., 2012b).
Model
The models in this paper combine the power of recursive, vector-based models with the linguistic intuition of the CCG formalism.
recursive is mentioned in 10 sentences in this paper.
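The excerpts above mention recursive autoencoders that model variable-sized input. A toy sketch of the greedy RAE pattern (untrained random weights; all names illustrative, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 3

# Encoder/decoder weights. In a real RAE these are trained to
# minimise reconstruction error; here they are random.
W_enc = rng.normal(scale=0.1, size=(DIM, 2 * DIM))
W_dec = rng.normal(scale=0.1, size=(2 * DIM, DIM))

def encode(c1, c2):
    """Compress two child vectors into one parent vector."""
    return np.tanh(W_enc @ np.concatenate([c1, c2]))

def reconstruction_error(c1, c2):
    """Autoencoder objective: decode the parent, compare to children."""
    parent = encode(c1, c2)
    recon = np.tanh(W_dec @ parent)
    return float(np.sum((recon - np.concatenate([c1, c2])) ** 2))

def encode_sequence(vectors):
    """Greedy RAE over a variable-length sequence: repeatedly merge
    the adjacent pair with the lowest reconstruction error until one
    vector remains, yielding a fixed-size representation."""
    vecs = list(vectors)
    while len(vecs) > 1:
        i = min(range(len(vecs) - 1),
                key=lambda j: reconstruction_error(vecs[j], vecs[j + 1]))
        vecs[i:i + 2] = [encode(vecs[i], vecs[i + 1])]
    return vecs[0]

sentence = [rng.normal(size=DIM) for _ in range(5)]
rep = encode_sequence(sentence)
```

Because every merge maps two `DIM`-vectors to one `DIM`-vector, any input length collapses to a single fixed-size representation.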
Cohn, Trevor and Haffari, Gholamreza
Abstract
This paper presents a novel method for inducing phrase-based translation units directly from parallel data, which we frame as learning an inverse transduction grammar (ITG) using a recursive Bayesian prior.
Analysis
We have presented a novel method for learning a phrase-based model of translation directly from parallel data which we have framed as learning an inverse transduction grammar (ITG) using a recursive Bayesian prior.
Model
depending on r. This generative process is mutually recursive: P2 makes draws from P1 and P1 makes draws from P2.
Model
where the conditioning of the second recursive call to P2 reflects that the counts n and K may be affected by the first draw from P2.
Related Work
Additionally, we have extended the model to allow recursive nesting of adapted non-terminals, such that we end up with an infinitely recursive formulation where the top-level and base distributions are explicitly linked together.
recursive is mentioned in 5 sentences in this paper.
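The excerpts above describe a mutually recursive generative process in which P2 draws from P1 and P1 draws from P2, with later draws conditioned on counts updated by earlier ones. A toy sketch of that pattern, with P2 as a Chinese-restaurant-style cache over a base generator P1 — the probabilities, alphabet, and names are illustrative, not the paper's model:

```python
import random

random.seed(0)

ALPHA = 1.0   # CRP-style concentration parameter (illustrative)
cache = []    # previously generated structures (the "counts")

def p1():
    """Base generator: a terminal symbol, or a pair of recursive
    draws from P2 (so P1 makes draws from P2)."""
    if random.random() < 0.6:
        return random.choice("abc")
    # The second p2() call sees counts updated by the first draw.
    return (p2(), p2())

def p2():
    """Cache/adaptor: reuse a previous draw with probability
    proportional to its count, otherwise draw fresh from P1
    (so P2 makes draws from P1) and cache the result."""
    n = len(cache)
    if n > 0 and random.random() < n / (n + ALPHA):
        return random.choice(cache)
    x = p1()
    cache.append(x)
    return x

samples = [p2() for _ in range(20)]
```

The mutual recursion terminates because each call to `p1` produces a terminal with high enough probability, and the cache makes frequently generated structures increasingly likely to be reused.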
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Conclusions and Future Work
We studied the internal structures of 37,382 Chinese words, analyzing their structures as the recursive combinations of characters.
Introduction
(constituent) trees, adding recursive structures of characters for words.
Word Structures and Syntax Trees
Multi-character words can also have recursive syntactic structures.
Word Structures and Syntax Trees
Our annotations are binarized recursive word structures.
recursive is mentioned in 4 sentences in this paper.
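The excerpts above describe annotating words as binarized recursive combinations of characters. A minimal sketch of such a structure as nested pairs — the example word and its bracketing are illustrative, not taken from the paper's annotations:

```python
# A word's internal structure as a binarized tree of characters:
# every internal node combines exactly two parts, each of which is
# a character or a sub-word. Example: 副总理 ("vice premier") as
# 副 ("vice-") attached to the sub-word 总理 ("premier").
word_tree = ("副", ("总", "理"))

def characters(node):
    """Read the characters back off the recursive structure, in order."""
    if isinstance(node, str):
        return node
    left, right = node
    return characters(left) + characters(right)

assert characters(word_tree) == "副总理"
```

Because every node is binary, the same recursive traversal works for words of any length, mirroring how recursive syntactic structure is extended below the word level.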