Index of papers in Proc. ACL that mention
  • recursive
Kozareva, Zornitsa and Hovy, Eduard
Abstract
We propose a minimally supervised bootstrapping algorithm that uses a single seed and a recursive lexico-syntactic pattern to learn the arguments and the supertypes of a diverse set of semantic relations from the Web.
Introduction
Given these considerations, we address in this paper the following question: How can the selectional restrictions of semantic relations be learned automatically from the Web with minimal effort using lexico-syntactic recursive patterns?
Introduction
• A novel representation of semantic relations using recursive lexico-syntactic patterns.
Introduction
Section 3 addresses the representation of semantic relations using recursive patterns.
Recursive Patterns
Learned terms can then be replaced into the seed position automatically, creating a recursive procedure that is reportedly much more accurate and has much higher final yield.
Recursive Patterns
No other study has described the use or effect of recursive patterns for different semantic relations.
Recursive Patterns
Therefore, going beyond (Kozareva et al., 2008; Hovy et al., 2009), we here introduce recursive patterns other than DAP that use only one seed to harvest the arguments and supertypes of a wide variety of relations.
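A minimal sketch (assuming a hypothetical `search_web` matcher and a placeholder pattern template, neither of which comes from the paper) of the recursive bootstrapping idea described in these excerpts: terms harvested in one iteration are fed back into the seed slot of the lexico-syntactic pattern in the next.

```python
def search_web(query):
    """Placeholder: return strings matching the instantiated pattern on the Web."""
    return []

def recursive_bootstrap(seed, pattern="{seed} such as * and *", max_iters=5):
    """Harvest terms by recursively re-seeding a lexico-syntactic pattern."""
    harvested = set()
    frontier = {seed}
    for _ in range(max_iters):
        new_terms = set()
        for term in frontier:
            query = pattern.format(seed=term)
            for match in search_web(query):
                if match not in harvested:
                    new_terms.add(match)
        harvested |= new_terms
        frontier = new_terms          # learned terms become the new seeds
        if not frontier:
            break
    return harvested
```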
recursive is mentioned in 25 sentences in this paper.
Topics mentioned in this paper:
Zhang, Jiajun and Liu, Shujie and Li, Mu and Zhou, Ming and Zong, Chengqing
Abstract
We propose Bilingually-constrained Recursive Auto-encoders (BRAE) to learn semantic phrase embeddings (compact vector representations for phrases), which can distinguish the phrases with different semantic meanings.
Bilingually-constrained Recursive Auto-encoders
This section introduces the Bilingually-constrained Recursive Auto-encoders (BRAE), that is inspired by two observations.
Bilingually-constrained Recursive Auto-encoders
First, the recursive auto-encoder provides a reasonable composition mechanism to embed each phrase.
Bilingually-constrained Recursive Auto-encoders
Figure 2: A recursive auto-encoder for a four-word phrase.
Introduction
of its internal words, we propose Bilingually-constrained Recursive Auto-encoders (BRAE) to learn semantic phrase embeddings.
Introduction
In our method, the standard recursive auto-encoder (RAE) pre-trains the phrase embedding with an unsupervised algorithm by minimizing the reconstruction error (Socher et al., 2010), while the bilingually-constrained model learns to fine-tune the phrase embedding by minimizing the semantic distance between translation equivalents and maximizing the semantic distance between non-translation pairs.
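A minimal numpy sketch of the standard recursive auto-encoder step referenced above, with illustrative dimensions and random weights: each merge composes two child vectors into a parent and is scored by how well the parent reconstructs its children. The bilingual constraint and fine-tuning of BRAE are omitted.

```python
import numpy as np

d = 4                                             # illustrative embedding dimension
rng = np.random.default_rng(0)
W_enc = rng.normal(scale=0.1, size=(d, 2 * d))    # composition weights
W_dec = rng.normal(scale=0.1, size=(2 * d, d))    # reconstruction weights
b_enc, b_dec = np.zeros(d), np.zeros(2 * d)

def compose(c1, c2):
    """Encode two child vectors into one parent vector."""
    return np.tanh(W_enc @ np.concatenate([c1, c2]) + b_enc)

def reconstruction_error(c1, c2):
    """Auto-encoder objective for one node: rebuild the children from the parent."""
    p = compose(c1, c2)
    c_hat = W_dec @ p + b_dec
    return float(np.sum((np.concatenate([c1, c2]) - c_hat) ** 2))

# Greedily build a phrase vector for a four-word phrase (cf. Figure 2 above),
# always merging the adjacent pair with the lowest reconstruction error.
words = [rng.normal(size=d) for _ in range(4)]
while len(words) > 1:
    errs = [reconstruction_error(words[i], words[i + 1]) for i in range(len(words) - 1)]
    i = int(np.argmin(errs))
    words[i:i + 2] = [compose(words[i], words[i + 1])]
phrase_vec = words[0]
```

In BRAE this unsupervised objective is the pre-training stage; the phrase embeddings are then fine-tuned with the bilingual distance constraints described above.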
Related Work
Instead, our bilingually-constrained recursive auto-encoders not only learn the composition mechanism of generating phrases from words, but also fine tune the word embeddings during the model training stage, so that we can induce the full information of the phrases and internal words.
Related Work
The recursive auto-encoder is typically adopted to learn the way of composition (Socher et al., 2010; Socher et al., 2011; Socher et al., 2013a; Socher et al., 2013b; Li et al., 2013).
recursive is mentioned in 13 sentences in this paper.
Topics mentioned in this paper:
Liu, Shujie and Yang, Nan and Li, Mu and Zhou, Ming
Abstract
In this paper, we propose a novel recursive recurrent neural network (R²NN) to model the end-to-end decoding process for statistical machine translation.
Abstract
R²NN is a combination of recursive neural network and recurrent neural network, and in turn integrates their respective capabilities: (1) new information can be used to generate the next hidden state, like recurrent neural networks, so that language model and translation model can be integrated naturally; (2) a tree structure can be built, as recursive neural networks, so as to generate the translation candidates in a bottom-up manner.
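A toy sketch of how capabilities (1) and (2) can coexist in a single node update, assuming one tanh layer and made-up weight names; this illustrates the general combination of recursive (children) and recurrent-style (new input) information, not the paper's actual R²NN architecture.

```python
import numpy as np

d = 4
rng = np.random.default_rng(1)
W_rec = rng.normal(scale=0.1, size=(d, 2 * d))   # recursive (children) weights
W_inp = rng.normal(scale=0.1, size=(d, d))       # recurrent-style input weights

def combine(left, right, new_info):
    """Parent state from two child states plus a vector of new information."""
    return np.tanh(W_rec @ np.concatenate([left, right]) + W_inp @ new_info)

# Build a small derivation bottom-up: leaves are phrase states, and each merge
# also consumes new information, so tree structure and recurrent input mix.
leaves = [rng.normal(size=d) for _ in range(3)]
ctx = [rng.normal(size=d) for _ in range(2)]
node = combine(leaves[0], leaves[1], ctx[0])
root = combine(node, leaves[2], ctx[1])
```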
Introduction
Recursive neural networks, which have the ability to generate a tree structured output, are applied to natural language parsing (Socher et al., 2011), and they are extended to recursive neural tensor networks to explore the compositional aspect of semantics (Socher et al., 2013).
Introduction
(2013) use recursive auto-encoders to make full use of the entire merging phrase pairs, going beyond the boundary words with a maximum entropy classifier (Xiong et al., 2006).
Introduction
R²NN is a combination of recursive neural network and recurrent neural network.
Related Work
(2013) propose to apply recursive auto-encoder to make full use of the entire merged blocks.
recursive is mentioned in 37 sentences in this paper.
Topics mentioned in this paper:
Dong, Li and Wei, Furu and Tan, Chuanqi and Tang, Duyu and Zhou, Ming and Xu, Ke
Abstract
We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
Introduction
In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
Introduction
RNN utilizes the recursive structure of text, and it has achieved state-of-the-art sentiment analysis results for movie review dataset (Socher et al., 2012; Socher et al., 2013).
Introduction
The recursive neural models employ semantic composition functions, which enable them to handle the complex compositionalities in sentiment analysis.
Our Approach
Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
Our Approach
In Section 3.1, we show how to build recursive structure for target using the dependency parsing results.
Our Approach
In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
RNN: Recursive Neural Network
Figure 1: The composition process for “not very good” in Recursive Neural Network.
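A worked toy example of the composition process named in the figure caption above, assuming a single shared composition matrix and random word vectors (all values are illustrative): “very” and “good” are composed first, and the result is then composed with “not”.

```python
import numpy as np

d = 3
rng = np.random.default_rng(2)
W = rng.normal(scale=0.5, size=(d, 2 * d))   # shared composition matrix

def compose(a, b):
    """Parent vector from two child vectors."""
    return np.tanh(W @ np.concatenate([a, b]))

vec = {w: rng.normal(size=d) for w in ("not", "very", "good")}
very_good = compose(vec["very"], vec["good"])     # inner node "very good"
not_very_good = compose(vec["not"], very_good)    # root of "not very good"
```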
recursive is mentioned in 14 sentences in this paper.
Topics mentioned in this paper:
Socher, Richard and Bauer, John and Manning, Christopher D. and Andrew Y., Ng
Abstract
Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations.
Introduction
The vectors for nonterminals are computed via a new type of recursive neural network which is conditioned on syntactic categories from a PCFG.
Introduction
1. CVGs combine the advantages of standard probabilistic context free grammars (PCFG) with those of recursive neural networks (RNNs).
Introduction
sets of discrete states and recursive deep learning models that jointly learn classifiers and continuous feature representations for variable-sized inputs.
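A hedged sketch of the syntactic untying idea, assuming a small made-up category set: the composition matrix is selected by the children's syntactic categories, and the resulting node score is added to the PCFG rule log-probability. This illustrates the mechanism rather than reproducing Socher et al.'s exact CVG.

```python
import numpy as np

d = 4
rng = np.random.default_rng(3)
categories = ("NP", "VP", "PP")
# One composition matrix per ordered pair of child categories (untied weights).
W = {(a, b): rng.normal(scale=0.1, size=(d, 2 * d))
     for a in categories for b in categories}
score_vec = rng.normal(size=d)

def compose(left_vec, left_cat, right_vec, right_cat, pcfg_logprob):
    """Category-conditioned composition plus a combined CVG-style score."""
    p = np.tanh(W[(left_cat, right_cat)] @ np.concatenate([left_vec, right_vec]))
    score = float(score_vec @ p) + pcfg_logprob
    return p, score

np_vec, vp_vec = rng.normal(size=d), rng.normal(size=d)
s_vec, s_score = compose(np_vec, "NP", vp_vec, "VP", pcfg_logprob=-2.3)
```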
recursive is mentioned in 11 sentences in this paper.
Topics mentioned in this paper:
Zhu, Xiaodan and Guo, Hongyu and Mohammad, Saif and Kiritchenko, Svetlana
Abstract
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Conclusions
We further make the models dependent on the text being modified by negators, through adaptation of a state-of-the-art recursive neural network to incorporate the syntax and semantics of the arguments; we discover this further reduces fitting errors.
Experimental results
Furthermore, modeling the syntax and semantics with the state-of-the-art recursive neural network (model 7 and 8) can dramatically improve the performance over model 6.
Introduction
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Related work
The more recent work of (Socher et al., 2012; Socher et al., 2013) proposed models based on recursive neural networks that do not rely on any heuristic rules.
Semantics-enriched modeling
For the former, we adopt the recursive neural tensor network (RNTN) proposed recently by Socher et al.
Semantics-enriched modeling
4.1 RNTN: Recursive neural tensor network
Semantics-enriched modeling
A recursive neural tensor network (RNTN) is a specific form of feed-forward neural network based on syntactic (phrasal-structure) parse tree to conduct compositional sentiment analysis.
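The tensor-based composition that distinguishes an RNTN from a plain recursive network can be sketched as follows; dimensions and random initialization are used purely for illustration.

```python
import numpy as np

d = 3
rng = np.random.default_rng(4)
V = rng.normal(scale=0.1, size=(d, 2 * d, 2 * d))   # composition tensor
W = rng.normal(scale=0.1, size=(d, 2 * d))          # standard-RNN part

def rntn_compose(c1, c2):
    """p = tanh(c^T V c + W c) with c = [c1; c2], computed slice by slice."""
    c = np.concatenate([c1, c2])
    tensor_part = np.array([c @ V[k] @ c for k in range(d)])
    return np.tanh(tensor_part + W @ c)

parent = rntn_compose(rng.normal(size=d), rng.normal(size=d))
```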
recursive is mentioned in 10 sentences in this paper.
Topics mentioned in this paper:
Hermann, Karl Moritz and Blunsom, Phil
Background
Another set of models that have very successfully been applied in this area are recursive autoencoders (Socher et al., 2011a; Socher et al., 2011b), which are discussed in the next section.
Background
2.3 Recursive Autoencoders
Background
Extending this idea, recursive autoencoders (RAE) allow the modelling of data of variable size.
Experiments
In this paper we have brought a more formal notion of semantic compositionality to vector space models based on recursive autoencoders.
Introduction
We present a novel class of recursive models, the Combinatory Categorial Autoencoders (CCAE), which marry a semantic process provided by a recursive autoencoder with the syntactic representations of the CCG formalism.
Introduction
tions: Can recursive vector space models be reconciled with a more formal notion of compositionality; and is there a role for syntax in guiding semantics in these types of models?
Introduction
In terms of learning complexity and space requirements, our models strike a balance between simpler greedy approaches (Socher et al., 2011b) and the larger recursive vector-matrix models (Socher et al., 2012b).
Model
The models in this paper combine the power of recursive, vector-based models with the linguistic intuition of the CCG formalism.
recursive is mentioned in 10 sentences in this paper.
Topics mentioned in this paper:
Lavergne, Thomas and Cappé, Olivier and Yvon, François
Conditional Random Fields
and backward recursions
Conditional Random Fields
These recursions require a number of operations that grows quadratically with
Conditional Random Fields
One advantage of the resulting algorithm, termed BCD in the following, is that the update of θ_k only involves carrying out the forward-backward recursions for the set of sequences that contain symbols x such that at least one {f_k(y′, y, x)}_{(y,y′)∈Y²} is non-null, which can be much smaller than the whole training set.
Introduction
(2009), who use approximations to simplify the forward-backward recursions.
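For reference, the forward and backward recursions discussed in these excerpts take the standard linear-chain CRF form (textbook notation, not necessarily the paper's exact formulation):

```latex
\alpha_t(y) = \sum_{y'} \alpha_{t-1}(y')\,
      \exp\Big(\sum_k \theta_k f_k(y', y, x_t)\Big), \qquad
\beta_t(y) = \sum_{y'} \beta_{t+1}(y')\,
      \exp\Big(\sum_k \theta_k f_k(y, y', x_{t+1})\Big).
```

Each position sums over all pairs of labels, which is the quadratic growth in the number of output labels mentioned above.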
recursive is mentioned in 10 sentences in this paper.
Topics mentioned in this paper:
Kalchbrenner, Nal and Grefenstette, Edward and Blunsom, Phil
Background
A model that adopts a more general structure provided by an external parse tree is the Recursive Neural Network (RecNN) (Pollack, 1990; Küchler and Goller, 1996; Socher et al., 2011; Hermann and Blunsom, 2013).
Background
The Recurrent Neural Network (RNN) is a special case of the recursive network where the structure that is followed is a simple linear chain (Gers and Schmidhuber, 2001; Mikolov et al., 2011).
Experiments
RECNTN is a recursive neural network with a tensor-based feature function, which relies on external structural features given by a parse tree and performs best among the RecNNs.
Introduction
These range from basic neural bag-of-words or bag-of-n-grams models to the more structured recursive neural networks and to time-delay neural networks based on convolutional operations (Collobert and Weston, 2008; Socher et al., 2011; Kalchbrenner and Blunsom, 2013b).
Properties of the Sentence Model
Similarly, a recursive neural network is sensitive to word order but has a bias towards the topmost nodes in the tree; shallower trees mitigate this effect to some extent (Socher et al., 2013a).
Properties of the Sentence Model
The recursive neural network follows the structure of an external parse tree.
recursive is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Mylonakis, Markos and Sima'an, Khalil
Experiments
As an example, while our probabilistic HR-SCFG maintains a separate joint phrase-pair emission distribution per nonterminal, the smoothing features (a) above assess the conditional translation of surface phrases irrespective of any notion of recursive translation structure.
Introduction
As Hiero uses a single nonterminal and concentrates on overcoming translation lexicon sparsity, it barely explores the recursive nature of translation past the lexical level.
Introduction
By advancing from structures which mimic linguistic syntax, to learning linguistically aware latent recursive structures targeting translation, we achieve significant improvements in translation quality for 4 different language pairs in comparison with a strong hierarchical translation baseline.
Joint Translation Model
Figure 2: Recursive Reordering Grammar rule categories; A, B, C non-terminals; α, β source and target strings respectively.
Joint Translation Model
structural part and their associated probabilities define a model p(σ) over the latent variable σ determining the recursive reordering and phrase-pair segmenting structure of translation, as in Figure 4.
Learning Translation Structure
We aim to induce a recursive translation structure explaining the joint generation of the source and target
recursive is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Cohn, Trevor and Haffari, Gholamreza
Abstract
This paper presents a novel method for inducing phrase-based translation units directly from parallel data, which we frame as learning an inverse transduction grammar (ITG) using a recursive Bayesian prior.
Analysis
We have presented a novel method for learning a phrase-based model of translation directly from parallel data which we have framed as learning an inverse transduction grammar (ITG) using a recursive Bayesian prior.
Model
depending on r. This generative process is mutually recursive: P2 makes draws from P1 and P1 makes draws from P2.
Model
where the conditioning of the second recursive call to P2 reflects that the counts n⁻ and K⁻ may be affected by the first draw from P2.
Related Work
Additionally, we have extended the model to allow recursive nesting of adapted non-terminals, such that we end up with an infinitely recursive formulation where the top-level and base distributions are explicitly linked together.
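The mutually recursive generative process can be pictured with a toy sketch in which one distribution either emits a terminal phrase pair or composes recursive draws from the other; the phrase pairs, stop probability, and monotone/inverted choice below are placeholders, not the paper's recursive Bayesian prior over ITG derivations.

```python
import random

random.seed(0)
PHRASE_PAIRS = [("maison", "house"), ("chat", "cat"), ("bleu", "blue")]

def draw_p2(stop_prob=0.6):
    """P2: either emit a terminal phrase pair or recurse via P1."""
    if random.random() < stop_prob:
        return random.choice(PHRASE_PAIRS)
    return draw_p1()

def draw_p1():
    """P1: compose two draws from P2, monotone or inverted as in an ITG."""
    left, right = draw_p2(), draw_p2()
    if random.random() < 0.5:
        return ("[{} {}]".format(left[0], right[0]),
                "[{} {}]".format(left[1], right[1]))   # monotone order
    return ("[{} {}]".format(left[0], right[0]),
            "[{} {}]".format(right[1], left[1]))       # inverted order

print(draw_p2())
```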
recursive is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Kulick, Seth and Bies, Ann and Mott, Justin and Kroch, Anthony and Santorini, Beatrice and Liberman, Mark
Analysis of parsing results
attachment score does not apply to the recursive categories, as mentioned above.
Framework for analyzing parsing performance
As described above, we are also interested in the type of linguistic construction represented by that one-level structure, each of which instantiates one of a few types - recursive coordination, simple head-and-sister, etc.
Framework for analyzing parsing performance
(c) NP-modr is a regex for a recursive NP with a right modifier.
Framework for analyzing parsing performance
(d) VP-crd is also a regex for a recursive structure, in this case for VP coordination, picking out the leftmost conjunct as the head of the structure.
recursive is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Conclusions and Future Work
We studied the internal structures of more than 37,382 Chinese words, analyzing their structures as the recursive combinations of characters.
Introduction
(constituent) trees, adding recursive structures of characters for words.
Word Structures and Syntax Trees
Multi-character words can also have recursive syntactic structures.
Word Structures and Syntax Trees
Our annotations are binarized recursive word
recursive is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Dinu, Georgiana and Baroni, Marco
General framework
longer phrases is handled by recursive extension of the two-word case.
General framework
3.3 Recursive (de)composition
Noun phrase generation
5.2 Recursive decomposition
Noun phrase generation
We continue by testing generation through recursive decomposition on the task of generating noun-preposition-noun (NPN) paraphrases of adjective-nouns (AN) phrases.
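A minimal sketch of generation by recursive decomposition, under the simplifying assumption that composition is a single linear map: a phrase vector is split back into two constituent vectors with the pseudo-inverse of the composition matrix, and the split can be applied again to either constituent. The matrix and vectors are illustrative, not the paper's trained (de)composition functions.

```python
import numpy as np

d = 4
rng = np.random.default_rng(5)
W = rng.normal(scale=0.5, size=(d, 2 * d))     # toy linear composition map
W_pinv = np.linalg.pinv(W)                     # used for decomposition

def compose(u, v):
    """Phrase vector from two constituent vectors."""
    return W @ np.concatenate([u, v])

def decompose(p):
    """Split a phrase vector back into estimates of its two constituents."""
    uv = W_pinv @ p
    return uv[:d], uv[d:]

# Two-word case, then the recursive extension: decompose one constituent again.
an_phrase = compose(rng.normal(size=d), rng.normal(size=d))
adj_hat, noun_hat = decompose(an_phrase)
left_hat, right_hat = decompose(adj_hat)       # recursive application
```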
recursive is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Tan, Ming and Zhou, Wenli and Zheng, Lei and Wang, Shaojun
Training algorithm
each model parameter over sentence W_l in document d in the training corpus D. For the WORD-PREDICTOR and the SEMANTIZER, the number of possible semantic annotation sequences is exponential, so we use forward-backward recursive formulas that are similar to those in hidden Markov models to compute the expected counts.
Training algorithm
In M-step, the recursive linear interpolation scheme (Jelinek and Mercer, 1981) is used to obtain a smooth probability estimate for each model component, WORD-PREDICTOR, TAGGER, and CONSTRUCTOR.
Training algorithm
The recursive mixing scheme is the standard one among relative frequency estimates of different orders k = 0, ..., n as explained in (Chelba and Jelinek, 2000).
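In its standard textbook form (not necessarily the paper's exact parameterization), the recursive Jelinek-Mercer interpolation referred to here mixes the order-k relative-frequency estimate with the smoothed estimate of order k-1:

```latex
p_k(w \mid h_k) = \lambda_{h_k}\, f(w \mid h_k)
  + \bigl(1 - \lambda_{h_k}\bigr)\, p_{k-1}(w \mid h_{k-1}),
  \qquad k = n, \dots, 1,
```

where h_k is the order-k context and p_0 is a uniform (or unigram) base distribution.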
recursive is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Iyyer, Mohit and Enns, Peter and Boyd-Graber, Jordan and Resnik, Philip
Abstract
Taking inspiration from recent work in sentiment analysis that successfully models the compositional aspect of language, we apply a recursive neural network (RNN) framework to the task of identifying the political position evinced by a sentence.
Conclusion
In this paper we apply recursive neural networks to political ideology detection, a problem where previous work relies heavily on bag-of-words models and hand-designed lexica.
Introduction
Building from those insights, we introduce a recursive neural network (RNN) to detect ideological bias on the sentence level.
Recursive Neural Networks
Recursive neural networks (RNNs) are machine learning models that capture syntactic and semantic composition.
recursive is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Wick, Michael and Singh, Sameer and McCallum, Andrew
Hierarchical Coreference
This partitioning can be recursive, i.e., each of these sets can be further partitioned, capturing candidate splits for an entity that can facilitate inference.
Hierarchical Coreference
In order to represent our recursive model of coreference, we include two types of factors: pairwise factors ψ_pw that measure compatibility between a child node-record and its parent, and unit-wise factors ψ_rw that measure compatibilities of the node-records themselves.
Introduction
First, the recursive nature of the tree (arbitrary depth and width) allows the model to adapt to different types of data and effectively compress entities of different scales (e.g., entities with more mentions may require a deeper hierarchy to compress).
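The recursive entity tree described in these excerpts can be pictured with a small data-structure sketch: each node summarizes its children, children may themselves be partitioned further, and a pairwise compatibility is scored between a child and its parent. The node features and scoring function below are illustrative placeholders, not the paper's factors.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """A node-record: either a mention (leaf) or a summary of child nodes."""
    features: dict
    children: List["Node"] = field(default_factory=list)

def pairwise_compat(child: Node, parent: Node) -> float:
    """Toy compatibility: fraction of parent features the child agrees with."""
    if not parent.features:
        return 0.0
    agree = sum(1 for k, v in parent.features.items()
                if child.features.get(k) == v)
    return agree / len(parent.features)

def tree_score(root: Node) -> float:
    """Sum child-parent compatibilities recursively over the whole entity tree."""
    return sum(pairwise_compat(c, root) + tree_score(c) for c in root.children)

m1 = Node({"name": "J. Smith", "org": "ACME"})
m2 = Node({"name": "John Smith", "org": "ACME"})
entity = Node({"org": "ACME"}, children=[m1, m2])
print(tree_score(entity))
```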
recursive is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Tang, Duyu and Wei, Furu and Yang, Nan and Zhou, Ming and Liu, Ting and Qin, Bing
Related Work
propose Recursive Neural Network (RNN) (2011b), matrix-vector RNN (2012) and Recursive Neural Tensor Network (RNTN) (2013b) to learn the compositionality of phrases of any length based on the representation of each pair of children recursively.
Related Work
(2013) present Combinatory Categorial Autoencoders to learn the compositionality of sentence, which marries the Combinatory Categorial Grammar with Recursive Autoencoder.
Related Work
(4) RAE: Recursive Autoencoder (Socher et al., 2011c) has been proven effective in many sentiment analysis tasks by learning compositionality automatically.
recursive is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Han, Xianpei and Zhao, Jun
The Structural Semantic Relatedness Measure
This definition is recursive, and the starting point we choose is the semantic relatedness in the edge.
The Structural Semantic Relatedness Measure
Thus our structural semantic relatedness has two components: the neighbor term of the previous recursive phase which captures the graph structure and the semantic relatedness which captures the edge information.
The Structural Semantic Relatedness Measure
Thus, the recursive form of the structural semantic relatedness s_ij between node i and node j can be written as:
recursive is mentioned in 3 sentences in this paper.
Topics mentioned in this paper: