Index of papers in Proc. ACL 2014 that mention
  • recursive
Dong, Li and Wei, Furu and Tan, Chuanqi and Tang, Duyu and Zhou, Ming and Xu, Ke
Abstract
We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
Introduction
In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
Introduction
RNN utilizes the recursive structure of text, and it has achieved state-of-the-art sentiment analysis results on the movie review dataset (Socher et al., 2012; Socher et al., 2013).
Introduction
The recursive neural models employ semantic composition functions, which enable them to handle the complex compositionality in sentiment analysis.
Our Approach
Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
Our Approach
In Section 3.1, we show how to build the recursive structure for a target using dependency parsing results.
Our Approach
In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
RNN: Recursive Neural Network
Figure 1: The composition process for “not very good” in Recursive Neural Network.
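To make the figure concrete, here is a minimal sketch of one composition step in a standard recursive neural network, p = tanh(W[c1; c2] + b); the dimensionality, parameters, and word vectors below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of recursive composition: p = tanh(W [c1; c2] + b).
# All parameters and word vectors here are random, illustrative stand-ins.
d = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d))
b = np.zeros(d)

def compose(c1, c2):
    """Merge two child vectors into a parent vector of the same size."""
    return np.tanh(W @ np.concatenate([c1, c2]) + b)

# "not very good": merge "very" and "good" first, then merge the result
# with "not", mirroring the bottom-up traversal sketched in Figure 1.
v_not, v_very, v_good = (rng.standard_normal(d) for _ in range(3))
not_very_good = compose(v_not, compose(v_very, v_good))
```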
recursive is mentioned in 14 sentences in this paper.
Liu, Shujie and Yang, Nan and Li, Mu and Zhou, Ming
Abstract
In this paper, we propose a novel recursive recurrent neural network (R²NN) to model the end-to-end decoding process for statistical machine translation.
Abstract
R²NN is a combination of a recursive neural network and a recurrent neural network, and in turn integrates their respective capabilities: (1) new information can be used to generate the next hidden state, as in recurrent neural networks, so that the language model and the translation model can be integrated naturally; (2) a tree structure can be built, as in recursive neural networks, so as to generate translation candidates in a bottom-up manner.
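As a rough illustration of that combination, the sketch below gives a node update that both merges two child states (the recursive part) and folds in an external input vector (the recurrent part); the parameter names and shapes are assumptions, not the paper's exact R²NN parameterization.

```python
import numpy as np

# Illustrative node update combining recursive and recurrent behavior;
# this is a sketch of the idea above, not the paper's exact model.
d = 4
rng = np.random.default_rng(0)
W_rec = rng.standard_normal((d, 2 * d))  # merges the two child states
W_in = rng.standard_normal((d, d))       # injects new information, as in an RNN step

def r2nn_node(left, right, x):
    """One bottom-up merge that also folds in fresh input x."""
    return np.tanh(W_rec @ np.concatenate([left, right]) + W_in @ x)

# Combining two partial translation candidates while reading new features:
h = r2nn_node(rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal(d))
```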
Introduction
Recursive neural networks, which have the ability to generate a tree-structured output, are applied to natural language parsing (Socher et al., 2011), and they are extended to recursive neural tensor networks to explore the compositional aspect of semantics (Socher et al., 2013).
Introduction
(2013) use recursive auto-encoders to make full use of the entire merged phrase pairs, going beyond the boundary words with a maximum entropy classifier (Xiong et al., 2006).
Introduction
R²NN is a combination of a recursive neural network and a recurrent neural network.
Related Work
(2013) propose to apply a recursive auto-encoder to make full use of the entire merged blocks.
recursive is mentioned in 37 sentences in this paper.
Zhang, Jiajun and Liu, Shujie and Li, Mu and Zhou, Ming and Zong, Chengqing
Abstract
We propose Bilingually-constrained Recursive Auto-encoders (BRAE) to learn semantic phrase embeddings (compact vector representations for phrases), which can distinguish the phrases with different semantic meanings.
Bilingually-constrained Recursive Auto-encoders
This section introduces the Bilingually-constrained Recursive Auto-encoders (BRAE), which is inspired by two observations.
Bilingually-constrained Recursive Auto-encoders
First, the recursive auto-encoder provides a reasonable composition mechanism to embed each phrase.
Bilingually-constrained Recursive Auto-encoders
Figure 2: A recursive auto-encoder for a four-word phrase.
Introduction
of its internal words, we propose Bilingually-constrained Recursive Auto-encoders (BRAE) to learn semantic phrase embeddings.
Introduction
In our method, the standard recursive auto-encoder (RAE) pre-trains the phrase embedding with an unsupervised algorithm by minimizing the reconstruction error (Socher et al., 2010), while the bilingually-constrained model learns to fine-tune the phrase embedding by minimizing the semantic distance between translation equivalents and maximizing the semantic distance between non-translation pairs.
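A minimal sketch of those two training signals, assuming a standard RAE encoder/decoder pair and a plain squared distance for the bilingual constraint; all parameter shapes and names here are illustrative.

```python
import numpy as np

# Sketch of the two objectives above: RAE reconstruction error for
# pre-training, and a semantic distance for bilingual fine-tuning.
d = 4
rng = np.random.default_rng(0)
W_enc = rng.standard_normal((d, 2 * d))
W_dec = rng.standard_normal((2 * d, d))

def encode(c1, c2):
    """Compose two child vectors into a phrase vector."""
    return np.tanh(W_enc @ np.concatenate([c1, c2]))

def reconstruction_error(c1, c2):
    """Unsupervised pre-training signal: decode the parent, compare children."""
    c1_hat, c2_hat = np.split(W_dec @ encode(c1, c2), 2)
    return np.sum((c1 - c1_hat) ** 2) + np.sum((c2 - c2_hat) ** 2)

def semantic_distance(src_vec, tgt_vec):
    """Fine-tuning signal: small for translation pairs, large for non-pairs."""
    return np.sum((src_vec - tgt_vec) ** 2)
```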
Related Work
Instead, our bilingually-constrained recursive auto-encoders not only learn the composition mechanism of generating phrases from words, but also fine-tune the word embeddings during the model training stage, so that we can capture the full information of the phrases and their internal words.
Related Work
The recursive auto-encoder is typically adopted to learn the way of composition (Socher et al., 2010; Socher et al., 2011; Socher et al., 2013a; Socher et al., 2013b; Li et al., 2013).
recursive is mentioned in 13 sentences in this paper.
Zhu, Xiaodan and Guo, Hongyu and Mohammad, Saif and Kiritchenko, Svetlana
Abstract
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Conclusions
We further make the models dependent on the text being modified by negators, by adapting a state-of-the-art recursive neural network to incorporate the syntax and semantics of the arguments; we find this further reduces fitting errors.
Experimental results
Furthermore, modeling the syntax and semantics with the state-of-the-art recursive neural network (models 7 and 8) can dramatically improve the performance over model 6.
Introduction
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Related work
More recent work (Socher et al., 2012; Socher et al., 2013) proposed models based on recursive neural networks that do not rely on any heuristic rules.
Semantics-enriched modeling
For the former, we adopt the recursive neural tensor network (RNTN) proposed recently by Socher et al.
Semantics-enriched modeling
4.1 RNTN: Recursive neural tensor network
Semantics-enriched modeling
A recursive neural tensor network (RNTN) is a specific form of feed-forward neural network based on a syntactic (phrase-structure) parse tree to conduct compositional sentiment analysis.
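Concretely, the RNTN adds a tensor term to the usual linear composition, so the two child vectors interact multiplicatively; the sketch below uses illustrative shapes and random parameters rather than the paper's.

```python
import numpy as np

# Sketch of RNTN composition: one tensor slice V[k] per output dimension,
# added to the standard linear layer. Shapes here are illustrative.
d = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d))
V = rng.standard_normal((d, 2 * d, 2 * d))

def rntn_compose(c1, c2):
    c = np.concatenate([c1, c2])
    tensor_term = np.array([c @ V[k] @ c for k in range(d)])  # c^T V[k] c
    return np.tanh(tensor_term + W @ c)
```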
recursive is mentioned in 10 sentences in this paper.
Kalchbrenner, Nal and Grefenstette, Edward and Blunsom, Phil
Background
A model that adopts a more general structure provided by an external parse tree is the Recursive Neural Network (RecNN) (Pollack, 1990; Küchler and Goller, 1996; Socher et al., 2011; Hermann and Blunsom, 2013).
Background
The Recurrent Neural Network (RNN) is a special case of the recursive network where the structure that is followed is a simple linear chain (Gers and Schmidhuber, 2001; Mikolov et al., 2011).
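That special case is easy to see in code: applying a recursive composition along a left-branching chain reduces to a plain recurrent update h_t = f(W[h_{t-1}; x_t]). Everything below is an illustrative sketch, not either paper's model.

```python
import numpy as np

# Running recursive composition along a linear chain = a recurrent update.
d = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d))

def compose(a, b):
    return np.tanh(W @ np.concatenate([a, b]))

h = np.zeros(d)
for x_t in rng.standard_normal((5, d)):  # a five-word "sentence"
    h = compose(h, x_t)                  # chain-structured recursion = recurrence
```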
Experiments
RECNTN is a recursive neural network with a tensor-based feature function, which relies on external structural features given by a parse tree and performs best among the RecNNs.
Introduction
These range from basic neural bag-of-words or bag-of-n-grams models to the more structured recursive neural networks and to time-delay neural networks based on convolutional operations (Collobert and Weston, 2008; Socher et al., 2011; Kalchbrenner and Blunsom, 2013b).
Properties of the Sentence Model
Similarly, a recursive neural network is sensitive to word order but has a bias towards the topmost nodes in the tree; shallower trees mitigate this effect to some extent (Socher et al., 2013a).
Properties of the Sentence Model
The recursive neural network follows the structure of an external parse tree.
recursive is mentioned in 6 sentences in this paper.
Kulick, Seth and Bies, Ann and Mott, Justin and Kroch, Anthony and Santorini, Beatrice and Liberman, Mark
Analysis of parsing results
The attachment score does not apply to the recursive categories, as mentioned above.
Framework for analyzing parsing performance
As described above, we are also interested in the type of linguistic construction represented by that one-level structure, each of which instantiates one of a few types: recursive coordination, simple head-and-sister, etc.
Framework for analyzing parsing performance
(c) NP-modr is a regex for a recursive NP with a right modifier.
Framework for analyzing parsing performance
(d) VP-crd is also a regex for a recursive structure, in this case for VP coordination, picking out the leftmost conjunct as the head of the structure.
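As a purely hypothetical illustration of such regexes, the sketch below matches one-level structures written as "parent -> children" rule strings; the paper's actual regex format and the NP-modr / VP-crd definitions differ in detail, so treat these as stand-ins.

```python
import re

# Hypothetical regexes over one-level structures, encoded as rule strings.
# These patterns are illustrative, not the authors' actual regexes.
NP_MODR = re.compile(r"^NP -> NP (PP|SBAR|VP)$")  # recursive NP with right modifier
VP_CRD = re.compile(r"^VP -> VP( CC VP)+$")       # recursive VP coordination,
                                                  # leftmost conjunct as head

for rule in ["NP -> NP PP", "VP -> VP CC VP", "NP -> DT NN"]:
    for name, pattern in [("NP-modr", NP_MODR), ("VP-crd", VP_CRD)]:
        if pattern.match(rule):
            print(f"{rule!r} matches {name}")
```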
recursive is mentioned in 5 sentences in this paper.
Dinu, Georgiana and Baroni, Marco
General framework
(De)composition of longer phrases is handled by recursive extension of the two-word case.
General framework
3.3 Recursive (de)composition
Noun phrase generation
5.2 Recursive decomposition
Noun phrase generation
We continue by testing generation through recursive decomposition on the task of generating noun-preposition-noun (NPN) paraphrases of adjective-noun (AN) phrases.
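The idea of inverting composition can be sketched in a toy linear setting, assuming a phrase vector p = W[u; v] and decomposition via the pseudo-inverse of W; the paper learns dedicated decomposition functions, so this is only an analogy.

```python
import numpy as np

# Toy linear (de)composition: compose with W, decompose with its
# pseudo-inverse (an approximate, min-norm inversion).
d = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d))
u, v = rng.standard_normal(d), rng.standard_normal(d)

p = W @ np.concatenate([u, v])          # compose a two-word phrase
u_hat, v_hat = np.split(np.linalg.pinv(W) @ p, 2)  # approximate constituents

# Recursive case: a phrase ((u v) w) is composed bottom-up and decomposed
# top-down, reusing the same two-word step at every node.
p3 = W @ np.concatenate([p, rng.standard_normal(d)])
inner_hat = np.split(np.linalg.pinv(W) @ p3, 2)[0]  # approximation of p
```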
recursive is mentioned in 4 sentences in this paper.
Iyyer, Mohit and Enns, Peter and Boyd-Graber, Jordan and Resnik, Philip
Abstract
Taking inspiration from recent work in sentiment analysis that successfully models the compositional aspect of language, we apply a recursive neural network (RNN) framework to the task of identifying the political position evinced by a sentence.
Conclusion
In this paper we apply recursive neural networks to political ideology detection, a problem where previous work relies heavily on bag-of-words models and hand-designed lexica.
Introduction
Building from those insights, we introduce a recursive neural network (RNN) to detect ideological bias on the sentence level.
Recursive Neural Networks
Recursive neural networks (RNNs) are machine learning models that capture syntactic and semantic composition.
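To show where a prediction comes from in such a model, here is a sketch of reading a label off a composed node vector with a softmax classifier; the label set, dimensions, and parameters are illustrative, not the paper's.

```python
import numpy as np

# Sketch: softmax classifier on a composed node representation.
d, n_labels = 4, 2
rng = np.random.default_rng(0)
W_cat = rng.standard_normal((n_labels, d))

def predict(node_vec):
    """Distribution over labels for one node's vector."""
    z = W_cat @ node_vec
    e = np.exp(z - z.max())
    return e / e.sum()

probs = predict(rng.standard_normal(d))  # e.g. one probability per ideology label
```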
recursive is mentioned in 4 sentences in this paper.
Tang, Duyu and Wei, Furu and Yang, Nan and Zhou, Ming and Liu, Ting and Qin, Bing
Related Work
Socher et al. propose the Recursive Neural Network (RNN) (2011b), the matrix-vector RNN (2012), and the Recursive Neural Tensor Network (RNTN) (2013b) to learn the compositionality of phrases of any length, building recursively on the representations of each pair of children.
Related Work
(2013) present Combinatory Categorial Autoencoders to learn the compositionality of sentences, which marries Combinatory Categorial Grammar with the Recursive Autoencoder.
Related Work
(4) RAE: Recursive Autoencoder (Socher et al., 2011c) has been proven effective in many sentiment analysis tasks by learning compositionality automatically.
recursive is mentioned in 3 sentences in this paper.