Index of papers in Proc. ACL that mention
  • recursive neural network
Liu, Shujie and Yang, Nan and Li, Mu and Zhou, Ming
Abstract
R2NN is a combination of a recursive neural network and a recurrent neural network, and in turn integrates their respective capabilities: (1) new information can be used to generate the next hidden state, as in recurrent neural networks, so that the language model and the translation model can be integrated naturally; (2) a tree structure can be built, as in recursive neural networks, so as to generate translation candidates in a bottom-up manner.
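A minimal sketch of the combination described above, with illustrative names and dimensions rather than the authors' code: a parent node is composed bottom-up from two children (the recursive part) while fresh input information is folded into the state at each step (the recurrent part).

```python
import numpy as np

rng = np.random.default_rng(0)
d, g = 4, 3                                  # hidden size, new-input feature size
W = rng.standard_normal((d, 2 * d)) * 0.1    # recursive weights over [left; right]
U = rng.standard_normal((d, g)) * 0.1        # recurrent-style weights for new input

def combine(left, right, x_new):
    """Build a parent representation bottom-up from two children
    (the recursive part) while folding in new input information
    (the recurrent part)."""
    return np.tanh(W @ np.concatenate([left, right]) + U @ x_new)

# toy bottom-up derivation: merge two leaves, then merge the result with a
# third leaf, injecting fresh input features at each combination
leaves = [rng.standard_normal(d) for _ in range(3)]
p1 = combine(leaves[0], leaves[1], rng.standard_normal(g))
p2 = combine(p1, leaves[2], rng.standard_normal(g))
print(p2.shape)  # (4,)
```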
Introduction
Recursive neural networks, which can generate tree-structured output, have been applied to natural language parsing (Socher et al., 2011) and extended to recursive neural tensor networks to explore the compositional aspect of semantics (Socher et al., 2013).
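As a point of reference for the excerpts below, a toy sketch of how such a network can produce a tree: adjacent pairs are scored with a shared composition function and the best-scoring pair is merged greedily (all names and dimensions here are illustrative assumptions, not code from any of the indexed papers).

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
W = rng.standard_normal((d, 2 * d)) * 0.1   # shared composition weights
v = rng.standard_normal(d) * 0.1            # scoring vector for a merge

def compose(a, b):
    p = np.tanh(W @ np.concatenate([a, b]))
    return p, float(v @ p)                  # parent vector and its merge score

def greedy_parse(vectors):
    """Repeatedly merge the highest-scoring adjacent pair, yielding a
    binary tree over the input (tracked here as nested tuples)."""
    nodes = [(vec, i) for i, vec in enumerate(vectors)]  # (vector, subtree)
    while len(nodes) > 1:
        scored = [(compose(nodes[i][0], nodes[i + 1][0]), i)
                  for i in range(len(nodes) - 1)]
        ((p, _), i) = max(scored, key=lambda t: t[0][1])
        nodes[i:i + 2] = [(p, (nodes[i][1], nodes[i + 1][1]))]
    return nodes[0][1]

print(greedy_parse([rng.standard_normal(d) for _ in range(4)]))
```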
Introduction
R2NN is a combination of a recursive neural network and a recurrent neural network.
Introduction
In R2NN, new information can be used to generate the next hidden state, as in recurrent neural networks, and a tree structure can be built, as in recursive neural networks.
Our Model
R2NN is a combination of a recursive neural network and a recurrent neural network: it not only integrates conventional global features as input information for each combination, but also generates the representation of the parent node for future candidate generation.
Our Model
In this section, we briefly recall the recurrent neural network and the recursive neural network in Sections 3.1 and 3.2, and then elaborate our R2NN in detail in Section 3.3.
Our Model
3.2 Recursive Neural Network
recursive neural network is mentioned in 19 sentences in this paper.
Socher, Richard and Bauer, John and Manning, Christopher D. and Ng, Andrew Y.
Abstract
Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations.
Introduction
The vectors for nonterminals are computed via a new type of recursive neural network which is conditioned on syntactic categories from a PCFG.
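A sketch of the "syntactically untied" idea this sentence describes: the composition weights are selected by the children's syntactic categories, and the node score adds the PCFG rule log-probability (category names, dimensions, and the -1.2 log-probability are illustrative assumptions, not the paper's code).

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4
# untied weights: one composition matrix and scoring vector per
# (left-category, right-category) pair, rather than a single shared W
cats = ["NP", "VP", "DT", "NN"]
W = {(a, b): rng.standard_normal((d, 2 * d)) * 0.1 for a in cats for b in cats}
v = {(a, b): rng.standard_normal(d) * 0.1 for a in cats for b in cats}

def score_node(left_vec, left_cat, right_vec, right_cat, log_p_pcfg):
    """Compose a parent vector with category-specific weights and
    combine the neural score with the PCFG rule log-probability."""
    key = (left_cat, right_cat)
    p = np.tanh(W[key] @ np.concatenate([left_vec, right_vec]))
    return p, float(v[key] @ p) + log_p_pcfg

dt, nn = rng.standard_normal(d), rng.standard_normal(d)
parent, s = score_node(dt, "DT", nn, "NN", log_p_pcfg=-1.2)
print(s)
```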
Introduction
CVGs combine the advantages of standard probabilistic context-free grammars (PCFGs) with those of recursive neural networks (RNNs).
Introduction
(2003) apply recursive neural networks to re-rank possible phrase attachments in an incremental parser.
recursive neural network is mentioned in 8 sentences in this paper.
Dong, Li and Wei, Furu and Tan, Chuanqi and Tang, Duyu and Zhou, Ming and Xu, Ke
Abstract
We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
Conclusion
We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
Introduction
In this paper, we mainly focus on integrating target information with the Recursive Neural Network (RNN) to leverage the ability of deep learning models.
Introduction
We employ a novel adaptive multi-compositionality layer in the recursive neural network, which is named AdaRNN (Dong et al., 2014).
Our Approach
The Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
Our Approach
In Section 3.2, we propose the Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
Our Approach
3.2 AdaRNN: Adaptive Recursive Neural Network
RNN: Recursive Neural Network
Figure 1: The composition process for “not very good” in the Recursive Neural Network.
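A sketch of the adaptive multi-compositionality idea behind AdaRNN, applied to the Figure 1 example: instead of one fixed composition matrix, several candidate compositions are mixed by a softmax gate computed from the child vectors (all names, gating inputs, and dimensions are illustrative assumptions, not the paper's code).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(3)
d, C = 4, 3                                    # hidden size, number of composition functions
Ws = rng.standard_normal((C, d, 2 * d)) * 0.1  # one composition matrix per function
S = rng.standard_normal((C, 2 * d)) * 0.1      # gate that scores each function

def ada_compose(left, right):
    """Mix several composition functions, weighted by a softmax gate
    over the children, instead of applying a single fixed matrix."""
    x = np.concatenate([left, right])
    weights = softmax(S @ x)                   # P(function | children)
    candidates = np.tanh(Ws @ x)               # (C, d): each function's parent vector
    return weights @ candidates

# toy run for the Figure 1 example "not very good":
# first compose "very"+"good", then "not"+that phrase
emb = {w: rng.standard_normal(d) for w in ["not", "very", "good"]}
very_good = ada_compose(emb["very"], emb["good"])
not_very_good = ada_compose(emb["not"], very_good)
print(not_very_good.shape)  # (4,)
```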
recursive neural network is mentioned in 8 sentences in this paper.
Zhu, Xiaodan and Guo, Hongyu and Mohammad, Saif and Kiritchenko, Svetlana
Abstract
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Conclusions
We further make the models dependent on the text being modified by negators, by adapting a state-of-the-art recursive neural network to incorporate the syntax and semantics of the arguments; we find this further reduces fitting errors.
Experimental results
Furthermore, modeling the syntax and semantics with the state-of-the-art recursive neural network (models 7 and 8) can dramatically improve the performance over model 6.
Introduction
This model learns the syntax and semantics of the negator’s argument with a recursive neural network.
Related work
The more recent work of Socher et al. (2012; 2013) proposed models based on recursive neural networks that do not rely on any heuristic rules.
Semantics-enriched modeling
A major difference of the RNTN from the conventional recursive neural network (RNN) (Socher et al., 2012) is its use of the tensor V to directly capture the multiplicative interaction of two input vectors, whereas the matrix W captures the nonlinear interaction between the input vectors only implicitly.
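The tensor term the sentence refers to, as a small worked sketch: each output unit gets a bilinear form x^T V[k] x over the concatenated children, on top of the usual affine term W x (dimensions are illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)
d = 4
W = rng.standard_normal((d, 2 * d)) * 0.1         # standard composition weights
V = rng.standard_normal((d, 2 * d, 2 * d)) * 0.1  # tensor: one bilinear slice per output unit

def rntn_compose(a, b):
    """RNTN-style composition: each output unit k gets a bilinear term
    x^T V[k] x that directly models multiplicative interaction of the
    children, plus the standard affine term W x."""
    x = np.concatenate([a, b])
    bilinear = np.einsum("i,kij,j->k", x, V, x)   # (d,)
    return np.tanh(bilinear + W @ x)

p = rntn_compose(rng.standard_normal(d), rng.standard_normal(d))
print(p.shape)  # (4,)
```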
Semantics-enriched modeling
This is actually an interesting place to extend the current recursive neural network to consider extrinsic knowledge.
recursive neural network is mentioned in 7 sentences in this paper.
Kalchbrenner, Nal and Grefenstette, Edward and Blunsom, Phil
Background
A model that adopts a more general structure provided by an external parse tree is the Recursive Neural Network (RecNN) (Pollack, 1990; Küchler and Goller, 1996; Socher et al., 2011; Hermann and Blunsom, 2013).
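Operationally, following an external parse tree means the recursion is dictated by a given tree rather than learned; a minimal sketch, assuming a binary parse encoded as nested tuples (illustrative names throughout):

```python
import numpy as np

rng = np.random.default_rng(5)
d = 4
W = rng.standard_normal((d, 2 * d)) * 0.1

def encode(tree, emb):
    """Recurse over an externally supplied parse tree: a leaf is looked
    up in the embedding table, an internal node composes its children."""
    if isinstance(tree, str):
        return emb[tree]
    left, right = tree
    return np.tanh(W @ np.concatenate([encode(left, emb), encode(right, emb)]))

emb = {w: rng.standard_normal(d) for w in ["the", "cat", "sat"]}
parse = (("the", "cat"), "sat")     # external binary parse as nested tuples
print(encode(parse, emb).shape)     # (4,)
```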
Experiments
RECNTN is a recursive neural network with a tensor-based feature function, which relies on external structural features given by a parse tree and performs best among the RecNNs.
Introduction
These range from basic neural bag-of-words or bag-of-n-grams models to the more structured recursive neural networks and to time-delay neural networks based on convolutional operations (Collobert and Weston, 2008; Socher et al., 2011; Kalchbrenner and Blunsom, 2013b).
Properties of the Sentence Model
Similarly, a recursive neural network is sensitive to word order but has a bias towards the topmost nodes in the tree; shallower trees mitigate this effect to some extent (Socher et al., 2013a).
Properties of the Sentence Model
The recursive neural network follows the structure of an external parse tree.
recursive neural network is mentioned in 5 sentences in this paper.
Iyyer, Mohit and Enns, Peter and Boyd-Graber, Jordan and Resnik, Philip
Abstract
Taking inspiration from recent work in sentiment analysis that successfully models the compositional aspect of language, we apply a recursive neural network (RNN) framework to the task of identifying the political position evinced by a sentence.
Conclusion
In this paper we apply recursive neural networks to political ideology detection, a problem where previous work relies heavily on bag-of-words models and hand-designed lexica.
Introduction
Building from those insights, we introduce a recursive neural network (RNN) to detect ideological bias on the sentence level.
Recursive Neural Networks
Recursive neural networks (RNNs) are machine learning models that capture syntactic and semantic composition.
recursive neural network is mentioned in 4 sentences in this paper.