Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
Dong, Li and Wei, Furu and Tan, Chuanqi and Tang, Duyu and Zhou, Ming and Xu, Ke

Article Structure

Abstract

We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.

Introduction

Twitter has become one of the most popular social networking sites, allowing users to read and post messages (i.e., tweets).

RNN: Recursive Neural Network

RNN (Socher et al., 2011) represents phrases and words as D-dimensional vectors.

Our Approach

We use the dependency parsing results to find the words syntactically connected with the target of interest.

Experiments

As people tend to post comments about celebrities, products, and companies, we use these keywords (such as “bill gates”, “taylor swift”, “xbox”, “windows 7”, “google”) to query the Twitter API.

Conclusion

We propose Adaptive Recursive Neural Network (AdaRNN) for the target-dependent Twitter sentiment classification.

Topics

Recursive

Appears in 14 sentences as: Recursive (8) recursive (6)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
    Page 1, “Abstract”
  2. In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
    Page 1, “Introduction”
  3. RNN utilizes the recursive structure of text, and it has achieved state-of-the-art sentiment analysis results on the movie review dataset (Socher et al., 2012; Socher et al., 2013).
    Page 1, “Introduction”
  4. The recursive neural models employ the semantic composition functions, which enable them to handle the complex compositionalities in sentiment analysis.
    Page 1, “Introduction”
  5. We employ a novel adaptive multi-compositionality layer in the recursive neural network, which is named AdaRNN (Dong et al., 2014).
    Page 1, “Introduction”
  6. Figure 1: The composition process for “not very good” in Recursive Neural Network.
    Page 2, “RNN: Recursive Neural Network”
  7. Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
    Page 2, “Our Approach”
  8. In Section 3.1, we show how to build the recursive structure for the target using the dependency parsing results.
    Page 2, “Our Approach”
  9. In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
    Page 2, “Our Approach”
  10. 3.1 Build Recursive Structure
    Page 2, “Our Approach”
  11. 3.2 AdaRNN: Adaptive Recursive Neural Network
    Page 3, “Our Approach”

Recursive Neural

Appears in 11 sentences as: Recursive Neural (7) recursive neural (4)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
    Page 1, “Abstract”
  2. In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
    Page 1, “Introduction”
  3. The recursive neural models employ the semantic composition functions, which enable them to handle the complex compositionalities in sentiment analysis.
    Page 1, “Introduction”
  4. We employ a novel adaptive multi-compositionality layer in the recursive neural network, which is named AdaRNN (Dong et al., 2014).
    Page 1, “Introduction”
  5. Figure 1: The composition process for “not very good” in Recursive Neural Network.
    Page 2, “RNN: Recursive Neural Network”
  6. Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
    Page 2, “Our Approach”
  7. In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
    Page 2, “Our Approach”
  8. 3.2 AdaRNN: Adaptive Recursive Neural Network
    Page 3, “Our Approach”
  9. For recursive neural models, the dimension of the word vectors is set to 25, and f = tanh is used as the nonlinearity function.
    Page 4, “Experiments”
  10. AdaRNN provides more powerful composition ability, so that it achieves better semantic composition for recursive neural models.
    Page 5, “Experiments”
  11. We propose Adaptive Recursive Neural Network (AdaRNN) for the target-dependent Twitter sentiment classification.
    Page 5, “Conclusion”

dependency tree

Appears in 8 sentences as: Dependency Tree (1) Dependency tree (1) dependency tree (7)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. The dependency tree indicates the dependency relations between words.
    Page 2, “Our Approach”
  2. Algorithm 1 Convert Dependency Tree. Input: Target node r, Dependency tree. Output: Converted tree. 1: function CONV(r) 2: Er ← SORT(dep edges connected with r) v ← r for (r → u / u → r) in Er do if r is head of u then v ← node with CONV(u), v as children else v ← node with v, CONV(u) as children
    Page 2, “Our Approach”
  3. As illustrated in the Algorithm 1, we recursively convert the dependency tree starting from the target node.
    Page 2, “Our Approach”
  4. We use two rules to determine the order of combinations: (1) the words whose head is the target in the dependency tree are combined first, and then the rest of the connected words are combined; (2) if the first rule cannot determine the order, the connected words are sorted by their positions in the sentence from right to left (see the sketch following this list).
    Page 3, “Our Approach”
  5. SVM-conn: The words, punctuation, emoticons, and #hashtags included in the converted dependency tree are used as the features for SVM.
    Page 4, “Experiments”
  6. RNN: It is performed on the converted dependency tree without adaptive composition selection.
    Page 4, “Experiments”
  7. RNN is also based on the converted dependency tree.
    Page 5, “Experiments”
  8. For a given tweet, we first convert its dependency tree for the target of interest.
    Page 5, “Conclusion”
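
To make the conversion quoted in items 2–4 concrete, here is a minimal Python sketch. The Edge/Node representation, the dictionary of incident edges, and the exact tie-breaking key are assumptions chosen to match the two rules above; this is an illustration, not the authors' implementation.

# A minimal sketch of the dependency-tree conversion (Algorithm 1 and the two
# ordering rules quoted above). Edge/Node and the sort key are assumptions.
from collections import namedtuple

# One dependency edge incident to a word: `other` is the word at the far end,
# `head_is_word` is True when the current word is the head of `other`,
# `dep` is the relation type, and `pos` is `other`'s position in the sentence.
Edge = namedtuple("Edge", "other head_is_word dep pos")

class Node:
    def __init__(self, word=None, left=None, right=None, dep=None):
        self.word, self.left, self.right, self.dep = word, left, right, dep

def convert(word, edges, visited=None):
    """Recursively binarize the dependency tree around `word` (CONV in Algorithm 1).

    `edges[word]` lists the Edge objects connected with `word`; the dependency
    relation type is kept on every internal node to guide sentiment propagation.
    """
    if visited is None:
        visited = {word}
    # Rule (1): words whose head is the current word are combined first;
    # Rule (2): remaining ties are ordered by sentence position, right to left.
    ordered = sorted((e for e in edges.get(word, ()) if e.other not in visited),
                     key=lambda e: (not e.head_is_word, -e.pos))
    v = Node(word=word)
    for e in ordered:
        visited.add(e.other)
        u = convert(e.other, edges, visited)
        if e.head_is_word:            # current word is the head of e.other
            v = Node(left=u, right=v, dep=e.dep)
        else:                         # current word is the dependent
            v = Node(left=v, right=u, dep=e.dep)
    return v

Because the target becomes the root of the converted tree, the sentiments of the syntactically connected words are propagated toward it during the bottom-up composition.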

Neural Network

Appears in 8 sentences as: Neural Network (7) neural network (1)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
    Page 1, “Abstract”
  2. In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
    Page 1, “Introduction”
  3. We employ a novel adaptive multi-compositionality layer in the recursive neural network, which is named AdaRNN (Dong et al., 2014).
    Page 1, “Introduction”
  4. Figure 1: The composition process for “not very good” in Recursive Neural Network.
    Page 2, “RNN: Recursive Neural Network”
  5. Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
    Page 2, “Our Approach”
  6. In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
    Page 2, “Our Approach”
  7. 3.2 AdaRNN: Adaptive Recursive Neural Network
    Page 3, “Our Approach”
  8. We propose Adaptive Recursive Neural Network (AdaRNN) for the target-dependent Twitter sentiment classification.
    Page 5, “Conclusion”

Recursive Neural Network

Appears in 8 sentences as: Recursive Neural Network (7) recursive neural network (1)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
    Page 1, “Abstract”
  2. In this paper, we mainly focus on integrating target information with Recursive Neural Network (RNN) to leverage the ability of deep learning models.
    Page 1, “Introduction”
  3. We employ a novel adaptive multi-compositionality layer in the recursive neural network, which is named AdaRNN (Dong et al., 2014).
    Page 1, “Introduction”
  4. Figure 1: The composition process for “not very good” in Recursive Neural Network.
    Page 2, “RNN: Recursive Neural Network”
  5. Adaptive Recursive Neural Network is proposed to propagate the sentiments of words to the target node.
    Page 2, “Our Approach”
  6. In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
    Page 2, “Our Approach”
  7. 3.2 AdaRNN: Adaptive Recursive Neural Network
    Page 3, “Our Approach”
  8. We propose Adaptive Recursive Neural Network (AdaRNN) for the target-dependent Twitter sentiment classification.
    Page 5, “Conclusion”

sentiment classification

Appears in 6 sentences as: sentiment classification (6)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. We propose Adaptive Recursive Neural Network (AdaRNN) for target-dependent Twitter sentiment classification.
    Page 1, “Abstract”
  2. For target-dependent sentiment classification, the manual evaluation of Jiang et al.
    Page 1, “Introduction”
  3. The neural models use distributed representation (Hinton, 1986; Rumelhart et al., 1986; Bengio et al., 2003) to automatically learn features for target-dependent sentiment classification.
    Page 1, “Introduction”
  4. To the best of our knowledge, this is the largest target-dependent Twitter sentiment classification dataset which is annotated manually.
    Page 4, “Experiments”
  5. Table 1: Evaluation results on target-dependent Twitter sentiment classification dataset.
    Page 4, “Experiments”
  6. We propose Adaptive Recursive Neural Network (AdaRNN) for the target-dependent Twitter sentiment classification.
    Page 5, “Conclusion”

dependency parsing

Appears in 5 sentences as: dependency parsing (5)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. Jiang et al. (2011) combine the target-independent features (content and lexicon) and target-dependent features (rules based on the dependency parsing results) together in subjectivity classification and polarity classification for tweets.
    Page 1, “Introduction”
  2. We use the dependency parsing results to find the words syntactically connected with the target of interest.
    Page 2, “Our Approach”
  3. In Section 3.1, we show how to build the recursive structure for the target using the dependency parsing results.
    Page 2, “Our Approach”
  4. A tweet-specific tokenizer (Gimpel et al., 2011) is employed, and the dependency parsing results are computed by Stanford Parser (Klein and Manning, 2003).
    Page 4, “Experiments”
  5. The POS tagging and dependency parsing results are not precise enough for the Twitter data, so these handcrafted rules are rarely matched.
    Page 5, “Experiments”

sentiment analysis

Appears in 5 sentences as: sentiment analysis (5)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. Furthermore, we introduce a manually annotated dataset for target-dependent Twitter sentiment analysis.
    Page 1, “Abstract”
  2. RNN utilizes the recursive structure of text, and it has achieved state-of-the-art sentiment analysis results on the movie review dataset (Socher et al., 2012; Socher et al., 2013).
    Page 1, “Introduction”
  3. The recursive neural models employ the semantic composition functions, which enable them to handle the complex compositionalities in sentiment analysis.
    Page 1, “Introduction”
  4. determines how to propagate the sentiments towards the target and handles the negation or intensification phenomena (Taboada et al., 2011) in sentiment analysis.
    Page 2, “Introduction”
  5. In Section 3.2, we propose Adaptive Recursive Neural Network and use it for target-dependent sentiment analysis.
    Page 2, “Our Approach”

vector representation

Appears in 5 sentences as: vector representation (3) vector representations (2)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. It performs compositions based on the binary trees, and obtains the vector representations in a bottom-up way.
    Page 2, “RNN: Recursive Neural Network”
  2. The vector representation v is obtained via:
    Page 2, “RNN: Recursive Neural Network”
  3. The vector representation of the root node is then fed into a softmax classifier to predict the label.
    Page 2, “RNN: Recursive Neural Network”
  4. Specifically, the predicted distribution is y = softmax(Uv), where y is the predicted distribution, U ∈ R^{K×D} is the classification matrix, and v is the vector representation of the node (a numpy sketch of this computation follows this list).
    Page 2, “RNN: Recursive Neural Network”
  5. The computation process is conducted in a bottom-up manner, and the vector representations are computed recursively.
    Page 2, “Our Approach”
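
To make the bottom-up composition and the softmax classification quoted above concrete, here is a minimal numpy sketch. The composition form tanh(W[v_l; v_r] + b), the parameter names W, b, U, the toy dimensions, and the random vectors are illustrative assumptions; the snippets above only name the nonlinearity and the softmax classifier.

import numpy as np

D, K = 25, 3                                   # word-vector dimension, number of labels
rng = np.random.default_rng(0)
W = rng.standard_normal((D, 2 * D)) * 0.01     # assumed composition matrix
b = np.zeros(D)                                # assumed composition bias
U = rng.standard_normal((K, D)) * 0.01         # classification matrix U in R^{K x D}

def compose(v_left, v_right):
    # Parent vector from its two children, computed bottom-up on the binary tree.
    return np.tanh(W @ np.concatenate([v_left, v_right]) + b)

def predict(v):
    # y = softmax(U v): distribution over the K sentiment labels.
    scores = U @ v
    scores -= scores.max()                     # for numerical stability
    p = np.exp(scores)
    return p / p.sum()

# Bottom-up composition for the phrase "not very good" (random toy vectors).
not_, very, good = (rng.standard_normal(D) * 0.01 for _ in range(3))
print(predict(compose(not_, compose(very, good))))

The root vector of the converted tree is what gets classified, so the representation at the target node aggregates the sentiment composed from the connected words.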

dependency relation

Appears in 3 sentences as: dependency relation (2) dependency relations (1)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. The dependency tree indicates the dependency relations between words.
    Page 2, “Our Approach”
  2. The dependency relation types are retained to guide the sentiment propagation in our model.
    Page 2, “Our Approach”
  3. Notably, the conversion is performed recursively for the connected words and the dependency relation types are retained.
    Page 3, “Our Approach”

feature vector

Appears in 3 sentences as: feature vector (3)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. g1, ..., gC are the composition functions, P(gh | vl, vr, e) is the probability of employing gh given the child vectors vl, vr and external feature vector e, and f is the nonlinearity function.
    Page 3, “Our Approach”
  2. where β is the hyper-parameter, S ∈ R^{C×(2D+|e|)} is the matrix used to determine which composition function we use, vl, vr are the left and right child vectors, and e is the external feature vector (a numpy sketch of this adaptive selection follows this list).
    Page 3, “Our Approach”
  3. In this work, e is a one-hot binary feature vector which indicates what the dependency type is.
    Page 3, “Our Approach”
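
A minimal numpy sketch of the adaptive selection described in the snippets above: a soft choice over C composition functions, driven by the child vectors and the one-hot dependency-type feature e. Treating each composition function g_h as a separate linear map, mixing their outputs before the nonlinearity, and the toy sizes are illustrative assumptions, not necessarily the paper's exact formulation.

import numpy as np

D, C, E = 25, 4, 40        # vector size, number of composition functions, number of dependency types
rng = np.random.default_rng(0)
Wh = rng.standard_normal((C, D, 2 * D)) * 0.01   # one assumed matrix per composition function g_h
S = rng.standard_normal((C, 2 * D + E)) * 0.01   # selection matrix S
beta = 1.0                                        # hyper-parameter controlling sharpness

def softmax(x):
    x = x - x.max()
    p = np.exp(x)
    return p / p.sum()

def adaptive_compose(v_l, v_r, dep_type):
    e = np.zeros(E)
    e[dep_type] = 1.0                             # one-hot dependency-type feature vector
    p = softmax(beta * (S @ np.concatenate([v_l, v_r, e])))   # P(g_h | v_l, v_r, e)
    children = np.concatenate([v_l, v_r])
    mixed = sum(p[h] * (Wh[h] @ children) for h in range(C))  # soft mixture of the g_h outputs
    return np.tanh(mixed)                         # nonlinearity f

v = adaptive_compose(rng.standard_normal(D) * 0.01, rng.standard_normal(D) * 0.01, dep_type=7)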

manually annotated

Appears in 3 sentences as: manually annotate (1) manually annotated (2)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. Furthermore, we introduce a manually annotated dataset for target-dependent Twitter sentiment analysis.
    Page 1, “Abstract”
  2. In addition, we introduce a manually annotated dataset, and conduct extensive experiments on it.
    Page 2, “Introduction”
  3. After obtaining the tweets, we manually annotate the sentiment labels (negative, neutral, positive) for these targets.
    Page 3, “Experiments”

SVM

Appears in 3 sentences as: SVM (3)
In Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
  1. the target-independent (SVM-indep) and target-dependent features and uses SVM as the classifier.
    Page 4, “Experiments”
  2. SVM-conn: The words, punctuation, emoticons, and #hashtags included in the converted dependency tree are used as the features for SVM.
    Page 4, “Experiments”
  3. AdaRNN-comb: We combine the root vectors obtained by AdaRNN-w/E with the uni/bi-gram features, and they are fed into an SVM classifier (a scikit-learn sketch follows this list).
    Page 4, “Experiments”
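
The AdaRNN-comb setup quoted above can be sketched with scikit-learn: root vectors produced by the recursive model are concatenated with uni/bi-gram counts and fed to a linear SVM. The function name, the CountVectorizer settings, and the LinearSVC choice are assumptions for illustration, not the authors' exact configuration.

import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

def train_combined(tweets, root_vectors, labels):
    # `root_vectors`: (n_tweets, D) array of root-node vectors from the recursive model;
    # `labels`: negative / neutral / positive tags for the targets in the tweets.
    ngrams = CountVectorizer(ngram_range=(1, 2))           # uni- and bi-gram features
    X = hstack([ngrams.fit_transform(tweets), csr_matrix(np.asarray(root_vectors))])
    clf = LinearSVC().fit(X, labels)
    return ngrams, clf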
