Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
Kang Liu, Liheng Xu and Jun Zhao

Article Structure

Abstract

Extracting opinion targets and opinion words from online reviews are two fundamental tasks in opinion mining.

Introduction

In opinion mining, extracting opinion targets and opinion words are two fundamental subtasks.

Related Work

There have been many significant research efforts on opinion target/word extraction, at both the sentence level and the corpus level.

The Proposed Method

In this section, we present our method in detail.

Experiments

4.1 Datasets and Evaluation Metrics

Conclusions

This paper presents a novel graph co-ranking method for co-extracting opinion targets and opinion words.

Topics

semantic relations

Appears in 33 sentences as: Semantic Relation (1) semantic relation (2) Semantic Relations (4) semantic relations (28)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. First, compared to previous methods which solely employed opinion relations among words, our method constructs a heterogeneous graph to model two types of relations, including semantic relations and opinion relations.
    Page 1, “Abstract”
  2. We call such relations between homogeneous words semantic relations.
    Page 1, “Introduction”
  3. Intuitively, besides opinion relations, semantic relations may provide additional rich clues for indicating opinion targets/words.
    Page 1, “Introduction”
  4. Solid curves and dotted lines denote semantic relations and opinion relations between two candidates, respectively.
    Page 2, “Introduction”
  5. First, we operate over a heterogeneous graph to model semantic relations and opinion relations into a unified model.
    Page 2, “Introduction”
  6. The first subgraph G_tt represents semantic relations among opinion target candidates, and the second subgraph G_oo models semantic relations among opinion word candidates.
    Page 2, “Introduction” (see the co-ranking sketch after this list)
  7. However, all of the aforementioned methods employed only opinion relations for extraction, ignoring semantic relations among homogeneous candidates.
    Page 3, “Related Work”
  8. In terms of considering semantic relations among words, our method is related to several approaches based on topic models (Zhao et al., 2010; Moghaddam and Ester, 2011; Moghaddam and Ester, 2012a; Moghaddam and Ester, 2012b; Mukherjee and Liu, 2012).
    Page 3, “Related Work”
  9. Although these models could be used for our task according to the associations between candidates and topics, solely employing semantic relations is still one-sided and insufficient to obtain expected performance.
    Page 3, “Related Work”
  10. Different from traditional methods, besides opinion relations among words, we additionally capture semantic relations among homogeneous candidates.
    Page 3, “The Proposed Method”
  11. E_tt ⊆ E represents the semantic relations between two opinion target candidates.
    Page 3, “The Proposed Method”
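
To make the heterogeneous graph in items 5 and 6 concrete, here is a minimal co-ranking sketch in Python. It is not the authors' implementation: the function name co_rank, the weight mu, the uniform initialization, and the simple row normalization are assumptions for illustration; only the overall structure (two homogeneous semantic-relation subgraphs plus a bipartite opinion-relation graph, with candidate confidences propagated across both) follows the description above.

```python
import numpy as np

def co_rank(S_tt, S_oo, R_to, mu=0.3, iters=50):
    """Toy co-ranking over a heterogeneous graph (illustrative only).

    S_tt : semantic relations among opinion target candidates
    S_oo : semantic relations among opinion word candidates
    R_to : opinion relations between targets (rows) and opinion words (cols)
    mu   : assumed weight trading off semantic vs. opinion relations
    """
    # Row-normalize so each relation matrix behaves like a transition matrix.
    norm = lambda M: M / np.maximum(M.sum(axis=1, keepdims=True), 1e-12)
    S_tt, S_oo, R_to = norm(S_tt), norm(S_oo), norm(R_to)
    n_t, n_o = R_to.shape
    c_t = np.full(n_t, 1.0 / n_t)  # confidences of opinion target candidates
    c_o = np.full(n_o, 1.0 / n_o)  # confidences of opinion word candidates
    for _ in range(iters):
        # A candidate is reinforced by semantically related candidates of the
        # same type and by opinion-related candidates of the other type.
        c_t_new = mu * S_tt.T @ c_t + (1 - mu) * R_to @ c_o
        c_o_new = mu * S_oo.T @ c_o + (1 - mu) * R_to.T @ c_t
        c_t, c_o = c_t_new / c_t_new.sum(), c_o_new / c_o_new.sum()
    return c_t, c_o
```

In the paper's setting, candidates with higher final confidence scores are the ones extracted as opinion targets and opinion words.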

co-occurrence

Appears in 6 sentences as: co-occurrence (6)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. In traditional extraction strategies, opinion associations are usually computed based on co-occurrence frequency.
    Page 2, “Introduction”
  2. They usually captured different relations using co-occurrence information.
    Page 3, “Related Work”
  3. Each opinion target can find its corresponding modifiers in sentences through alignment, in which multiple factors are considered globally, such as co-occurrence information, word position in sentence, etc.
    Page 6, “The Proposed Method”
  4. p(v_t, v_o) is the co-occurrence probability of v_t and v_o based on the opinion relation identification results.
    Page 6, “The Proposed Method” (see the frequency-estimate sketch after this list)
  5. But they captured relations only using co-occurrence statistics.
    Page 7, “Experiments”
  6. Second, our method captures semantic relations using topic modeling and captures opinion relations through word alignments, which are more precise than Hai, which merely uses co-occurrence information to indicate such relations among words.
    Page 8, “Experiments”
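
Item 4 estimates a co-occurrence probability p(v_t, v_o) from the opinion relation identification results. A minimal sketch, assuming the alignment step has already produced a list of (target candidate, opinion word candidate) pairs, is plain relative frequency; the helper name cooccurrence_prob and the toy pairs are hypothetical.

```python
from collections import Counter

def cooccurrence_prob(aligned_pairs):
    """Estimate p(v_t, v_o) as the relative frequency of an aligned
    (target candidate, opinion word candidate) pair. aligned_pairs is
    assumed to come from the opinion relation identification step."""
    counts = Counter(aligned_pairs)
    total = sum(counts.values())
    return {pair: c / total for pair, c in counts.items()}

# Toy example: three aligned pairs extracted from review sentences.
pairs = [("screen", "bright"), ("screen", "bright"), ("battery", "short")]
print(cooccurrence_prob(pairs)[("screen", "bright")])  # 0.666...
```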

F-Measure

Appears in 6 sentences as: F-Measure (4) F-measure (2) f-measure (1)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. Evaluation Metrics: We select precision (P), recall (R) and f-measure (F) as metrics.
    Page 6, “Experiments” (see the textbook computation after this list)
  2. The experimental results are shown in Tables 2, 3, 4 and 5, where the last column presents the average F-measure scores for multiple domains.
    Page 7, “Experiments”
  3. Due to space limitations, we only show the F-measure of CR_WP on four domains.
    Page 9, “Experiments”
  4.–6. [Figure residue: “F-Measure” axis labels from the plots on Page 9, “Experiments”; no sentence content is recoverable.]
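
Item 1 selects precision, recall, and f-measure as metrics. For reference, the textbook computation over sets of extracted versus gold terms is sketched below; this is not the authors' evaluation script, and the example term sets are invented.

```python
def prf(extracted, gold):
    """Precision, recall, and balanced F-measure over sets of extracted terms."""
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)  # correctly extracted terms
    p = tp / len(extracted) if extracted else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

# Invented example sets, just to show the call.
print(prf({"screen", "battery", "price"}, {"screen", "battery", "lens"}))
# (0.666..., 0.666..., 0.666...)
```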

word alignment

Appears in 6 sentences as: word alignment (5) word alignments (1)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. They have investigated a series of techniques to enhance opinion relations identification performance, such as nearest neighbor rules (Liu et al., 2005), syntactic patterns (Zhang et al., 2010; Popescu and Etzioni, 2005), word alignment models (Liu et al., 2012; Liu et al., 2013b; Liu et al., 2013a), etc.
    Page 1, “Introduction”
  2. (Liu et al., 2012; Liu et al., 2013a; Liu et al., 2013b) employed a word alignment model, rather than syntactic parsing, to capture opinion relations.
    Page 3, “Related Work”
  3. This approach formulates the identification of opinion relations as a monolingual word alignment process.
    Page 6, “The Proposed Method”
  4. After performing word alignment, we obtain a set of word pairs, each composed of a noun (noun phrase) and its corresponding modifier.
    Page 6, “The Proposed Method” (see the rough alignment stand-in after this list)
  5. They employed a word alignment model to capture opinion relations among words, and then used a random walk algorithm to extract opinion targets.
    Page 7, “Experiments”
  6. Second, our method captures semantic relations using topic modeling and captures opinion relations through word alignments, which are more precise than Hai, which merely uses co-occurrence information to indicate such relations among words.
    Page 8, “Experiments”
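
Items 3 and 4 describe turning each review sentence into (noun, modifier) word pairs via monolingual word alignment. The snippet below is only a rough stand-in for that alignment model: it pairs every noun with the nearest adjective in the same sentence, just to show the shape of the output that the later co-occurrence step consumes. The POS-tagged input format and the nearest-adjective heuristic are assumptions, not the paper's method.

```python
def naive_opinion_pairs(tagged_sentence):
    """Rough stand-in for monolingual word alignment: pair each noun with
    the closest adjective in the same sentence.
    tagged_sentence is a list of (token, POS) tuples."""
    nouns = [i for i, (_, pos) in enumerate(tagged_sentence) if pos.startswith("NN")]
    adjs = [i for i, (_, pos) in enumerate(tagged_sentence) if pos.startswith("JJ")]
    pairs = []
    for n in nouns:
        if adjs:
            a = min(adjs, key=lambda j: abs(j - n))  # nearest adjective wins
            pairs.append((tagged_sentence[n][0], tagged_sentence[a][0]))
    return pairs

sent = [("the", "DT"), ("screen", "NN"), ("is", "VBZ"), ("bright", "JJ")]
print(naive_opinion_pairs(sent))  # [('screen', 'bright')]
```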

alignment model

Appears in 3 sentences as: alignment model (2) alignment models (1)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. They have investigated a series of techniques to enhance opinion relations identification performance, such as nearest neighbor rules (Liu et al., 2005), syntactic patterns (Zhang et al., 2010; Popescu and Etzioni, 2005), word alignment models (Liu et al., 2012; Liu et al., 2013b; Liu et al., 2013a), etc.
    Page 1, “Introduction”
  2. (Liu et al., 2012; Liu et al., 2013a; Liu et al., 2013b) employed a word alignment model, rather than syntactic parsing, to capture opinion relations.
    Page 3, “Related Work”
  3. They employed a word alignment model to capture opinion relations among words, and then used a random walk algorithm to extract opinion targets.
    Page 7, “Experiments”

topic distribution

Appears in 3 sentences as: topic distribution (3)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. Thus, we employ an LDA variant (Mukherjee and Liu, 2012), an extension of (Zhao et al., 2010), to discover the topic distribution over words, which samples all words into two separate observations: opinion targets and opinion words.
    Page 6, “The Proposed Method”
  2. This is because we are only interested in the topic distribution of opinion targets/words, regardless of other irrelevant words such as conjunctions, prepositions, etc.
    Page 6, “The Proposed Method”
  3. After topic modeling, we obtain the topic distributions p(z|v_t) and p(z|v_o). Then, a symmetric Kullback-Leibler divergence, as in Eq. 5, is used to calculate the semantic associations between any two homogeneous candidates.
    Page 6, “The Proposed Method” (see the divergence sketch after this list)
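
Item 3 computes a symmetric Kullback-Leibler divergence between the topic distributions of two homogeneous candidates. A direct implementation of the symmetric form KL(p||q) + KL(q||p) is sketched below; the eps smoothing is an assumption added to avoid log(0), and the function returns the raw divergence, since the exact mapping to an association score in the paper's Eq. 5 is not shown in the snippets above.

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric KL divergence KL(p||q) + KL(q||p) between two topic
    distributions, e.g. p(z|v_i) and p(z|v_j) for two candidates of the
    same type. The eps smoothing is an assumption to avoid log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Two candidates with similar topic distributions yield a small divergence.
print(symmetric_kl([0.7, 0.2, 0.1], [0.6, 0.3, 0.1]))
```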

topic modeling

Appears in 3 sentences as: topic model (1) topic modeling (2)
In Extracting Opinion Targets and Opinion Words from Online Reviews with Graph Co-ranking
  1. In terms of considering semantic relations among words, our method is related to several approaches based on topic models (Zhao et al., 2010; Moghaddam and Ester, 2011; Moghaddam and Ester, 2012a; Moghaddam and Ester, 2012b; Mukherjee and Liu, 2012).
    Page 3, “Related Work”
  2. After topic modeling, we obtain the probability of the candidates v_t and v_o belonging to topic z, i.e. p(z|v_t) and p(z|v_o).
    Page 6, “The Proposed Method” (see the Bayes-rule sketch after this list)
  3. Second, our method captures semantic relations using topic modeling and captures opinion relations through word alignments, which are more precise than Hai, which merely uses co-occurrence information to indicate such relations among words.
    Page 8, “Experiments”
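
Item 2 states that topic modeling yields, for each candidate, its probability of belonging to each topic. Given a fitted topic model's topic-word probabilities p(w|z) and topic priors p(z), one standard way to obtain such a per-word topic distribution is Bayes' rule; the sketch below assumes those two arrays are available and is not tied to the specific LDA variant of Mukherjee and Liu (2012) used in the paper.

```python
import numpy as np

def word_topic_distribution(word_id, topic_word, topic_prior):
    """p(z|w) proportional to p(w|z) * p(z), from a fitted topic model.
    topic_word[k, w] = p(word w | topic k); topic_prior[k] = p(topic k)."""
    unnorm = topic_word[:, word_id] * topic_prior
    return unnorm / unnorm.sum()

# Tiny example: 2 topics over a 3-word vocabulary; word 2 leans to topic 1.
topic_word = np.array([[0.6, 0.3, 0.1],
                       [0.1, 0.2, 0.7]])
topic_prior = np.array([0.5, 0.5])
print(word_topic_distribution(2, topic_word, topic_prior))  # [0.125 0.875]
```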
