Assessing the Role of Discourse References in Entailment Inference
Mirkin, Shachar and Dagan, Ido and Padó, Sebastian

Article Structure

Abstract

Discourse references, notably coreference and bridging, play an important role in many text understanding applications, but their impact on textual entailment is yet to be systematically understood.

Introduction

The detection and resolution of discourse references such as coreference and bridging anaphora play an important role in text understanding applications, like question answering and information extraction.

Background

2.1 Discourse in NLP

Motivation and Goals

The results of recent studies, as reported in Section 2.2, seem to show that current resolution of discourse references in RTE systems hardly affects performance.

Analysis Scheme

For annotating the RTE-5 data, we operationalize reference relations that are relevant for entailment as those that improve coverage.

Integrating Discourse References into Entailment Recognition

In initial analysis we found that the standard substitution operation applied by virtually all previous studies for integrating coreference into entailment is insufficient.

Results

We analyzed 120 sentence-hypothesis pairs of the RTE-5 development set (21 different hypotheses, 111 distinct sentences, 53 different documents).

Conclusions

This work has presented an analysis of the relation between discourse references and textual entailment.

Topics

coreference

Appears in 40 sentences as: Coreference (2) coreference (35) coreferences (3) coreferent (1) coreferents (1)
In Assessing the Role of Discourse References in Entailment Inference
  1. Discourse references, notably coreference and bridging, play an important role in many text understanding applications, but their impact on textual entailment is yet to be systematically understood.
    Page 1, “Abstract”
  2. The detection and resolution of discourse references such as coreference and bridging anaphora play an important role in text understanding applications, like question answering and information extraction.
    Page 1, “Introduction”
  3. The understanding that the second sentence of the text entails the hypothesis draws on two coreference relationships, namely that he is Oswald, and
    Page 1, “Introduction”
  4. However, the utilization of discourse information for such inferences has been so far limited mainly to the substitution of nominal coreferents, while many aspects of the interface between discourse and semantic inference needs remain unexplored.
    Page 1, “Introduction”
  5. An additional point of interest is the interrelation between entailment knowledge and coreference.
    Page 1, “Introduction”
  6. E.g., in Example 1 above, knowing that Kennedy was a president can alleviate the need for coreference resolution.
    Page 1, “Introduction”
  7. Conversely, coreference resolution can often be used to overcome gaps in entailment knowledge.
    Page 1, “Introduction”
  8. The simplest form of information that discourse provides is coreference, i.e., information that two linguistic expressions refer to the same entity or event.
    Page 2, “Background”
  9. Coreference is particularly important for processing pronouns and other anaphoric expressions, such as he in Example 1.
    Page 2, “Background”
  10. While coreference indicates equivalence, bridging points to the existence of a salient semantic relation between two distinct entities or events.
    Page 2, “Background”
  11. Note, however, that text understanding systems are generally limited to the resolution of entity (or even just pronoun) coreference, e.g.
    Page 2, “Background”


coreference resolution

Appears in 9 sentences as: coreference resolution (8) coreference resolvers (1)
In Assessing the Role of Discourse References in Entailment Inference
  1. E.g., in Example 1 above, knowing that Kennedy was a president can alleviate the need for coreference resolution.
    Page 1, “Introduction”
  2. Conversely, coreference resolution can often be used to overcome gaps in entailment knowledge.
    Page 1, “Introduction”
  3. A number of systems have tried to address the question of coreference in RTE as a preprocessing step prior to inference proper, with most systems using off-the-shelf coreference resolvers such as JavaRap (Qiu et al., 2004) or OpenNLP3.
    Page 3, “Background”
  4. Results were inconclusive, however, with several reports about errors introduced by automatic coreference resolution (Agichtein et al., 2008; Adams et al., 2007).
    Page 3, “Background”
  5. Specific evaluations of the contribution of coreference resolution yielded both small negative (Bar-Haim et al., 2008) and insignificant positive (Chambers et al., 2007) results.
    Page 3, “Background”
  6. sented; (2) the off-the-shelf coreference resolution systems, which may not have been robust enough; (3) the limitation to nominal coreference; and (4) overly simple integration of reference information into the inference engines.
    Page 3, “Motivation and Goals”
  7. Table 2 shows that 77% of all focus terms and 86% of the reference terms were nominal phrases, which justifies their prominent position in work on anaphora and coreference resolution.
    Page 8, “Results”
  8. This result reaffirms the usefulness of cross-document coreference resolution for inference (Huang et al., 2009).
    Page 8, “Results”
  9. While semantic knowledge (e.g., from WordNet or Wikipedia) has been used beneficially for coreference resolution (Soon et al., 2001; Ponzetto and Strube, 2006), reference resolution has, to our knowledge, not yet been employed to validate entailment rules’ applicability.
    Page 9, “Conclusions”


dependency trees

Appears in 4 sentences as: dependency tree (1) Dependency trees (1) dependency trees (2)
In Assessing the Role of Discourse References in Entailment Inference
  1. Specifically, we assume MINIPAR-style (Lin, 1993) dependency trees where nodes represent text expressions and edges represent the syntactic relations between them.
    Page 4, “Analysis Scheme”
  2. Dependency trees are a popular choice in RTE since they offer a fairly semantics-oriented account of the sentence structure that can still be constructed robustly.
    Page 4, “Analysis Scheme”
  3. Transformations create revised trees that cover previously uncovered target components in H. The output of each transformation, T1, is comprised of copies of the components used to construct it, and is appended to the discourse forest, which includes the dependency trees of all sentences and their generated consequents.
    Page 5, “Integrating Discourse References into Entailment Recognition”
  4. We assume that we have access to a dependency tree for H, a dependency forest for T and its discourse context, as well as the output of a perfect discourse processor, i.e., a complete set of both coreference and bridging relations, including the type of bridging relation (e.g. part-of, cause).
    Page 5, “Integrating Discourse References into Entailment Recognition”
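The transformation scheme quoted above, in which a revised copy of a tree is generated and appended to the discourse forest as a new consequent while the original trees remain intact, can be sketched as follows. The `Node` class and the example trees are illustrative assumptions, not the paper's data structures.

```python
# Sketch of a substitution transformation over a "discourse forest":
# build a revised copy of a dependency tree and append it as a new
# consequent, leaving the original tree unchanged.

from copy import deepcopy

class Node:
    def __init__(self, word, children=None):
        self.word = word
        self.children = children or []  # list of (edge_label, Node) pairs

def substitute(tree, focus_word, replacement):
    """Return a copy of `tree` in which the child node matching
    `focus_word` is replaced by a copy of the `replacement` subtree.
    (For simplicity, the root itself is not checked.)"""
    new_tree = deepcopy(tree)

    def walk(node):
        for i, (label, child) in enumerate(node.children):
            if child.word == focus_word:
                node.children[i] = (label, deepcopy(replacement))
            else:
                walk(child)

    walk(new_tree)
    return new_tree

# T: "he was murdered"; discourse antecedent of "he": "Oswald".
t = Node("murdered", [("subj", Node("he"))])
antecedent = Node("Oswald")

forest = [t]                                     # trees of all sentences
forest.append(substitute(t, "he", antecedent))   # append the consequent
```

Because each transformation copies the components it uses, later transformations can apply to either the original tree or any previously generated consequent in the forest.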


subtrees

Appears in 3 sentences as: subtrees (3)
In Assessing the Role of Discourse References in Entailment Inference
  1. We use “term” to refer to text expressions, and “components” to refer to nodes, edges, and subtrees.
    Page 4, “Analysis Scheme”
  2. Figure 1: The Substitution transformation, demonstrated on the relevant subtrees of Example (i).
    Page 6, “Integrating Discourse References into Entailment Recognition”
  3. For each bridging relation, it adds a specific subtree s_br via an edge labeled with lab_br.
    Page 7, “Integrating Discourse References into Entailment Recognition”
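The bridging transformation described in the last sentence, attaching a subtree to the tree via an edge that carries the bridging relation's label, can be sketched as follows. The data structures and the part-of example are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the bridging transformation: attach the related subtree
# under an edge labeled with the bridging relation (e.g. "part-of").

class Node:
    def __init__(self, word, children=None):
        self.word = word
        self.children = children or []  # list of (edge_label, Node) pairs

def add_bridging(tree, subtree, relation_label):
    """Attach `subtree` to `tree` via an edge carrying `relation_label`."""
    tree.children.append((relation_label, subtree))
    return tree

# Bridging: "the door" is part of a previously mentioned "house".
door = add_bridging(Node("door"), Node("house"), "part-of")
print([(label, child.word) for label, child in door.children])
# [('part-of', 'house')]
```

Unlike coreference substitution, which replaces a node, the bridging edge preserves both entities and makes the semantic relation between them explicit for the inference engine.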
