Logical Inference on Dependency-based Compositional Semantics
Tian, Ran and Miyao, Yusuke and Matsuzaki, Takuya

Article Structure

Abstract

Dependency-based Compositional Semantics (DCS) is a framework of natural language semantics with easy-to-process structures as well as strict semantics.

Introduction

Dependency-based Compositional Semantics (DCS) provides an intuitive way to model semantics of questions, by using simple dependency-like trees (Liang et al., 2011).

The Idea

In this section we describe the idea of representing natural language semantics by DCS trees, and achieving inference by computing logical relations among the corresponding abstract denotations.

Generating On-the-fly Knowledge

Recognizing textual entailment (RTE) is the task of determining whether a given textual statement H can be inferred from a text passage T. For this, our primary textual inference system operates as follows:
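(The paper's enumerated steps are elided in this excerpt.) Drawing only on the pipeline quoted elsewhere on this page — dependency parsing and coreference resolution, rule-based conversion to DCS trees, translation to statements on abstract denotations, and on-the-fly knowledge from scored path alignments — a minimal hypothetical skeleton could look like the following; every function here is a stub standing in for a component the paper describes, not an API from the paper:

```python
# Hypothetical skeleton of the inference loop; all functions are stubs.

SIM_THRESHOLD = 0.4  # path alignments below this similarity are rejected (see §4)

def parse_and_convert(sentence):
    """Stub: dependency parsing + coreference resolution + DCS-tree conversion."""
    return {"tree": sentence}

def to_statement(dcs_tree):
    """Stub: translate a DCS tree into a statement on abstract denotations."""
    return ("stmt", dcs_tree["tree"])

def proves(premises, hypothesis):
    """Stub: logical inference over statements on abstract denotations."""
    return hypothesis in premises

def path_alignments(t, h):
    """Stub: yield (axiom, similarity) pairs for paths aligned between T and H."""
    return []

def entails(T, H):
    t_stmt = to_statement(parse_and_convert(T))
    h_stmt = to_statement(parse_and_convert(H))
    premises = [t_stmt]
    if proves(premises, h_stmt):          # try to prove H from T alone
        return True
    for axiom, score in path_alignments(T, H):
        if score > SIM_THRESHOLD:         # accept on-the-fly knowledge
            premises.append(axiom)
    return proves(premises, h_stmt)       # retry with the added axioms

print(entails("Students read books.", "Students read books."))  # True
```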

Experiments

In this section, we evaluate our system on FraCaS (§4.2) and PASCAL RTE datasets (§4.3).

Conclusion and Discussion

We have presented a method of deriving abstract denotation from DCS trees, which enables logical inference on DCS, and we developed a textual inference system based on the framework.

Topics

semantic roles

Appears in 15 sentences as: semantic role (6) semantic roles (9) semantics roles (1)
In Logical Inference on Dependency-based Compositional Semantics
  1. The labels on both ends of an edge, such as SUBJ (subject) and OBJ (object), are considered as semantic roles of the corresponding words.
    Page 2, “The Idea”
  2. where read, student and book denote sets represented by these words respectively, and $\mathbf{W}_r$ represents the set $\mathbf{W}$ considered as the domain of the semantic role $r$ (e.g. …
    Page 2, “The Idea”
  3. The semantic role ARG is specifically defined for denoting nominal predicates.
    Page 2, “The Idea”
  4. • $\pi_r$: projection onto the domain of semantic role $r$ (e.g. …
    Page 3, “The Idea”
  5. Generally we admit projections onto multiple semantic roles, denoted by $\pi_R$, where $R$ is a set of semantic roles.
    Page 3, “The Idea”
  6. To obtain DCS trees from natural language, we use Stanford CoreNLP for dependency parsing (Socher et al., 2013), and convert Stanford dependencies to DCS trees by pattern matching on POS tags and dependency labels. Currently we use the following semantic roles: ARG, SUBJ, OBJ, IOBJ, TIME and MOD.
    Page 3, “The Idea”
  7. The semantic role MOD is used for any restrictive modifiers.
    Page 3, “The Idea”
  8. A DCS tree $\mathcal{T} = (\mathcal{N}, \mathcal{E})$ is defined as a rooted tree, where each node $\sigma \in \mathcal{N}$ is labeled with a content word $w(\sigma)$ and each edge $(\sigma, \sigma') \in \mathcal{E} \subset \mathcal{N} \times \mathcal{N}$ is labeled with a pair of semantic roles $(r, r')$ (a data-structure sketch follows this list).
    Page 3, “The Idea”
  9. is the subtree of $\mathcal{T}$ rooted at $\sigma$, and $R_\sigma$ is the set of possible semantic roles for content word $w(\sigma)$ (e.g. …
    Page 4, “The Idea”
  10. A path is considered to join two germs in a DCS tree, where a germ is defined as a specific semantic role of a node.
    Page 5, “Generating On-the-fly Knowledge”
  11. The abstract denotation of a germ is defined in a top-down manner: for the root node $\rho$ of a DCS tree $\mathcal{T}$, we define its denotation $[\![\rho]\!]_{\mathcal{T}}$ as the denotation of the entire tree $[\![\mathcal{T}]\!]$; for a non-root node $\tau$ and its parent node $\sigma$, let the edge $(\sigma, \tau)$ be labeled by semantic roles $(r, r')$; then define
    Page 6, “Generating On-the-fly Knowledge”
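Items 6, 8 and 10 above together pin down the data structure. A minimal sketch of it in Python — class and function names are ours, chosen for illustration, not the paper's:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

ROLES = {"ARG", "SUBJ", "OBJ", "IOBJ", "TIME", "MOD"}  # roles listed in item 6

@dataclass
class Node:
    word: str                                            # content word w(sigma)
    children: List[Tuple[str, str, "Node"]] = field(default_factory=list)

    def add(self, r: str, r2: str, child: "Node") -> "Node":
        assert r in ROLES and r2 in ROLES, "unknown semantic role"
        self.children.append((r, r2, child))             # edge labeled (r, r')
        return self

# "Students read books": the root carries "read", with edges
# (SUBJ, ARG) to "student" and (OBJ, ARG) to "book".
read = Node("read")
read.add("SUBJ", "ARG", Node("student")).add("OBJ", "ARG", Node("book"))

# A germ (item 10) is a specific semantic role of a node.
germ = (read, "SUBJ")
print(read.word, [(r, c.word) for r, _, c in read.children])
```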


natural language

Appears in 10 sentences as: Natural language (1) natural language (9)
In Logical Inference on Dependency-based Compositional Semantics
  1. Dependency-based Compositional Semantics (DCS) is a framework of natural language semantics with easy-to-process structures as well as strict semantics.
    Page 1, “Abstract”
  2. It is expressive enough to represent complex natural language queries on a relational database, yet simple enough to be latently learned from question-answer pairs.
    Page 1, “Introduction”
  3. In this section we describe the idea of representing natural language semantics by DCS trees, and achieving inference by computing logical relations among the corresponding abstract denotations.
    Page 1, “The Idea”
  4. DCS trees have been proposed to represent natural language semantics with a structure similar to dependency trees (Liang et al., 2011) (Figure 1).
    Page 2, “The Idea”
  5. 2.4.1 Natural language to DCS trees
    Page 3, “The Idea”
  6. To obtain DCS trees from natural language, we use Stanford CoreNLP for dependency parsing (Socher et al., 2013), and convert Stanford dependencies to DCS trees by pattern matching on POS tags and dependency labels. Currently we use the following semantic roles: ARG, SUBJ, OBJ, IOBJ, TIME and MOD.
    Page 3, “The Idea”
  7. These are algebraic properties of abstract denotations, among which we choose a set of axioms that can be handled efficiently and enable most common types of inference seen in natural language .
    Page 4, “The Idea”
  8. The pursuit of a logic more suitable for natural language inference is not new.
    Page 9, “Conclusion and Discussion”
  9. Much work has been done in mapping natural language into database queries (Cai and Yates, 2013; Kwiatkowski et al., 2013; Poon, 2013).
    Page 9, “Conclusion and Discussion”
  10. can thus be considered as an attempt to characterize a fragment of FOL that is suited for both natural language inference and transparent syntax-semantics mapping, through the choice of operations and relations on sets.
    Page 9, “Conclusion and Discussion”


similarity score

Appears in 9 sentences as: similarity score (6) similarity scores (3)
In Logical Inference on Dependency-based Compositional Semantics
  1. Aligned paths are evaluated by a similarity score to estimate their likelihood of being paraphrases.
    Page 5, “Generating On-the-fly Knowledge”
  2. Aligned paths are evaluated by a similarity score , for which we use distributional similarity of the words that appear in the paths (§4.1).
    Page 6, “Generating On-the-fly Knowledge”
  3. Only path alignments with high similarity scores can be accepted.
    Page 6, “Generating On-the-fly Knowledge”
  4. To calculate the similarity scores of path alignments, we use the sum of word vectors of the words from each path, and calculate the cosine similarity.
    Page 7, “Experiments”
  5. For example, the similarity score of the path alignment “OBJ(blame)OBJ-ARG(death) ≈ SUBJ(cause)OBJ-ARG(loss)MOD-ARG(life)” is calculated as the cosine similarity of the vectors blame+death and cause+loss+life (see the sketch after this list).
    Page 7, “Experiments”
  6. 60%, which is fairly high, given our rough estimation of the similarity score .
    Page 8, “Experiments”
  7. A major type of error is caused by ignoring semantic roles in the calculation of similarity scores.
    Page 8, “Experiments”
  8. For example, though “Italy beats Kazakhstan” cannot be inferred from “Italy is defeated by Kazakhstan”, our system does produce the path alignment “SUBJ(beat)OBJ ≈ OBJ(defeat)SUBJ” with a high similarity score.
    Page 8, “Experiments”
  9. has a corresponding path of length ≤ 5 in T with a similarity score > 0.4.
    Page 9, “Experiments”
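Items 4 and 5 above fully specify the computation: sum the word vectors on each path, then take cosine similarity. A minimal runnable sketch — the 3-dimensional toy vectors are made up for illustration; the paper uses real distributional word vectors:

```python
import math

# Toy word vectors (illustrative values only).
VEC = {
    "blame": [0.8, 0.1, 0.3], "death": [0.2, 0.9, 0.1],
    "cause": [0.7, 0.2, 0.4], "loss":  [0.1, 0.8, 0.2], "life": [0.3, 0.6, 0.1],
}

def path_vector(words):
    """Sum the vectors of the words appearing on a path."""
    return [sum(components) for components in zip(*(VEC[w] for w in words))]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Score of the alignment from item 5 above: blame+death vs. cause+loss+life.
score = cosine(path_vector(["blame", "death"]),
               path_vector(["cause", "loss", "life"]))
print(f"{score:.3f}")  # accepted only if high enough (item 9 uses > 0.4)
```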


content word

Appears in 5 sentences as: content word (5) Content words (1)
In Logical Inference on Dependency-based Compositional Semantics
  1. Our solution is to redefine DCS trees without the aid of any databases, by considering each node of a DCS tree as a content word in a sentence (which may no longer be a table in a specific database), while each edge represents a semantic relation between two words.
    Page 2, “The Idea”
  2. • Content words: a content word (e.g. …
    Page 3, “The Idea”
  3. A DCS tree $\mathcal{T} = (\mathcal{N}, \mathcal{E})$ is defined as a rooted tree, where each node $\sigma \in \mathcal{N}$ is labeled with a content word $w(\sigma)$ and each edge $(\sigma, \sigma') \in \mathcal{E} \subset \mathcal{N} \times \mathcal{N}$ is labeled with a pair of semantic roles $(r, r')$.
    Page 3, “The Idea”
  4. is the subtree of $\mathcal{T}$ rooted at $\sigma$, and $R_\sigma$ is the set of possible semantic roles for content word $w(\sigma)$ (e.g. …
    Page 4, “The Idea”
  5. Using disjointness we implemented two types of negation: (i) atomic negation: for each content word $w$ we allow a negation $\bar{w}$ of that word, characterized by the property $w \cap \bar{w} = \emptyset$; and (ii) root negation: for a DCS tree $\mathcal{T}$ and its denotation $[\![\mathcal{T}]\!]$, the negation of $\mathcal{T}$ is represented by $\neg\mathcal{T}$, meaning that $[\![\mathcal{T}]\!] = \emptyset$ in its effect (a sketch of the atomic case follows this list).
    Page 4, “The Idea”
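Item 5 above characterizes atomic negation purely set-theoretically. A minimal sketch of that property — the toy entity sets below are made up for illustration:

```python
# Treating each content word as a set of entities, its negation w-bar
# must be disjoint from it.

dog = {"rex", "fido"}
not_dog = {"tom", "jerry"}  # stands in for the negated word w-bar

def is_atomic_negation(w: set, w_neg: set) -> bool:
    """Check the defining property w ∩ w̄ = ∅."""
    return not (w & w_neg)

print(is_atomic_negation(dog, not_dog))   # True: disjoint
print(is_atomic_negation(dog, {"rex"}))   # False: overlaps with w
```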


coreference

Appears in 4 sentences as: Coreference (2) coreference (3) coreferences (1)
In Logical Inference on Dependency-based Compositional Semantics
  1. DCS trees can be extended to represent linguistic phenomena such as quantification and coreference , with additional markers introducing additional operations on tables.
    Page 2, “The Idea”
  2. Coreference: We use Stanford CoreNLP to resolve coreferences (Raghunathan et al., 2010); coreference itself is implemented as a special type of selection.
    Page 5, “The Idea”
  3. For a T/H pair, apply dependency parsing and coreference resolution.
    Page 5, “Generating On-the-fly Knowledge”
  4. (Figure residue: system pipeline — T/H → parsing and coreference resolution → DCS trees → abstract denotations.)
    Page 5, “Generating On-the-fly Knowledge”


dependency parsing

Appears in 4 sentences as: dependency parser (1) dependency parses (1) dependency parsing (2)
In Logical Inference on Dependency-based Compositional Semantics
  1. To obtain DCS trees from natural language, we use Stanford CoreNLP for dependency parsing (Socher et al., 2013), and convert Stanford dependencies to DCS trees by pattern matching on POS tags and dependency labels. Currently we use the following semantic roles: ARG, SUBJ, OBJ, IOBJ, TIME and MOD.
    Page 3, “The Idea”
  2. For a T/H pair, apply dependency parsing and coreference resolution.
    Page 5, “Generating On-the-fly Knowledge”
  3. Perform rule-based conversion from dependency parses to DCS trees, which are then translated to statements on abstract denotations (a toy sketch of this conversion follows this list).
    Page 5, “Generating On-the-fly Knowledge”
  4. Since our system uses an off-the-shelf dependency parser, and semantic representations are obtained by simple rule-based conversion from dependency trees, there will be only one (right or wrong) interpretation in the face of ambiguous sentences.
    Page 7, “Experiments”
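Items 1 and 3 above describe the conversion as pattern matching on POS tags and dependency labels. A toy sketch of such a rule table in Python — the label-to-role mappings here are illustrative guesses, not the paper's actual rule set:

```python
# Map Stanford dependency labels to DCS-tree edges labeled with
# semantic-role pairs (illustrative mappings only).
LABEL_TO_ROLES = {
    "nsubj": ("SUBJ", "ARG"),  # verb -> nominal subject
    "dobj":  ("OBJ", "ARG"),   # verb -> direct object
    "iobj":  ("IOBJ", "ARG"),  # verb -> indirect object
    "amod":  ("MOD", "ARG"),   # noun -> restrictive modifier
}

def convert(dep_edges):
    """dep_edges: (head_word, label, dependent_word) triples from a parse."""
    dcs_edges = []
    for head, label, dep in dep_edges:
        if label in LABEL_TO_ROLES:        # unmatched labels are ignored here
            r, r2 = LABEL_TO_ROLES[label]
            dcs_edges.append((head, (r, r2), dep))
    return dcs_edges

print(convert([("read", "nsubj", "student"), ("read", "dobj", "book")]))
# [('read', ('SUBJ', 'ARG'), 'student'), ('read', ('OBJ', 'ARG'), 'book')]
```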


dependency trees

Appears in 3 sentences as: dependency trees (3)
In Logical Inference on Dependency-based Compositional Semantics
  1. DCS trees have been proposed to represent natural language semantics with a structure similar to dependency trees (Liang et al., 2011) (Figure 1).
    Page 2, “The Idea”
  2. We obtain DCS trees from dependency trees, to bypass the need for a concrete database.
    Page 3, “The Idea”
  3. Since our system uses an off-the-shelf dependency parser, and semantic representations are obtained by simple rule-based conversion from dependency trees, there will be only one (right or wrong) interpretation in the face of ambiguous sentences.
    Page 7, “Experiments”


semantic representation

Appears in 3 sentences as: semantic representation (2) semantic representations (1)
In Logical Inference on Dependency-based Compositional Semantics
  1. Optimistically, we believe DCS can provide a framework of semantic representation with sufficiently wide coverage for real-world texts.
    Page 2, “The Idea”
  2. Since our system uses an off-the-shelf dependency parser, and semantic representations are obtained by simple rule-based conversion from dependency trees, there will be only one (right or wrong) interpretation in the face of ambiguous sentences.
    Page 7, “Experiments”
  3. Other directions of our future work include further exploitation of the new semantic representation .
    Page 9, “Conclusion and Discussion”
