Index of papers in Proc. ACL that mention
  • logical forms
Espinosa, Dominic and White, Michael and Mehay, Dennis
Background
Edges are grouped into equivalence classes when they have the same syntactic category and cover the same parts of the input logical form.
Background
Additionally, to realize a wide range of paraphrases, OpenCCG implements an algorithm for efficiently generating from disjunctive logical forms (White, 2006a).
Background
the one that covers the most elementary predications in the input logical form, with ties broken according to the n-gram score.
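That selection rule can be stated precisely; here is a minimal sketch in Python (the interface is invented for illustration, not OpenCCG's actual API):

    # Prefer the realization covering the most elementary predications,
    # breaking ties by n-gram language-model score.
    def select_best_realization(candidates):
        """candidates: list of (covered_predications: set, ngram_score: float, text: str)."""
        return max(candidates, key=lambda c: (len(c[0]), c[1]))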
Introduction
We have adapted this multitagging approach to lexical category assignment for realization using the CCG-based natural language toolkit OpenCCG. Instead of basing category assignment on linear word and POS context, however, we predict lexical categories based on contexts within a directed graph structure representing the logical form (LF) of a proposition to be realized.
logical forms is mentioned in 13 sentences in this paper.
Liang, Percy and Jordan, Michael and Klein, Dan
Abstract
Compositional question answering begins by mapping questions to logical forms, but training a semantic parser to perform this mapping typically requires the costly annotation of the target logical forms.
Abstract
In this paper, we learn to map questions to answers via latent logical forms, which are induced automatically from question-answer pairs.
Abstract
In tackling this challenging learning problem, we introduce a new semantic representation which highlights a parallel between dependency syntax and efficient evaluation of logical forms.
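To make that evaluation parallel concrete, here is a toy sketch (all names and data invented, not the paper's representation) of computing the denotation of a small conjunctive logical form, city(x) AND in(x, texas), bottom-up against a database:

    # A unary predicate intersected with a binary join, mirroring the
    # tree-shaped evaluation the abstract alludes to.
    DB = {
        "city": {"austin", "boston"},
        "in": {("austin", "texas"), ("boston", "massachusetts")},
    }

    def denotation():
        cities = DB["city"]                                    # city(x)
        in_texas = {a for (a, b) in DB["in"] if b == "texas"}  # in(x, texas)
        return cities & in_texas                               # -> {"austin"}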
Introduction
Answering these types of complex questions compositionally involves first mapping the questions into logical forms (semantic parsing).
Introduction
Supervised semantic parsers (Zelle and Mooney, 1996; Tang and Mooney, 2001; Ge and Mooney, 2005; Zettlemoyer and Collins, 2005; Kate and Mooney, 2007; Zettlemoyer and Collins, 2007; Wong and Mooney, 2007; Kwiatkowski et al., 2010) rely on manual annotation of logical forms, which is expensive.
Introduction
(2010), we obviate the need for annotated logical forms by considering the end-to-end problem of mapping questions to answers.
logical forms is mentioned in 32 sentences in this paper.
Poon, Hoifung
Experiments
The ZC07 dataset contains annotated logical forms for each sentence, which we do not use.
Experiments
Since our goal is not to produce a specific logical form, we directly evaluate on the end-to-end task of translating questions into database queries and measure question-answering accuracy.
Experiments
Both ZC07 and FUBL used annotated logical forms in training, whereas GUSP-FULL and GUSP++ did not.
Grounded Unsupervised Semantic Parsing
GUSP is unsupervised and does not require example logical forms or question-answer pairs.
Introduction
Semantic parsing maps text to a formal meaning representation such as logical forms or structured queries.
Introduction
However, although these methods exonerate annotators from mastering specialized logical forms, finding the answers for complex questions…
Introduction
While their approach completely obviates the need for direct supervision, their target logical forms are self-induced clusters, which do not align with an existing database or ontology.
logical forms is mentioned in 14 sentences in this paper.
Berant, Jonathan and Liang, Percy
Abstract
Given an input utterance, we first use a simple method to deterministically generate a set of candidate logical forms with a canonical realization in natural language for each.
Abstract
Then, we use a paraphrase model to choose the realization that best paraphrases the input, and output the corresponding logical form.
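A hedged sketch of that two-stage pipeline (the candidate generator and paraphrase scorer are assumed interfaces, not the paper's models):

    def parse_via_paraphrasing(utterance, generate_candidates, paraphrase_score):
        """generate_candidates(u) -> [(logical_form, canonical_utterance)];
        paraphrase_score(u, c) -> float."""
        candidates = generate_candidates(utterance)
        best_lf, _ = max(candidates,
                         key=lambda pair: paraphrase_score(utterance, pair[1]))
        return best_lf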
Introduction
We consider the semantic parsing problem of mapping natural language utterances into logical forms to be executed on a knowledge base (KB) (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Wong and Mooney, 2007; Kwiatkowski et al., 2010).
Introduction
Figure 1: Semantic parsing via paraphrasing: For each candidate logical form (in red), we generate canonical utterances (in purple).
Introduction
Given an input utterance, we first use a simple deterministic procedure to construct a manageable set of candidate logical forms (ideally, we would generate canonical utterances for all possible logical forms, but this is intractable).
logical forms is mentioned in 48 sentences in this paper.
Krishnamurthy, Jayant and Mitchell, Tom M.
Abstract
The trained parser produces a full syntactic parse of any sentence, while simultaneously producing logical forms for portions of the sentence that have a semantic representation within the parser’s predicate vocabulary.
Abstract
A semantic evaluation demonstrates that this parser produces logical forms better than both comparable prior work and a pipelined syntax-then-semantics approach.
Introduction
Our parser produces a full syntactic parse of every sentence, and furthermore produces logical forms for portions of the sentence that have a semantic representation within the parser’s predicate vocabulary.
Introduction
For example, given a phrase like “my favorite town in California,” our parser will assign a logical form like λx.CITY(x) ∧ LOCATEDIN(x, CALIFORNIA) to the “town in California” portion.
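Rendered as executable code against a toy knowledge base (the data are invented for illustration), that logical form is just a filter over entities:

    # λx. CITY(x) ∧ LOCATEDIN(x, CALIFORNIA)
    CITY = {"sacramento", "portland"}
    LOCATEDIN = {("sacramento", "california"), ("portland", "oregon")}

    lf = lambda x: x in CITY and (x, "california") in LOCATEDIN
    print([x for x in sorted(CITY) if lf(x)])   # -> ['sacramento']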
Introduction
ASP produces a full syntactic analysis of every sentence while simultaneously producing logical forms containing any of 61 category and 69 relation predicates.
Parser Design
The input to the parser is a part-of-speech tagged sentence, and the output is a syntactic CCG parse tree, along with zero or more logical forms representing the semantics of subspans of the sentence.
Parser Design
These logical forms are constructed using category and relation predicates from a broad coverage knowledge base.
Parser Design
The parser uses category and relation predicates from a broad coverage knowledge base both to construct logical forms and to parametrize the parsing model.
Prior Work
This line of work has typically used a corpus of sentences with annotated logical forms to train the parser.
logical forms is mentioned in 65 sentences in this paper.
Lee, Kenton and Artzi, Yoav and Dodge, Jesse and Zettlemoyer, Luke
Detection
We use a CKY algorithm to efficiently determine which phrases the CCG grammar can parse and only allow logical forms for which there exists some context in which they would produce a valid time expression, e.g.
Parsing Time Expressions
First, we use a CCG to generate an initial logical form for the mention.
Parsing Time Expressions
initial logical form, as appropriate for its context.
Parsing Time Expressions
Finally, the logical form is resolved to a TIMEX3 value using a deterministic process.
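To make the final, deterministic step concrete, here is a minimal sketch (under an assumed toy representation, not the paper's code) resolving a logical form such as next(friday) against a reference date taken from the document context:

    import datetime

    def resolve_next_weekday(weekday, reference):
        """weekday: 0=Monday .. 6=Sunday; returns a TIMEX3-style ISO value."""
        days_ahead = (weekday - reference.weekday() - 1) % 7 + 1
        return (reference + datetime.timedelta(days=days_ahead)).isoformat()

    print(resolve_next_weekday(4, datetime.date(2014, 3, 3)))  # -> 2014-03-07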
logical forms is mentioned in 13 sentences in this paper.
Cai, Qingqing and Yates, Alexander
Experiments
Figure 2: Example questions with their logical forms.
Experiments
The logical forms make use of Freebase symbols as logical constants, as well as a few additional symbols such as count and argmin, to allow for aggregation queries.
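For concreteness, an aggregation query of this kind might look as follows (a hypothetical example in the same notation; the constant names are invented, not drawn from the dataset):

    count(λx. film(x) ∧ directed_by(x, SPIELBERG))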
Experiments
We also created a dataset of alignments from these annotated questions by creating an alignment for each Freebase relation mentioned in the logical form for a question, paired with a manually-selected word from the question.
Extending a Semantic Parser Using a Schema Alignment
Using Λ, UBL selects a logical form z for a sentence S by selecting the z with the most likely parse derivations y: h(S) = argmax_z Σ_y p(y, z | S; θ, Λ).
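In code, that argmax with its marginalization over derivations amounts to the following (an illustrative sketch, not UBL's implementation):

    from collections import defaultdict

    def select_logical_form(scored_derivations):
        """scored_derivations: iterable of (logical_form, probability),
        one pair per parse derivation y."""
        totals = defaultdict(float)
        for z, p in scored_derivations:
            totals[z] += p          # marginalize over derivations y
        return max(totals, key=totals.get)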
Extending a Semantic Parser Using a Schema Alignment
The probabilistic model is a log-linear model with features for lexical entries used in the parse, as well as indicator features for relation-argument pairs in the logical form, to capture selectional preferences.
Extending a Semantic Parser Using a Schema Alignment
Inference (parsing) and parameter estimation are driven by standard dynamic programming algorithms (Clark and Curran, 2007), while lexicon induction is based on a novel search procedure through the space of possible higher-order logic unification operations that yield the desired logical form for a training sentence.
Introduction
[Figure showing (sentence, logical form) pairs, Freebase, and extracted relations]
logical forms is mentioned in 9 sentences in this paper.
Beltagy, Islam and Erk, Katrin and Mooney, Raymond
Background
The MLN constructed to determine the probability of a given entailment includes the logical forms for both T and H as well as soft inference rules that are constructed from distributional information.
Background
To determine an entailment probability, first, the two sentences are mapped to logical representations using Boxer (Bos, 2008), a tool for wide-coverage semantic analysis that maps a CCG (Combinatory Categorial Grammar) parse into a lexically-based logical form.
Evaluation
This system uses PSL to compute similarity of logical forms but does not use distributional information on lexical or phrasal similarity.
PSL for STS
First, it is explicitly designed to support efficient inference, and therefore scales better to longer sentences with more complex logical forms.
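That efficiency comes from PSL's use of soft truth values in [0, 1] combined with Łukasiewicz connectives, whose linearity keeps most-probable-interpretation inference a convex problem; a minimal sketch of the standard connective definitions (general PSL background, not the paper's code):

    def l_and(a, b):       # Lukasiewicz conjunction
        return max(0.0, a + b - 1.0)

    def l_or(a, b):        # Lukasiewicz disjunction
        return min(1.0, a + b)

    def l_implies(a, b):   # rule truth; 1 minus this is its distance to satisfaction
        return min(1.0, 1.0 - a + b)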
PSL for STS
Given the logical forms for a pair of sentences, a text T and a hypothesis H, and given a set of weighted rules derived from the distributional semantics (as explained in section 2.6) composing the knowledge base KB, we build a PSL model that supports determining the truth value of H in the most probable interpretation (i.e.
PSL for STS
Parsing into logical form gives:
logical forms is mentioned in 6 sentences in this paper.
Poon, Hoifung and Domingos, Pedro
Background 2.1 Ontology Learning
It can be viewed as a structured prediction problem, where a semantic parse is formed by partitioning the input sentence (or a syntactic analysis such as a dependency tree) into meaning units and assigning each unit to the logical form representing an entity or relation (Figure 1).
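An invented illustration of that structured prediction view (the strings are made up, in the style of the paper's biomedical examples): the sentence "IL-4 induces CD11B" partitioned into meaning units, each assigned a Davidsonian logical-form fragment:

    semantic_parse = {
        ("induces",): "induce(e1)",       # core event
        ("IL-4",):    "agent(e1, IL-4)",  # argument units attach to e1
        ("CD11B",):   "theme(e1, CD11B)",
    }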
Background 2.1 Ontology Learning
Top: semantic parsing converts an input sentence into logical form in Davidsonian semantics.
Background 2.1 Ontology Learning
parser extracts knowledge from input text and converts it into logical form (the semantic parse), which can then be used in logical and probabilistic inference and support end tasks such as question answering.
Introduction
We propose OntoUSP (Ontological USP), a system that learns an ISA hierarchy over clusters of logical expressions, and populates it by translating sentences to logical form .
logical forms is mentioned in 5 sentences in this paper.
Angeli, Gabor and Uszkoreit, Jakob
Related Work
For example, Zettlemoyer and Collins (2007) learn a mapping from textual queries to a logical form .
Related Work
Importantly, the logical form of these parses contains all of the predicates and entities used in the parse — unlike the label provided in our case, where a grounded time can correspond to any of a number of latent parses.
Related Work
(2011) relax supervision to require only annotated answers rather than full logical forms.
logical forms is mentioned in 4 sentences in this paper.
Packard, Woodley and Bender, Emily M. and Read, Jonathon and Oepen, Stephan and Dridan, Rebecca
Discussion and Comparison
Both systems map from logical forms with explicit representations of scope of negation out to string-based annotations in the format provided by the Shared Task gold standard.
Discussion and Comparison
The main points of difference are in the robustness of the system and in the degree of tailoring of both the rules for determining scope on the logical form level and the rules for handling semantically vacuous elements.
Related Work
(2012) describe some amount of tailoring of the Boxer lexicon to include more of the Shared Task scope cues among those that produce the negation operator in the DRSs, but otherwise the system appears to directly take the notion of scope of negation from the DRS and project it out to the string, with one caveat: As with the logical-form representations we use, the DRS logical forms do not include function words as predicates in the semantics.
System Description
From these underspecified representations of possible scopal configurations, a scope resolution component can spell out the full range of fully-connected logical forms (Koller and Thater, 2005), but it turns out that such enumeration is not relevant here: the notion of scope encoded in the Shared Task annotations is not concerned with the relative scope of quantifiers and negation, such as the two possible readings of (2) represented informally below:
logical forms is mentioned in 4 sentences in this paper.
Veale, Tony and Hao, Yanfen and Li, Guofu
Related Work
(1999), in which each of the textual glosses in WordNet (Fellbaum, 1998) is linguistically analyzed to yield a sense-tagged logical form, is an example of the former approach.
Tagging and Mapping of Similes
and “reputable” are associated with the same logical form in HowNet, which defines them as a specialization of ReputationValue.
Tagging and Mapping of Similes
This allows us to safely identify the corresponding Chinese term with this logical form.
logical forms is mentioned in 3 sentences in this paper.
Gyawali, Bikash and Gardent, Claire
Related Work
Earlier work on concept-to-text generation mainly focuses on generation from logical forms using rule-based methods.
Related Work
(Wang, 1980) uses handwritten rules to generate sentences from an extended predicate logic formalism; (Shieber et al., 1990) introduces a head-driven algorithm for generating from logical forms; (Kay, 1996) defines a chart based algorithm which enhances efficiency by minimising the number of semantically incomplete phrases being built; and (Shemtov, 1996) presents an extension of the chart based generation algorithm presented in (Kay, 1996) which supports the generation of multiple paraphrases from underspecified semantic input.
Related Work
(Lu and Ng, 2011) focuses on generating natural language sentences from logical form (i.e., lambda terms) using a synchronous context-free grammar.
logical forms is mentioned in 3 sentences in this paper.
Riezler, Stefan and Simianer, Patrick and Haas, Carolin
Experiments
(2012). The dataset includes 880 English questions and their logical forms.
Experiments
This parser is itself based on SMT, trained on parallel data consisting of English queries and linearized logical forms, and on a language model trained on linearized logical forms.
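A minimal sketch (under an assumed nested-tuple representation, not the paper's code) of the linearization that turns a logical form into the flat token sequence an SMT system can be trained on:

    def linearize(lf):
        """lf: a constant string or a nested tuple (head, arg1, ...)."""
        if isinstance(lf, str):
            return lf
        head, *args = lf
        return f"{head} ( {' , '.join(linearize(a) for a in args)} )"

    print(linearize(("answer", ("capital", ("stateid", "texas")))))
    # -> answer ( capital ( stateid ( texas ) ) )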
Related Work
Recent attempts to learn semantic parsing from question-answer pairs without resorting to annotated logical forms have been presented by Kwiatkowski et al.
logical forms is mentioned in 3 sentences in this paper.