Index of papers in Proc. ACL 2014 that mention
  • semantic parsing
Bao, Junwei and Duan, Nan and Zhou, Ming and Zhao, Tiejun
Abstract
Compared to a KB-QA system using a state-of-the-art semantic parser, our method achieves better results.
Introduction
Most previous systems tackle this task in a cascaded manner: First, the input question is transformed into its meaning representation (MR) by an independent semantic parser (Zettlemoyer and Collins, 2005; Mooney, 2007; Artzi and Zettlemoyer, 2011; Liang et al., 2011; Cai and Yates,
Introduction
Unlike existing KB-QA systems which treat semantic parsing and answer retrieval as two cascaded tasks, this paper presents a unified framework that can integrate semantic parsing into the question answering procedure directly.
Introduction
The contributions of this work are twofold: (1) We propose a translation-based KB-QA method that integrates semantic parsing and QA in one unified framework.
semantic parsing is mentioned in 11 sentences in this paper.
Topics mentioned in this paper:
Berant, Jonathan and Liang, Percy
Abstract
A central challenge in semantic parsing is handling the myriad ways in which knowledge base predicates can be expressed.
Abstract
Traditionally, semantic parsers are trained primarily from text paired with knowledge base information.
Abstract
In this paper, we turn semantic parsing on its head.
Introduction
We consider the semantic parsing problem of mapping natural language utterances into logical forms to be executed on a knowledge base (KB) (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Wong and Mooney, 2007; Kwiatkowski et al., 2010).
Introduction
Scaling semantic parsers to large knowledge bases has attracted substantial attention recently (Cai and Yates, 2013; Berant et al., 2013; Kwiatkowski et al., 2013), since it drives applications such as question answering (QA) and information extraction (IE).
Introduction
Semantic parsers need to somehow associate natural language phrases with logical predicates, e.g., they must learn that the constructions “What
semantic parsing is mentioned in 22 sentences in this paper.
Topics mentioned in this paper:
Krishnamurthy, Jayant and Mitchell, Tom M.
Abstract
We present an approach to training a joint syntactic and semantic parser that combines syntactic training information from CCGbank with semantic training information from a knowledge base via distant supervision.
Introduction
We suggest that a large populated knowledge base should play a key role in syntactic and semantic parsing: in training the parser, in resolving syntactic ambiguities when the trained parser is applied to new text, and in its output semantic representation.
Introduction
This paper presents an approach to training a joint syntactic and semantic parser using a large background knowledge base.
Introduction
We demonstrate our approach by training a joint syntactic and semantic parser, which we call ASP.
Parameter Estimation
Given these resources, the algorithm described in this section produces parameters θ for a semantic parser.
Parser Design
The features are designed to share syntactic information about a word across its distinct semantic realizations in order to transfer syntactic information from CCGbank to semantic parsing.
Prior Work
This paper combines two lines of prior work: broad coverage syntactic parsing with CCG and semantic parsing.
Prior Work
Meanwhile, work on semantic parsing has focused on producing semantic parsers for answering simple natural language questions (Zelle and Mooney, 1996; Ge and Mooney, 2005; Wong and Mooney, 2006; Wong and Mooney, 2007; Lu et al., 2008; Kate and Mooney, 2006; Zettlemoyer and Collins, 2005; Kwiatkowski et al., 2011).
Prior Work
Finally, some work has looked at applying semantic parsing to answer queries against large knowledge bases, such as YAGO (Yahya et al., 2012) and Freebase (Cai and Yates, 2013b; Cai and Yates, 2013a; Kwiatkowski et al., 2013; Berant et al., 2013).
semantic parsing is mentioned in 19 sentences in this paper.
Topics mentioned in this paper:
Lee, Kenton and Artzi, Yoav and Dodge, Jesse and Zettlemoyer, Luke
Abstract
We present an approach for learning context-dependent semantic parsers to identify and interpret time expressions.
Formal Overview
Both detection (Section 5) and resolution (Section 6) rely on the semantic parser to identify likely mentions and resolve them within context.
Introduction
In this paper, we present the first context-dependent semantic parsing approach for learning to identify and interpret time expressions, addressing all three challenges.
Introduction
Recently, methods for learning probabilistic semantic parsers have been shown to address such limitations (Angeli et al., 2012; Angeli and Uszkoreit, 2013).
Introduction
We propose to use a context-dependent semantic parser for both detection and resolution of time expressions.
Related Work
Semantic parsers map sentences to logical representations of their underlying meaning, e.g., Zelle
Related Work
introduced the idea of learning semantic parsers to resolve time expressions (Angeli et al., 2012) and showed that the approach can generalize to multiple languages (Angeli and Uszkoreit, 2013).
Related Work
Similarly, Bethard demonstrated that a hand-engineered semantic parser is also effective (Bethard, 2013b).
Representing Time
(2012), who introduced semantic parsing for this task.
semantic parsing is mentioned in 14 sentences in this paper.
Topics mentioned in this paper:
Riezler, Stefan and Simianer, Patrick and Haas, Carolin
Abstract
We show how to generate responses by grounding SMT in the task of executing a semantic parse of a translated query against a database.
Abstract
Experiments on the GEOQUERY database show an improvement of about 6 points in F1-score for response-based learning over learning from references only on returning the correct answer from a semantic parse of a translated query.
Grounding SMT in Semantic Parsing
In this paper, we present a proof-of-concept of our ideas of embedding SMT into simulated world environments as used in semantic parsing.
Grounding SMT in Semantic Parsing
Embedding SMT in a semantic parsing scenario means to define translation quality by the ability of a semantic parser to construct a meaning representation from the translated query, which returns the correct answer when executed against the database.
Grounding SMT in Semantic Parsing
The diagram in Figure 1 gives a sketch of response-based learning from semantic parsing in the geographical domain.
Introduction
In this paper, we propose a novel approach for learning and evaluation in statistical machine translation (SMT) that borrows ideas from response-based learning for grounded semantic parsing .
Introduction
Building on prior work in grounded semantic parsing, we generate translations of queries, and receive feedback by executing semantic parses of translated queries against the database.
Introduction
Successful response is defined as receiving the same answer from the semantic parses for the translation and the original query.
Related Work
For example, in semantic parsing, the learning goal is to produce and successfully execute a meaning representation.
Related Work
Recent attempts to learn semantic parsing from question-answer pairs without resorting to annotated logical forms have been presented by Kwiatkowski et al.
Related Work
The algorithms presented in these works are variants of structured prediction that take executability of semantic parses into account.
semantic parsing is mentioned in 26 sentences in this paper.
Topics mentioned in this paper:
Yao, Xuchen and Van Durme, Benjamin
Abstract
Answering natural language questions using the Freebase knowledge base has recently been explored as a platform for advancing the state of the art in open domain semantic parsing.
Background
Finally, the KB community has developed other means for QA without semantic parsing (Lopez et al., 2005; Frank et al., 2007; Unger et al., 2012; Yahya et al., 2012; Shekarpour et al., 2013).
Conclusion
We hope that this result establishes a new baseline against which semantic parsing researchers can measure their progress towards deeper language understanding and answering of human questions.
Experiments
One question of interest is whether our system, aided by the massive web data, can be fairly compared to the semantic parsing approaches (note that Berant et al.
Introduction
The AI community has tended to approach this problem with a focus on first understanding the intent of the question, via shallow or deep forms of semantic parsing (cf.
Introduction
bounded by the accuracy of the original semantic parsing, and the well-formedness of resultant database queries.
Introduction
Researchers in semantic parsing have recently explored QA over Freebase as a way of moving beyond closed domains such as GeoQuery (Tang and Mooney, 2001).
semantic parsing is mentioned in 8 sentences in this paper.
Topics mentioned in this paper:
Liu, Changsong and She, Lanbo and Fang, Rui and Chai, Joyce Y.
Evaluation and Discussion
We first applied the semantic parser and coreference classifier as described in Section 4.1 to process each dialogue, and then built a graph representation based on the automatic processing results at the end of the dialogue.
Probabilistic Labeling for Reference Grounding
Our system first processes the data using automatic semantic parsing and coreference resolution.
Probabilistic Labeling for Reference Grounding
For semantic parsing, we use a rule-based CCG parser (Bozsahin et al., 2005) to parse each utterance into a formal semantic representation.
Probabilistic Labeling for Reference Grounding
Based on the semantic parsing and pairwise coreference resolution results, our system further builds a graph representation to capture the collaborative discourse and formulate referential grounding as a probabilistic labeling problem, as described next.
Related Work
These works have provided valuable insights on how to manually and/or automatically build key components (e.g., semantic parsing, grounding functions between visual features and words, mapping procedures) for a situated referential grounding system.
semantic parsing is mentioned in 7 sentences in this paper.
Topics mentioned in this paper:
Lo, Chi-kiu and Beloucif, Meriem and Saers, Markus and Wu, Dekai
Related Work
MEANT is easily portable to other languages, requiring only an automatic semantic parser and a large monolingual corpus in the output language for identifying the semantic structures and the lexical similarity between the semantic role fillers of the reference and translation.
Related Work
Apply an input language automatic shallow semantic parser to the foreign input and an output language automatic shallow semantic parser to the MT output.
Related Work
(Figure 2 shows examples of automatic shallow semantic parses on both foreign input and MT output.)
XMEANT: a cross-lingual MEANT
To aggregate individual lexical translation probabilities into phrasal similarities between cross-lingual semantic role fillers, we compared two natural approaches to generalizing MEANT’s method of comparing semantic parses, as described below.
XMEANT: a cross-lingual MEANT
The first natural approach is to extend MEANT’s f-score based method of aggregating semantic parse accuracy, so as to also apply to aggregat-
semantic parsing is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Flanigan, Jeffrey and Thomson, Sam and Carbonell, Jaime and Dyer, Chris and Smith, Noah A.
Introduction
Semantic parsing is the problem of mapping natural language strings into meaning representations.
Introduction
To date, a graph transducer-based semantic parser has not been published, although the Bolinas toolkit (http://www.isi.edu/publications/licensed-sw/bolinas/) contains much of the necessary infrastructure.
Introduction
comparison of these two approaches is beyond the scope of this paper, we emphasize that—as has been observed with dependency parsing—a diversity of approaches can shed light on complex problems such as semantic parsing.
Related Work
While all semantic parsers aim to transform natural language text to a formal representation of its meaning, there is wide variation in the meaning representations and parsing techniques used.
Related Work
In contrast, semantic dependency parsing—in which the vertices in the graph correspond to the words in the sentence—is meant to make semantic parsing feasible for broader textual domains.
semantic parsing is mentioned in 5 sentences in this paper.
Topics mentioned in this paper: