Index of papers in Proc. ACL that mention
  • syntactic parsing
Ge, Ruifang and Mooney, Raymond
Abstract
Unlike previous methods, it exploits an existing syntactic parser to produce disambiguated parse trees that drive the compositional semantic interpretation.
Ensuring Meaning Composition
3 only works if the syntactic parse tree strictly follows the predicate-argument structure of the MR, since meaning composition at each node is assumed to combine a predicate with one of its arguments.
Ensuring Meaning Composition
1(a) according to the syntactic parse in Fig.
Ensuring Meaning Composition
Macro-predicates are introduced as needed during training in order to ensure that each MR in the training set can be composed using the syntactic parse of its corresponding NL given reasonable assignments of predicates to words.
Introduction
Previous methods for learning semantic parsers do not utilize an existing syntactic parser that provides disambiguated parse trees.1 However, accurate syntactic parsers are available for many
Semantic Parsing Framework
The framework is composed of three components: 1) an existing syntactic parser to produce parse trees for NL sentences; 2) learned semantic knowledge
Semantic Parsing Framework
5), including a semantic lexicon to assign possible predicates (meanings) to words, and a set of semantic composition rules to construct possible MRs for each internal node in a syntactic parse given its children’s MRs; and 3) a statistical disambiguation model (cf.
Semantic Parsing Framework
First, the syntactic parser produces a parse tree for the NL sentence.
syntactic parsing is mentioned in 31 sentences in this paper.
Hirao, Tsutomu and Suzuki, Jun and Isozaki, Hideki
A Syntax Free Sequence-oriented Sentence Compression Method
As an alternative to syntactic parsing, we propose two novel features, intra-sentence positional term weighting (IPTW) and the patched language model (PLM) for our syntax-free sentence compressor.
Abstract
Conventional sentence compression methods employ a syntactic parser to compress a sentence without changing its meaning.
Abstract
As an alternative to syntactic parsing, we propose a novel term weighting technique based on the positional information within the original sentence and a novel language model that combines statistics from the original sentence and a general corpus.
Abstract
Because our method does not use a syntactic parser, it is 4.3 times faster than Hori’s method.
Analysis of reference compressions
In addition, sentence compression methods that strongly depend on syntactic parsers have two problems: ‘parse error’ and ‘decoding speed.’ 44% of sentences output by a state-of-the-art Japanese dependency parser contain at least one error (Kudo and Matsumoto, 2005).
Conclusions
It is significantly superior to the methods that employ syntactic parsers.
Conclusions
As an alternative to the syntactic parser, we proposed two novel features, intra-sentence positional term weighting (IPTW) and the patched language model (PLM), and showed their effectiveness by conducting automatic and human evaluations.
Introduction
In accordance with this idea, conventional sentence compression methods employ syntactic parsers .
Introduction
To maintain the subject-predicate relationship in the compressed sentence and retain fluency without using syntactic parsers , we propose two novel features: intra-sentence positional term weighting (IPTW) and the patched language model (PLM).
Introduction
superior to conventional sequence-oriented methods that employ syntactic parsers while being about 4.3 times faster.
Related work
Moreover, their use of syntactic parsers seriously degrades the decoding speed.
syntactic parsing is mentioned in 11 sentences in this paper.
Krishnamurthy, Jayant and Mitchell, Tom M.
Abstract
The trained parser produces a full syntactic parse of any sentence, while simultaneously producing logical forms for portions of the sentence that have a semantic representation within the parser’s predicate vocabulary.
Introduction
Integrating syntactic parsing with semantics has long been a goal of natural language processing and is expected to improve both syntactic and semantic processing.
Introduction
For example, semantics could help predict the differing prepositional phrase attachments in “I caught the butterfly with the net” and “I caught the butterfly with the spots.” A joint analysis could also avoid propagating syntactic parsing errors into semantic processing, thereby improving performance.
Introduction
ideally improve the parser’s ability to solve difficult syntactic parsing problems, as in the examples above.
Prior Work
This paper combines two lines of prior work: broad coverage syntactic parsing with CCG and semantic parsing.
Prior Work
Broad coverage syntactic parsing with CCG has produced both resources and successful parsers.
Prior Work
The parser presented in this paper can be viewed as a combination of both a broad coverage syntactic parser and a semantic parser trained using distant supervision.
syntactic parsing is mentioned in 24 sentences in this paper.
Miyao, Yusuke and Saetre, Rune and Sagae, Kenji and Matsuzaki, Takuya and Tsujii, Jun'ichi
Conclusion and Future Work
We have presented our attempts to evaluate syntactic parsers and their representations that are based on different frameworks: dependency parsing, phrase structure parsing, or deep parsing.
Experiments
(2006) do not rely on syntactic parsing, while the former applied SVMs with kernels on surface strings and the latter is similar to our baseline method.
Introduction
Parsing technologies have improved considerably in the past few years, and high-performance syntactic parsers are no longer limited to PCFG-based frameworks (Charniak, 2000; Klein and Manning, 2003; Charniak and Johnson, 2005; Petrov and Klein, 2007), but also include dependency parsers (McDonald and Pereira, 2006; Nivre and Nilsson, 2005; Sagae and Tsujii, 2007) and deep parsers (Kaplan et al., 2004; Clark and Curran, 2004; Miyao and Tsujii, 2008).
Introduction
However, efforts to perform extensive comparisons of syntactic parsers based on different frameworks have been limited.
Introduction
In this paper, we present a comparative evaluation of syntactic parsers and their output representations based on different frameworks: dependency parsing, phrase structure parsing, and deep parsing.
Related Work
Though the evaluation of syntactic parsers has been a major concern in the parsing community, and a couple of works have recently presented the comparison of parsers based on different frameworks, their methods were based on the comparison of the parsing accuracy in terms of a certain intermediate parse representation (Ringger et al., 2004; Kaplan et al., 2004; Briscoe and Carroll, 2006; Clark and Curran, 2007; Miyao et al., 2007; Clegg and Shepherd, 2007; Pyysalo et al., 2007b; Pyysalo et al., 2007a; Sagae et al., 2008).
syntactic parsing is mentioned in 8 sentences in this paper.
Shindo, Hiroyuki and Miyao, Yusuke and Fujino, Akinori and Nagata, Masaaki
Abstract
We propose Symbol-Refined Tree Substitution Grammars (SR-TSGs) for syntactic parsing.
Introduction
Syntactic parsing has played a central role in natural language processing.
Introduction
tree fragments and symbol refinement work complementarily for syntactic parsing.
Introduction
In this paper, we propose Symbol-Refined Tree Substitution Grammars (SR-TSGs) for syntactic parsing.
Symbol-Refined Tree Substitution Grammars
In this section, we propose Symbol-Refined Tree Substitution Grammars (SR-TSGs) for syntactic parsing.
syntactic parsing is mentioned in 5 sentences in this paper.
Yang, Nan and Li, Mu and Zhang, Dongdong and Yu, Nenghai
Abstract
In this work, we further extend this line of exploration and propose a novel but simple approach, which utilizes a ranking model based on word order precedence in the target language to reposition nodes in the syntactic parse tree of a source sentence.
Abstract
The ranking model is automatically derived from word aligned parallel data with a syntactic parser for source language based on both lexical and syntactical features.
Introduction
The most notable solution to this problem is adopting syntax-based SMT models, especially methods making use of source side syntactic parse trees.
Introduction
The other is called syntax pre-reordering — an approach that re-positions source words to approximate target language word order as much as possible based on the features from source syntactic parse trees.
Introduction
In this paper, we continue this line of work and address the problem of word reordering based on source syntactic parse trees for SMT.
syntactic parsing is mentioned in 5 sentences in this paper.
Abend, Omri and Reichart, Roi and Rappoport, Ari
Abstract
The algorithm makes use of a fully unsupervised syntactic parser, using its output in order to detect clauses and gather candidate argument collocation statistics.
Conclusion
The recent availability of unsupervised syntactic parsers has offered an opportunity to conduct research on SRL, without reliance on supervised syntactic annotation.
Related Work
Using VerbNet along with the output of a rule-based chunker (in 2004) and a supervised syntactic parser (in 2005), they spot instances in the corpus that are very similar to the syntactic patterns listed in VerbNet.
Related Work
Clause information has been applied to accelerating a syntactic parser (Glaysher and Moldovan, 2006).
syntactic parsing is mentioned in 4 sentences in this paper.
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Character-Level Dependency Tree
A transition-based framework with global learning and beam search decoding (Zhang and Clark, 2011) has been applied to a number of natural language processing tasks, including word segmentation, POS-tagging and syntactic parsing (Zhang and Clark, 2010; Huang and Sagae, 2010; Bohnet and Nivre, 2012; Zhang et al., 2013).
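As a minimal, hypothetical sketch of the transition-based machinery this excerpt refers to, the arc-standard action loop can be written in a few lines; the learned scoring model and the beam over action sequences used by real parsers are omitted, and the action sequence is simply supplied by the caller:

```python
# Minimal arc-standard transition system (illustrative sketch only).
# A real transition-based parser scores candidate actions with a learned
# model and keeps the top-k action sequences in a beam; here the action
# sequence is given directly.

def parse(words, actions):
    """Apply SHIFT / LEFT-ARC / RIGHT-ARC actions; return (head, dependent) arcs."""
    buffer = list(range(len(words)))   # indices of words not yet shifted
    stack, arcs = [], []
    for act in actions:
        if act == "SHIFT":             # move next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":        # second-from-top depends on top
            dependent = stack.pop(-2)
            arcs.append((stack[-1], dependent))
        elif act == "RIGHT-ARC":       # top depends on second-from-top
            dependent = stack.pop()
            arcs.append((stack[-1], dependent))
    return arcs

# "He eats fish": 'He' and 'fish' both attach to 'eats' (index 1).
arcs = parse(["He", "eats", "fish"],
             ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"])
```

The same shift/reduce skeleton, with enlarged action sets, underlies the joint segmentation, tagging, and parsing models discussed in this line of work.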
Character-Level Dependency Tree
Both are crucial to well-established features for word segmentation, POS-tagging and syntactic parsing.
Character-Level Dependency Tree
(2013) was the first to perform Chinese syntactic parsing over characters.
Introduction
Second, word internal structures can also be useful for syntactic parsing.
syntactic parsing is mentioned in 4 sentences in this paper.
Li, Junhui and Marton, Yuval and Resnik, Philip and Daumé III, Hal
Experiments
To obtain syntactic parse trees and semantic roles on the tuning and test datasets, we first parse the source sentences with the Berkeley Parser (Petrov and Klein, 2007), trained on the Chinese Treebank 7.0 (Xue et al., 2005).
Experiments
Since the syntactic parses of the tuning and test data contain 29 types of constituent labels and 35 types of POS tags, we have 29 types of XP+ features and 64 types of XP= features.
Related Work
The reordering rules were either manually designed (Collins et al., 2005; Wang et al., 2007; Xu et al., 2009; Lee et al., 2010) or automatically learned (Xia and McCord, 2004; Gen-zel, 2010; Visweswariah et al., 2010; Khalilov and Sima’an, 2011; Lerner and Petrov, 2013), using syntactic parses.
Related Work
(2012) obtained word order by using a reranking approach to reposition nodes in syntactic parse trees.
syntactic parsing is mentioned in 4 sentences in this paper.
Li, Sujian and Wang, Liang and Cao, Ziqiang and Li, Wenjie
Add arc <eC,ej> to GC with
The other two types of features, which are related to length and syntactic parsing, only promote the performance slightly.
Add arc <eC,ej> to GC with
Since the RST tree is similar to the constituency based syntactic tree except that the constituent nodes are different, the syntactic parsing techniques have been borrowed for discourse parsing (Soricut and Marcu, 2003; Baldridge and Lascarides, 2005; Sagae, 2009; Hernault et al., 2010b; Feng and Hirst, 2012).
Introduction
Since such a hierarchical discourse tree is analogous to a constituency based syntactic tree except that the constituents in the discourse trees are text spans, previous researches have explored different constituency based syntactic parsing techniques (e.g.
Introduction
First, it is difficult to design a set of production rules as in syntactic parsing, since there are no determinate generative rules for the interior text spans.
syntactic parsing is mentioned in 4 sentences in this paper.
Hermann, Karl Moritz and Das, Dipanjan and Weston, Jason and Ganchev, Kuzman
Abstract
We present a novel technique for semantic frame identification using distributed representations of predicates and their syntactic context; this technique leverages automatic syntactic parses and a generic set of word embeddings.
Discussion
combination of two syntactic parsers as input.
Frame Identification with Embeddings
Formally, let x represent the actual sentence with a marked predicate, along with the associated syntactic parse tree; let our initial representation of the predicate context be Suppose that the word embeddings we start with are of dimension n. Then g is a function from a parsed sentence x to R^{nk}, where k is the number of possible syntactic context types.
Overview
We could represent the syntactic context of runs as a vector with blocks for all the possible dependents warranted by a syntactic parser; for example, we could assume that positions 0 .
syntactic parsing is mentioned in 4 sentences in this paper.
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Conclusions and Future Work
Our parser jointly performs word segmentation, POS tagging and syntactic parsing.
Experiments
The results also demonstrate that the annotated word structures are highly effective for syntactic parsing, giving an absolute improvement of 0.82% in phrase-structure parsing accuracy over the joint model with flat word structures.
Introduction
Words are treated as the atomic units in syntactic parsing , machine translation, question answering and other NLP tasks.
Introduction
In this paper, we investigate Chinese syntactic parsing with character—level information by extending the notation of phrase-structure
syntactic parsing is mentioned in 4 sentences in this paper.
Titov, Ivan and Klementiev, Alexandre
Empirical Evaluation
Table 1: Argument clustering performance with gold argument identification and gold syntactic parses on CoNLL 2008 shared-task dataset.
Empirical Evaluation
We report the results using gold argument identification and gold syntactic parses in order to focus the evaluation on the argument labeling stage and to minimize the noise due to automatic syntactic annotations.
Empirical Evaluation
Table 2: Results on CoNLL 2009 with automatic argument identification and automatic syntactic parses.
Introduction
Learning in the context of multiple languages simultaneously has been shown to be beneficial to a number of NLP tasks from morphological analysis to syntactic parsing (Kuhn, 2004; Snyder and Barzilay, 2010; McDonald et al., 2011).
syntactic parsing is mentioned in 4 sentences in this paper.
Jiang, Long and Yu, Mo and Zhou, Ming and Liu, Xiaohua and Zhao, Tiejun
Approach Overview
In order to generate such features, much NLP work has to be done beforehand, such as tweet normalization, POS tagging, word stemming, and syntactic parsing.
Approach Overview
For syntactic parsing we use a Maximum Spanning Tree dependency parser (McDonald et al., 2005).
Introduction
For instance, in the second example, using syntactic parsing, we know that “Windows 7” is connected to “better” by a copula, while “Vista” is connected to “better” by a preposition.
Target-dependent Sentiment Classification
In this paper, we rely on the syntactic parse tree to satisfy this need.
syntactic parsing is mentioned in 4 sentences in this paper.
Liu, Kang and Xu, Liheng and Zhao, Jun
Introduction
To handle this problem, several methods exploited syntactic information, where several heuristic patterns based on syntactic parsing were designed (Popescu and Etzioni, 2005; Qiu et al., 2009; Qiu et al., 2011).
Introduction
Without using syntactic parsing, the noises from parsing errors can be effectively avoided.
Introduction
As mentioned in (Liu et al., 2013), using PSWAM can not only inherit the advantages of WAM: effectively avoiding noises from syntactic parsing errors when dealing with informal texts, but also can improve the mining performance by using partial supervision.
syntactic parsing is mentioned in 3 sentences in this paper.
Poon, Hoifung
Experiments
Upon manual inspection, many of the remaining errors are due to syntactic parsing errors that are too severe to fix.
Experiments
This is partly due to the fact that ATIS sentences are out of domain compared to the newswired text on which the syntactic parsers were trained.
Grounded Unsupervised Semantic Parsing
However, in complex sentences, syntax and semantic often diverge, either due to their differing goals or simply stemming from syntactic parsing errors.
syntactic parsing is mentioned in 3 sentences in this paper.
Biran, Or and McKeown, Kathleen
Abstract
In addition, we present results for a full system using additional features which achieves close to state of the art performance without resorting to gold syntactic parses or to context outside the relation.
Introduction
In addition, we present a system which combines these word pairs with additional features to achieve near state of the art performance without the use of syntactic parse features and of context outside the arguments of the relation.
Related Work
Reliable syntactic parses are not always available in domains other than newswire, and context (preceding relations, especially explicit relations) is not always available in some applications such as generation and question answering.
syntactic parsing is mentioned in 3 sentences in this paper.
Bhat, Suma and Xue, Huichao and Yoon, Su-Youn
Conclusions
Seeking alternatives to measuring syntactic complexity of spoken responses via syntactic parsers, we study a shallow-analysis based approach for use in automatic scoring.
Related Work
Not surprisingly, Chen and Zechner (2011) studied measures of grammatical complexity via syntactic parsing and found that a Pearson’s correlation coefficient of 0.49 between syntactic complexity measures (derived from manual transcriptions) and proficiency scores was drastically reduced to near nonexistence when the measures were applied to ASR word hypotheses.
Shallow-analysis approach to measuring syntactic complexity
The measures of syntactic complexity in this approach are POS bigrams and are not obtained by a deep analysis (syntactic parsing) of the structure of the sentence.
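As an illustrative sketch (not the authors' implementation), POS-bigram features of this shallow kind can be counted in a few lines of Python once any POS tagger has labeled the response:

```python
from collections import Counter

def pos_bigram_features(pos_tags):
    """Count adjacent POS-tag pairs -- a shallow proxy for syntactic
    complexity that requires a tagger but no syntactic parser."""
    return Counter(zip(pos_tags, pos_tags[1:]))

# Hypothetical tag sequence for "The quick fox jumps"
feats = pos_bigram_features(["DT", "JJ", "NN", "VBZ"])
```

Because no parse is built, the features are robust to the ASR noise that degrades parser-based complexity measures.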
syntactic parsing is mentioned in 3 sentences in this paper.
Gormley, Matthew R. and Mitchell, Margaret and Van Durme, Benjamin and Dredze, Mark
Approaches
In Section 3.1, we introduced pipeline-trained models for SRL, which used grammar induction to predict unlabeled syntactic parses.
Related Work
In this simple pipeline, the first stage syntactically parses the corpus, and the second stage predicts semantic predicate-argument structure for each sentence using the labels of the first stage as features.
Related Work
In our low-resource pipelines, we assume that the syntactic parser is given no labeled parses—however, it may optionally utilize the semantic parses as distant supervision.
syntactic parsing is mentioned in 3 sentences in this paper.
Schwartz, Lane and Callison-Burch, Chris and Schuler, William and Wu, Stephen
Abstract
This paper describes a novel technique for incorporating syntactic knowledge into phrase-based machine translation through incremental syntactic parsing.
Introduction
Syntactic parsing may help produce more grammatical output by better modeling structural relationships and long-distance dependencies.
Introduction
We directly integrate incremental syntactic parsing into phrase-based translation.
syntactic parsing is mentioned in 3 sentences in this paper.
Riesa, Jason and Marcu, Daniel
Word Alignment as a Hypergraph
In this work, we parse our English data, and for each sentence E = e’f, let T be its syntactic parse.
Word Alignment as a Hypergraph
(1) that using the structure of l-best English syntactic parse trees is a reasonable way to frame and drive our search, and (2) that F-measure approximately decomposes over hyperedges.
Word Alignment as a Hypergraph
Initial alignments We can construct a word alignment hierarchically, bottom-up, by making use of the structure inherent in syntactic parse trees.
syntactic parsing is mentioned in 3 sentences in this paper.
Ma, Ji and Zhang, Yue and Zhu, Jingbo
Abstract
Experiments on the SANCL 2012 shared task show that our approach achieves 93.15% average tagging accuracy, which is the best accuracy reported so far on this data set, higher than those given by ensembled syntactic parsers.
Conclusion
For future work, we would like to investigate the two-phase approach to more challenging tasks, such as web domain syntactic parsing.
Introduction
set, higher than those given by ensembled syntactic parsers.
syntactic parsing is mentioned in 3 sentences in this paper.
Tsuruoka, Yoshimasa and Tsujii, Jun'ichi and Ananiadou, Sophia
Introduction
The applications range from simple classification tasks such as text classification and history-based tagging (Ratnaparkhi, 1996) to more complex structured prediction tasks such as part-of-speech (POS) tagging (Lafferty et al., 2001), syntactic parsing (Clark and Curran, 2004) and semantic role labeling (Toutanova et al., 2005).
Introduction
SGD was recently used for NLP tasks including machine translation (Tillmann and Zhang, 2006) and syntactic parsing (Smith and Eisner, 2008; Finkel et al., 2008).
Log-Linear Models
The model can be used for tasks like syntactic parsing (Finkel et al., 2008) and semantic role labeling (Cohn and Blunsom, 2005).
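A minimal, hypothetical illustration of the SGD training these excerpts refer to, using the simplest log-linear model (binary logistic regression) on a single dense toy example; real NLP systems use sparse high-dimensional features, regularization, and learning-rate schedules:

```python
import math

def sgd_step(w, x, y, lr=0.1):
    """One SGD update for binary logistic regression:
    w <- w + lr * (y - p) * x, where p = sigmoid(w . x)."""
    p = 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
    return [wi + lr * (y - p) * xi for wi, xi in zip(w, x)]

# Repeatedly updating on one positive example drives its probability up.
w = [0.0, 0.0]
for _ in range(200):
    w = sgd_step(w, [1.0, 2.0], 1)
```

Each update touches only one training example, which is what makes SGD attractive for the large structured-prediction models (parsing, SRL, MT) cited above.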
syntactic parsing is mentioned in 3 sentences in this paper.