Index of papers in Proc. ACL 2010 that mention
  • dependency parsing
Jiang, Wenbin and Liu, Qun
Abstract
In this paper we describe an intuitionistic method for dependency parsing, where a classifier is used to determine whether a pair of words forms a dependency edge.
Abstract
Experiments show that the classifier trained on the projected classification instances significantly outperforms previous projected dependency parsers.
Abstract
More importantly, when this classifier is integrated into a maximum spanning tree (MST) dependency parser, a clear improvement is obtained over the MST baseline.
Introduction
Supervised dependency parsing has achieved state-of-the-art results in recent years (McDonald et al., 2005a; McDonald and Pereira, 2006; Nivre et al., 2006).
Introduction
Examples include unsupervised dependency parsing (Klein and Manning, 2004), which relies entirely on unannotated data, and semi-supervised dependency parsing (Koo et al., 2008), which draws on both annotated and unannotated data.
Introduction
Meanwhile, we propose an intuitionistic model for dependency parsing, which uses a classifier to determine whether a pair of words forms a dependency edge.
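The word-pair classification idea can be sketched as follows; the features and the toy scorer here are illustrative placeholders, not the authors' actual model:

```python
# Sketch of word-pair classification for dependency edges. The feature
# set and the classifier below are hypothetical stand-ins, not the
# features or learner used in the paper.

def pair_features(sent, head, dep):
    """Simple features for a candidate (head, dependent) word pair."""
    return {
        "head_word": sent[head],
        "dep_word": sent[dep],
        "pair": sent[head] + "_" + sent[dep],
        "distance": abs(head - dep),
        "direction": "left" if head > dep else "right",
    }

def score_pairs(sent, classify):
    """Score every ordered word pair with a user-supplied classifier."""
    n = len(sent)
    scores = {}
    for h in range(n):
        for d in range(n):
            if h != d:
                scores[(h, d)] = classify(pair_features(sent, h, d))
    return scores

# A toy "classifier": prefer short, rightward attachments.
toy = lambda f: 1.0 / f["distance"] + (0.1 if f["direction"] == "right" else 0.0)
scores = score_pairs(["John", "saw", "Mary"], toy)
```

The per-pair scores can then feed an edge-factored decoder such as MST parsing, which is how the paper combines the classifier with the MST baseline.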
dependency parsing is mentioned in 24 sentences in this paper.
Sassano, Manabu and Kurohashi, Sadao
Abstract
We investigate active learning methods for Japanese dependency parsing.
Abstract
Experimental results show that our proposed methods considerably improve the learning curve of Japanese dependency parsing.
Active Learning for Japanese Dependency Parsing
4We did not employ query-by-committee (QBC) (Seung et al., 1992), which is another important general framework of active learning, since the selection strategy with large margin classifiers (Section 2.2) is much simpler and seems more practical for active learning in Japanese dependency parsing with smaller constituents.
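The margin-based selection strategy mentioned here can be sketched as uncertainty sampling: query the examples closest to the decision boundary. This is a generic sketch assuming signed real-valued classifier scores, not the paper's exact procedure:

```python
def select_by_margin(pool, score, k):
    """Pick the k pool items whose |score| (distance from the decision
    boundary) is smallest -- the items the classifier is least sure about."""
    return sorted(pool, key=lambda x: abs(score(x)))[:k]

# Toy usage: the values stand in for signed margins of candidate
# dependency decisions awaiting annotation.
margins = {"a": 2.0, "b": -0.1, "c": 0.5, "d": -1.5}
queried = select_by_margin(list(margins), lambda x: margins[x], 2)
# queried == ["b", "c"]  (smallest absolute margins first)
```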
Experimental Evaluation and Discussion
We set the degree of the kernels to 3 since cubic kernels with SVM have proved effective for Japanese dependency parsing (Kudo and Matsumoto, 2000; Kudo and Matsumoto, 2002).
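The "cubic kernel" referred to is the degree-3 polynomial kernel; a minimal sketch (the `coef0` offset is an assumed parameter, set here to the common default of 1):

```python
def poly_kernel(x, y, degree=3, coef0=1.0):
    """Polynomial kernel (x . y + coef0)^degree; degree=3 gives the
    'cubic kernel' commonly used with SVMs for Japanese dependency parsing."""
    return (sum(a * b for a, b in zip(x, y)) + coef0) ** degree

k = poly_kernel([1.0, 0.0, 1.0], [1.0, 1.0, 0.0])  # (1 + 1)^3 = 8
```

Raising the dot product to the third power implicitly conjoins up to three features, which is why such kernels capture feature combinations useful for attachment decisions.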
Experimental Evaluation and Discussion
There are features that have been commonly used for Japanese dependency parsing among related papers, e.g., (Kudo and Matsumoto, 2002; Sassano, 2004; Iwatate et al., 2008).
Experimental Evaluation and Discussion
It is observed that active learning with large margin classifiers also works well for Sassano's algorithm for Japanese dependency parsing.
Introduction
We use Japanese dependency parsing as the target task in this study because a simple and efficient parsing algorithm has been proposed for it and, to our knowledge, active learning for Japanese dependency parsing has never been studied.
Introduction
In Section 5 we describe our proposed methods for active learning in Japanese dependency parsing, along with other methods.
Japanese Parsing
3.3 Algorithm of Japanese Dependency Parsing
Japanese Parsing
We use Sassano’s algorithm (Sassano, 2004) for Japanese dependency parsing.
Japanese Parsing
Figure 3: Algorithm of Japanese dependency parsing
dependency parsing is mentioned in 14 sentences in this paper.
Koo, Terry and Collins, Michael
Abstract
We present algorithms for higher-order dependency parsing that are “third-order” in the sense that they can evaluate substructures containing three dependencies, and “efficient” in the sense that they require only O(n⁴) time.
Conclusion
A second area for future work lies in applications of dependency parsing .
Dependency parsing
For a sentence x, we define dependency parsing as a search for the highest-scoring analysis of x:
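The equation that followed this sentence did not survive extraction; the standard graph-based formulation, consistent with the part-factored scoring described under “Parsing experiments” (notation ours), is:

```latex
y^{*}(x) \;=\; \operatorname*{arg\,max}_{y \in \mathcal{Y}(x)} \mathrm{Score}(x, y)
       \;=\; \operatorname*{arg\,max}_{y \in \mathcal{Y}(x)} \sum_{p \in y} \mathrm{ScorePart}(x, p)
```

where 𝒴(x) is the set of well-formed dependency trees for x and the score of a tree decomposes into scores of small parts p (dependencies, siblings, grandchildren, and so on).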
Existing parsing algorithms
Our new third-order dependency parsers build on ideas from existing parsing algorithms.
Introduction
Consequently, recent work in dependency parsing has been restricted to applications of second-order parsers, the most powerful of which (Carreras, 2007) requires O(n⁴) time and O(n³) space, while being limited to second-order parts.
New third-order parsing algorithms
In this section we describe our new third-order dependency parsing algorithms.
New third-order parsing algorithms
As a final note, the parsing algorithms described in this section fall into the category of projective dependency parsers, which forbid crossing dependencies.
Parsing experiments
Following standard practice for higher-order dependency parsing (McDonald and Pereira, 2006; Carreras, 2007), Models 1 and 2 evaluate not only the relevant third-order parts, but also the lower-order parts that are implicit in their third-order factorizations.
Parsing experiments
For example, Model 1 defines feature mappings for dependencies, siblings, grandchildren, and grand-siblings, so that the score of a dependency parse is given by:
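That factored score can be sketched generically; the part types below (dependencies and grandchildren only) and the toy weights are simplifications of Model 1's full factorization:

```python
# Generic part-factored scoring: a parse's score is the sum of the
# scores of its parts. Only dependency and grandchild parts are
# extracted here, as a simplified stand-in for Model 1.

def dependencies(heads):
    """(head, dependent) arcs; heads[d-1] is the head of word d (0 = root)."""
    return [(h, d) for d, h in enumerate(heads, start=1)]

def grandchildren(heads):
    """(grandparent, head, dependent) chains implied by the arcs."""
    deps = dependencies(heads)
    return [(g, h, d) for (g, h) in deps for (h2, d) in deps if h2 == h]

def parse_score(heads, w):
    """Sum per-part weights w (a dict) over all parts of the parse."""
    parts = dependencies(heads) + grandchildren(heads)
    return sum(w.get(p, 0.0) for p in parts)

# Toy weights for a 3-word sentence parsed as the chain 0 -> 1 -> 2 -> 3.
w = {(0, 1): 1.0, (1, 2): 0.5, (2, 3): 0.5, (0, 1, 2): 0.2, (1, 2, 3): 0.2}
score = parse_score([0, 1, 2], w)
```

In the actual parsers, the decoder maximizes this sum over all trees with dynamic programming rather than scoring a single fixed head vector.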
Related work
Eisner (2000) defines dependency parsing models where each word has a set of possible “senses” and the parser recovers the best joint assignment of syntax and senses.
dependency parsing is mentioned in 10 sentences in this paper.
Chen, Wenliang and Kazama, Jun'ichi and Torisawa, Kentaro
Abstract
This paper proposes a dependency parsing method that uses bilingual constraints to improve the accuracy of parsing bilingual texts (bitexts).
Dependency parsing
For dependency parsing, there are two main types of parsing models (Nivre and McDonald, 2008; Nivre and Kübler, 2006): transition-based (Nivre, 2003; Yamada and Matsumoto, 2003) and graph-based (McDonald et al., 2005; Carreras, 2007).
Dependency parsing
Figure 3 shows an example of dependency parsing.
Experiments
Table 2: Dependency parsing results of Chinese-source case
Experiments
Table 3: Dependency parsing results of English-source case
Introduction
This paper proposes a dependency parsing method, which uses the bilingual constraints that we call bilingual subtree constraints and statistics concerning the constraints estimated from large unlabeled monolingual corpora.
Introduction
The result is used as additional features for the source-side dependency parser.
Introduction
Section 3 introduces the background of dependency parsing.
dependency parsing is mentioned in 8 sentences in this paper.
Gómez-Rodríguez, Carlos and Nivre, Joakim
Abstract
Finding a class of structures that is rich enough for adequate linguistic representation yet restricted enough for efficient computational processing is an important problem for dependency parsing.
Introduction
One of the unresolved issues in this area is the proper treatment of non-projective dependency trees, which seem to be required for an adequate representation of predicate-argument structure, but which undermine the efficiency of dependency parsing (Neuhaus and Bröker, 1997; Buch-Kromann, 2006; McDonald and Satta, 2007).
Introduction
This was originally proposed by Yli-Jyrä (2003) but has so far played a marginal role in the dependency parsing literature, because no algorithm was known for determining whether an arbitrary tree was m-planar, and no parsing algorithm existed for any constant value of m. The contribution of this paper is twofold.
Parsing 1-Planar Structures
In the transition-based framework of Nivre (2008), a deterministic dependency parser is defined by a nondeterministic transition system, specifying a set of elementary operations that can be executed during the parsing process, and an oracle that deterministically selects a single transition at each choice point of the parsing process.
Parsing 1-Planar Structures
A transition system for dependency parsing is a quadruple S = (C, T, cs, Ct), where C is a set of configurations, T is a set of transitions (partial functions from configurations to configurations), cs is an initialization function mapping each sentence to an initial configuration, and Ct ⊆ C is a set of terminal configurations.
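A minimal concrete instance of such a system is the well-known arc-standard parser, sketched below; this is a generic illustration of the (C, T, cs, Ct) scheme, not the 1-planar system of this paper:

```python
# Arc-standard transition system sketch. A configuration is a triple
# (stack, buffer, arcs); word 0 is the artificial root.

def initial(n):
    """cs: initial configuration for an n-word sentence."""
    return ([0], list(range(1, n + 1)), set())

def shift(c):
    stack, buf, arcs = c
    return (stack + [buf[0]], buf[1:], arcs)

def left_arc(c):
    stack, buf, arcs = c          # attach stack[-2] under stack[-1]
    return (stack[:-2] + [stack[-1]], buf, arcs | {(stack[-1], stack[-2])})

def right_arc(c):
    stack, buf, arcs = c          # attach stack[-1] under stack[-2]
    return (stack[:-1], buf, arcs | {(stack[-2], stack[-1])})

def terminal(c):
    """Ct: only the root remains on the stack and the buffer is empty."""
    stack, buf, _ = c
    return stack == [0] and not buf

# Parse a 2-word sentence as the right-branching chain 0 -> 1 -> 2.
c = initial(2)
for t in (shift, shift, right_arc, right_arc):
    c = t(c)
arcs = c[2]
```

An oracle (or a trained classifier) would pick one transition from T at each step; the fixed transition sequence above plays that role for the toy example.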
Preliminaries
For reasons of computational efficiency, many dependency parsers are restricted to work with projective dependency structures, that is, forests in which the projection of each node corresponds to a contiguous substring of the input:
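The contiguity condition can be checked directly; a minimal sketch assuming the tree is given as a 1-based head vector with 0 as the artificial root:

```python
# Projectivity check: every node's projection (itself plus all of its
# descendants) must cover a contiguous span of the sentence.

def projection(heads, node):
    """All words whose head chain leads to `node`, plus `node` itself."""
    out = {node}
    changed = True
    while changed:
        changed = False
        for d, h in enumerate(heads, start=1):
            if h in out and d not in out:
                out.add(d)
                changed = True
    return out

def is_projective(heads):
    for node in range(1, len(heads) + 1):
        span = projection(heads, node)
        if max(span) - min(span) + 1 != len(span):   # gap => not contiguous
            return False
    return True

# heads = [2, 0, 2]: word 2 governs words 1 and 3 -- projective.
# heads = [3, 0, 2]: the arc 3 -> 1 skips word 2, whose head lies
# outside the span -- non-projective.
```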
Preliminaries
The concept of planarity on its own does not seem to be very relevant as an extension of projectivity for practical dependency parsing.
dependency parsing is mentioned in 7 sentences in this paper.
Park, Keun Chan and Jeong, Yoonjae and Myaeng, Sung Hyon
Experience Detection
The dependency parser is used to ensure a modal marker is indeed associated with the main predicate.
Experience Detection
In order to make a distinction, we use the dependency parser and a named-entity recognizer (Finkel et al., 2005) that can recognize person pronouns and person names.
Experience Detection
We used the dependency parser for extracting objective cases using the direct object relation.
Lexicon Construction
For a POS and grammatical check of a candidate sentence, we used the Stanford POS tagger (Toutanova et al., 2003) and Stanford dependency parser (Klein and Manning, 2003).
dependency parsing is mentioned in 4 sentences in this paper.
Qazvinian, Vahed and Radev, Dragomir R.
Conclusion
Our experiments on generating surveys for Question Answering and Dependency Parsing show how surveys generated using such context information along with citation sentences have higher quality than those built using citations alone.
Impact on Survey Generation
that contains two sets of cited papers and corresponding citing sentences, one on Question Answering (QA) with 10 papers and the other on Dependency Parsing (DP) with 16 papers.
Introduction
Buchholz and Marsi, “CoNLL-X Shared Task on Multilingual Dependency Parsing”, CoNLL 2006
dependency parsing is mentioned in 3 sentences in this paper.