Index of papers in Proc. ACL 2008 that mention
  • dependency tree
Gómez-Rodríguez, Carlos and Carroll, John and Weir, David
Dependency parsing schemata
In order to make the formalism general enough to include these parsers, we define items in terms of sets of partial dependency trees as shown in Figure 1.
Dependency parsing schemata
Such spans cannot be represented by a single dependency tree.
Dependency parsing schemata
Therefore, our formalism allows items to be sets of forests of partial dependency trees, instead of sets of trees.
dependency tree is mentioned in 18 sentences in this paper.
Shen, Libin and Xu, Jinxi and Weischedel, Ralph
Introduction
Figure 1: The dependency tree for the sentence "the boy will find it interesting"
Introduction
1.2 Dependency Trees
Introduction
Dependency trees reveal long-distance relations between words.
String-to-Dependency Translation
In one kind, we keep dependency trees with a sub-root, where all the children of the sub-root are complete.
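The "sub-root" condition quoted here can be checked mechanically. Below is a minimal, hypothetical sketch (the function name and the head-array encoding are ours, not the paper's) of one reading of that condition: a span has a single sub-root, and its children inside the span carry their complete subtrees.

```python
# Hypothetical sketch: test whether tokens lo..hi form a structure with a
# single sub-root whose children are all complete, in the sense quoted
# above. heads[j] is the index of token j's head, with -1 marking the root.

def is_fixed_structure(heads, lo, hi):
    inside = set(range(lo, hi + 1))
    # The sub-root is the one token whose head lies outside the span.
    sub_roots = [i for i in inside if heads[i] not in inside]
    if len(sub_roots) != 1:
        return False
    sub_root = sub_roots[0]
    # Children of the sub-root are complete iff no token outside the span
    # attaches to an inside token other than the sub-root itself.
    return all(heads[o] not in inside - {sub_root}
               for o in range(len(heads)) if o not in inside)

# "the boy will find it": the/0 -> boy/1 -> find/3, will/2 -> find/3, it/4 -> find/3
heads = [1, 3, 3, -1, 3]
print(is_fixed_structure(heads, 0, 1))  # True: sub-root "boy" with complete child "the"
print(is_fixed_structure(heads, 1, 2))  # False: two sub-roots ("boy", "will")
```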
String-to-Dependency Translation
Figure 5: A dependency tree with flexible combination
String-to-Dependency Translation
Figure 1 shows a traditional dependency tree.
dependency tree is mentioned in 18 sentences in this paper.
Wang, Qin Iris and Schuurmans, Dale and Lin, Dekang
Dependency Parsing Model
Given a sentence X = (x1, ..., xn) (xi denotes each word in the sentence), we are interested in computing a directed dependency tree, Y, over X.
Dependency Parsing Model
We assume that a directed dependency tree Y consists of ordered pairs (xi → xj) of words in X such that each word appears in at least one pair and each word has in-degree at most one.
Dependency Parsing Model
Dependency trees are assumed to be projective here, which means that if there is an arc (xi → xj), then xi is an ancestor of all the words between xi and xj.
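The two constraints quoted here (in-degree at most one; projectivity) can be made concrete in a few lines. This is a minimal sketch under our own encoding assumptions, not the authors' implementation: heads[j] gives the head index of word j, with -1 for the root, so in-degree ≤ 1 holds by construction and only projectivity needs an explicit check.

```python
def is_projective(heads):
    """Check projectivity: for every arc (h -> d), the head h must be an
    ancestor of every word strictly between h and d."""
    n = len(heads)

    def is_ancestor(a, b):
        # Follow head pointers upward from b until we reach a or the root.
        while b != -1:
            if b == a:
                return True
            b = heads[b]
        return False

    for d in range(n):
        h = heads[d]
        if h == -1:
            continue
        lo, hi = min(h, d), max(h, d)
        if not all(is_ancestor(h, k) for k in range(lo + 1, hi)):
            return False
    return True

# "the boy will find it" with 'find' (index 3) as root.
print(is_projective([1, 3, 3, -1, 3]))  # True
```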
Experimental Results
For experiment on English, we used the English Penn Treebank (PTB) (Marcus et al., 1993) and the constituency structures were converted to dependency trees using the same rules as (Yamada and Matsumoto, 2003).
Introduction
Figure 1: A dependency tree
Semi-supervised Convex Training for Structured SVM
As mentioned in Section 3, a dependency tree Yj is represented as an adjacency matrix.
Semi-supervised Convex Training for Structured SVM
Thus we need to enforce some constraints in the adjacency matrix to make sure that each Yj satisfies the dependency tree constraints.
Supervised Structured Large Margin Training
We assume all the dependency trees are projective in our work (just as some other researchers do), although in the real world, most languages are non-projective.
Supervised Structured Large Margin Training
We represent a dependency tree as a k x k adjacency matrix.
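The adjacency-matrix encoding quoted here is easy to make concrete. A hedged sketch follows, with our own helper names: Y[i, j] = 1 iff there is an arc from head xi to modifier xj. The check below covers the in-degree and single-root conditions; acyclicity would need an extra test, which we omit.

```python
import numpy as np

def heads_to_matrix(heads):
    """Build the k x k adjacency matrix from a head array (-1 = root)."""
    k = len(heads)
    Y = np.zeros((k, k), dtype=int)
    for j, h in enumerate(heads):
        if h != -1:
            Y[h, j] = 1
    return Y

def satisfies_tree_constraints(Y):
    indeg = Y.sum(axis=0)
    # Exactly one root (in-degree 0); every other word has in-degree 1.
    # NOTE: this does not rule out cycles; a full check needs more.
    return np.count_nonzero(indeg == 0) == 1 and indeg.max() <= 1

Y = heads_to_matrix([1, 3, 3, -1, 3])
print(satisfies_tree_constraints(Y))  # True
```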
dependency tree is mentioned in 9 sentences in this paper.
Cherry, Colin
Abstract
We add syntax to this process with a cohesion constraint based on a dependency tree for the source sentence.
Cohesive Decoding
The decoder stores the flat sentence in the original sentence data structure, and the head-encoded dependency tree in an attached tree data structure.
Cohesive Phrasal Output
Next, we introduce our source dependency tree T. Each source token ei is also a node in T. We define T(ei) to be the subtree of T rooted at ei.
Cohesive Phrasal Output
span(e_i, T, a) = [ min_{j | e_j ∈ T(e_i)} a_j , max_{k | e_k ∈ T(e_i)} a_k ]   (1)
Consider the simple phrasal translation shown in Figure 1 along with a dependency tree for the English source.
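Reconstructed from the formula above, here is a small sketch of the span computation. It assumes T is encoded as a head array and that a[j] gives a single target position aligned to source token j; Cherry's decoder works over partial hypotheses, so this is an illustration, not the decoder's code.

```python
def subtree(heads, i):
    """All token indices in the subtree rooted at i, including i itself."""
    nodes = {i}
    changed = True
    while changed:
        changed = False
        for j, h in enumerate(heads):
            if h in nodes and j not in nodes:
                nodes.add(j)
                changed = True
    return nodes

def span(heads, a, i):
    """min/max target positions covered by the source subtree T(e_i)."""
    positions = [a[j] for j in subtree(heads, i)]
    return min(positions), max(positions)
```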
Conclusion
This algorithm was used to implement a soft cohesion constraint for the Moses decoder, based on a source-side dependency tree.
Experiments
Since we require source dependency trees, all experiments test English to French translation.
Experiments
English dependency trees are provided by Minipar (Lin, 1994).
dependency tree is mentioned in 7 sentences in this paper.
Miyao, Yusuke and Saetre, Rune and Sagae, Kenji and Matsuzaki, Takuya and Tsujii, Jun'ichi
Evaluation Methodology
For the protein pair IL-8 and CXCR1 in Figure 4, a dependency parser outputs a dependency tree shown in Figure 1.
Evaluation Methodology
From this dependency tree, we can extract a dependency path shown in Figure 5, which appears to be a strong clue in knowing that these proteins are mentioned as interacting.
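The path extraction described here amounts to routing between two nodes through their lowest common ancestor. A rough sketch with hypothetical names, again encoding the tree as a head array with -1 marking the root:

```python
def path_to_root(heads, i):
    """Token indices from i up to the root, following head pointers."""
    path = [i]
    while heads[path[-1]] != -1:
        path.append(heads[path[-1]])
    return path

def dependency_path(heads, i, j):
    """Token indices on the tree path from i to j, via their LCA."""
    up_i = path_to_root(heads, i)
    up_j = path_to_root(heads, j)
    shared = set(up_i) & set(up_j)
    lca = next(n for n in up_i if n in shared)  # lowest common ancestor
    down = up_j[:up_j.index(lca)]               # j's side, below the LCA
    return up_i[:up_i.index(lca) + 1] + list(reversed(down))

# "IL-8 recognizes ... CXCR1": IL-8/0 -> recognizes/1 <- CXCR1/2
print(dependency_path([1, -1, 1], 0, 2))  # [0, 1, 2]
```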
Evaluation Methodology
CoNLL The dependency tree format used in the 2006 and 2007 CoNLL shared tasks on dependency parsing.
Syntactic Parsers and Their Representations
Figure 1 shows a dependency tree for the sentence "IL-8 recognizes and activates CXCR1." An advantage of dependency parsing is that dependency trees are a reasonable approximation of the semantics of sentences, and are readily usable in NLP applications.
Syntactic Parsers and Their Representations
Figure 1: CoNLL-X dependency tree
dependency tree is mentioned in 6 sentences in this paper.
Mírovský, Jiří
Introduction
The analytical layer roughly corresponds to the surface syntax of the sentence; the annotation is a single-rooted dependency tree with labeled nodes.
Introduction
Again, the annotation is a dependency tree with labeled nodes (Hajičová, 1998).
Phenomena and Requirements
The representation of the tectogrammatical annotation of a sentence is a rooted dependency tree .
dependency tree is mentioned in 3 sentences in this paper.