Index of papers in Proc. ACL 2012 that mention
  • graph-based
Chen, Wenliang and Zhang, Min and Li, Haizhou
Abstract
Most previous graph-based parsing models increase decoding complexity when they use high-order features due to exact-inference decoding.
Abstract
In this paper, we present an approach to enriching high-order feature representations for graph-based dependency parsing models using a dependency language model and beam search.
Experiments
Table 7 shows the performance of the graph-based systems that were compared, where McDonald06 refers to the second-order parser of McDonald
Implementation Details
We implement our parsers based on the MSTParser, a freely available implementation of the graph-based model proposed by McDonald and Pereira (2006).
Introduction
Among them, graph-based dependency parsing models have achieved state-of-the-art performance for a wide range of languages as shown in recent CoNLL shared tasks
Introduction
In the graph-based models, dependency parsing is treated as a structured prediction problem in which the graphs are usually represented as factored structures.
Introduction
How to enrich high-order feature representations without increasing the decoding complexity for graph-based models becomes a very challenging problem in the dependency parsing task.
Parsing with dependency language model
3.1 Graph-based parsing model
Parsing with dependency language model
The graph-based parsing model aims to search for the maximum spanning tree (MST) in a graph (McDonald et al., 2005).
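The MST formulation above can be sketched in code. This is a minimal, illustrative example with a toy arc-score matrix (the sentence and scores are invented): McDonald et al. (2005) use the Chu-Liu/Edmonds algorithm, and the sketch below shows only its greedy first step, picking the best-scoring head for each word and checking whether the result is already a tree; full Chu-Liu/Edmonds would contract any detected cycle and recurse.

```python
def greedy_heads(scores, root=0):
    """scores[h][d]: score of the arc from head h to dependent d."""
    n = len(scores)
    heads = {}
    for d in range(n):
        if d == root:
            continue
        # pick the highest-scoring head for each non-root word
        heads[d] = max((h for h in range(n) if h != d),
                       key=lambda h: scores[h][d])
    return heads

def find_cycle(heads):
    """Return a set of nodes forming a cycle in the head map, or None."""
    for start in heads:
        seen, node = set(), start
        while node in heads and node not in seen:
            seen.add(node)
            node = heads[node]
        if node == start:
            return seen
    return None

# Toy sentence: 0 = ROOT, 1 = "John", 2 = "saw", 3 = "Mary"
scores = [
    [0, 5, 20, 3],   # arcs out of ROOT
    [0, 0, 8, 2],    # arcs out of "John"
    [0, 15, 0, 12],  # arcs out of "saw"
    [0, 1, 6, 0],    # arcs out of "Mary"
]
heads = greedy_heads(scores)
print(heads)              # {1: 2, 2: 0, 3: 2} -- "saw" heads both arguments
print(find_cycle(heads))  # None -> the greedy choice is already a valid tree
```

When the greedy assignment does contain a cycle, the contraction step is what guarantees a spanning tree; with exact inference of this kind, richer (higher-order) features are what drive up decoding complexity, which is the problem the paper addresses.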
graph-based is mentioned in 13 sentences in this paper.
Kim, Seokhwan and Lee, Gary Geunbae
Abstract
This paper proposes a novel graph-based projection approach and demonstrates the merits of it by using a Korean relation extraction system based on projected dataset from an English-Korean parallel corpus.
Cross-lingual Annotation Projection for Relation Extraction
To solve both of these problems at once, we propose a graph-based projection approach for relation extraction.
Graph Construction
The most crucial factor in the success of graph-based learning approaches is how to construct a graph that is appropriate for the target task.
Graph Construction
Das and Petrov (2011) proposed a graph-based bilingual projection of part-of-speech tagging by considering the tagged words in the source language as labeled examples and connecting them to the unlabeled words in the target language, while referring to the word alignments.
Graph Construction
The graph for our graph-based projection is constructed by connecting related vertex pairs by weighted edges.
Implementation
To demonstrate the effectiveness of the graph-based projection approach for relation extraction, we developed a Korean relation extraction system that was trained with projected annotations from English resources.
Implementation
Table 1: Comparison between direct and graph-based projection approaches to extract semantic relationships for four relation types
Implementation
The graph-based projection was performed by the Junto toolkit, with a maximum of 10 iterations for each execution.
Introduction
In this paper, we propose a graph-based projection approach for weakly supervised relation extraction.
Introduction
The goal of our graph-based approach is to improve the robustness of the extractor with respect to errors that are generated and accumulated by preprocessors.
Label Propagation
To induce labels for all of the unlabeled vertices on the graph constructed in Section 3, we utilize the label propagation algorithm (Zhu and Ghahramani, 2002), which is a graph-based semi-supervised learning algorithm.
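The label propagation algorithm referenced here (Zhu and Ghahramani, 2002) can be sketched compactly. This is an illustrative toy example, not the paper's implementation (the paper uses the Junto toolkit): four vertices on a chain, two of them labeled, with label scores repeatedly diffused along row-normalized edge weights while the labeled vertices stay clamped.

```python
import numpy as np

# W: symmetric edge weights among 4 vertices on a chain 0-1-2-3;
# vertex 0 is labeled class 0, vertex 3 is labeled class 1,
# vertices 1 and 2 are unlabeled.
W = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)
labels = {0: 0, 3: 1}
n_classes = 2

P = W / W.sum(axis=1, keepdims=True)   # row-normalized transition matrix
F = np.zeros((len(W), n_classes))      # label-score matrix
for v, c in labels.items():
    F[v, c] = 1.0

for _ in range(50):                    # iterate until (near) convergence
    F = P @ F                          # propagate scores along edges
    for v, c in labels.items():        # clamp the labeled vertices
        F[v] = 0.0
        F[v, c] = 1.0

pred = F.argmax(axis=1)
print(pred)   # vertex 1 inherits class 0, vertex 2 inherits class 1
```

The clamping step is what makes the labeled vertices act as fixed sources; unlabeled vertices end up with the label of whichever source they are more strongly connected to, which is exactly the behavior the projection approach relies on.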
graph-based is mentioned in 16 sentences in this paper.
Kolomiyets, Oleksandr and Bethard, Steven and Moens, Marie-Francine
Abstract
We compare two parsing models for temporal dependency structures, and show that a deterministic non-projective dependency parser outperforms a graph-based maximum spanning tree parser, achieving labeled attachment accuracy of 0.647 and labeled tree edit distance of 0.596.
Discussion and Conclusions
Comparing the two dependency parsing models, we have found that a shift-reduce parser, which more closely mirrors the incremental processing of our human annotators, outperforms a graph-based maximum spanning tree parser.
Evaluations
Table 2: Features for the shift-reduce parser (SRP) and the graph-based maximum spanning tree (MST) parser.
Evaluations
The Shift-Reduce parser (SRP; Section 4.1) and the graph-based maximum spanning tree parser (MST; Section 4.2) are compared to these baselines.
Evaluations
It has been argued that graph-based models like the maximum spanning tree parser should be able to produce more globally consistent and correct dependency trees, yet we do not observe that here.
Feature Design
The shift-reduce parser (SRP) trains a machine learning classifier as the oracle o ∈ (C → T) to predict a transition t from a parser configuration c = (L1, L2, Q, E), using node features such as the heads of L1, L2 and Q, and edge features from the already predicted temporal relations in E. The graph-based maximum spanning tree (MST) parser trains a machine learning model to predict SCORE(e) for an edge e = (wj, rj, wk), using features of the nodes wj and wk.
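The transition-based side of this comparison can be sketched as follows. This is a simplified arc-standard configuration (stack, queue, arcs) with an invented example sentence, not the paper's richer temporal configuration (L1, L2, Q, E), but the control flow is the same: an oracle classifier maps each configuration to the next transition.

```python
def apply(transition, stack, queue, arcs):
    """Apply one arc-standard transition to the configuration."""
    if transition == "SHIFT":
        stack.append(queue.pop(0))
    elif transition == "LEFT-ARC":      # top of stack heads second-from-top
        dep = stack.pop(-2)
        arcs.append((stack[-1], dep))
    elif transition == "RIGHT-ARC":     # second-from-top heads the top
        dep = stack.pop()
        arcs.append((stack[-1], dep))
    return stack, queue, arcs

stack, queue, arcs = [], ["ROOT", "John", "saw", "Mary"], []
# In a trained parser these transitions come from the oracle classifier;
# here they are hard-coded for the toy sentence.
for t in ["SHIFT", "SHIFT", "SHIFT", "LEFT-ARC",
          "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]:
    stack, queue, arcs = apply(t, stack, queue, arcs)
print(arcs)   # [('saw', 'John'), ('saw', 'Mary'), ('ROOT', 'saw')]
```

Because each transition is predicted from the current configuration, the parser builds the tree incrementally, which is the property the authors argue mirrors their human annotators.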
Parsing Models
We consider two different approaches to learning a temporal dependency parser: a shift-reduce model (Nivre, 2008) and a graph-based model (McDonald et al., 2005).
Parsing Models
4.2 Graph-Based Parsing Model
Parsing Models
Graph-based models are an alternative dependency parsing model, which assembles a graph with weighted edges between all pairs of words, and selects the tree-shaped subset of this graph that gives the highest total score (Fig.
graph-based is mentioned in 11 sentences in this paper.
Liu, Shujie and Li, Chi-Ho and Li, Mu and Zhou, Ming
Abstract
We convert such graph-based translation consensus from similar source strings into useful features both for n-best output re-ranking and for decoding algorithm.
Graph-based Structured Learning
In general, a graph-based model assigns labels to instances by considering the labels of similar instances.
Graph-based Structured Learning
The gist of the graph-based model is that, if two instances are connected by a strong edge, then their labels tend to be the same (Zhu, 2005).
Graph-based Structured Learning
This scenario differs from the general case of graph-based model in two aspects.
Graph-based Translation Consensus
Our MT system with graph-based translation consensus adopts the conventional log-linear model.
Graph-based Translation Consensus
Based on the commonly used features, two kinds of features are added to equation (1): one is graph-based consensus features, which capture consensus among the translations of similar sentences/spans; the other is local consensus features, which capture consensus among the translations of the same sentence/span.
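The log-linear combination described above can be illustrated with a small sketch. The feature names and all numeric values below are invented for illustration; the point is only that the consensus features enter the model as additional weighted terms alongside the standard ones.

```python
# Hypothetical feature values for one candidate translation.
features = {
    "translation_model": -2.1,
    "language_model": -3.4,
    "graph_consensus": 0.8,   # consensus with translations of similar spans
    "local_consensus": 0.5,   # consensus among translations of the same span
}
# Hypothetical tuned feature weights.
weights = {
    "translation_model": 1.0,
    "language_model": 0.7,
    "graph_consensus": 0.3,
    "local_consensus": 0.2,
}
# Log-linear model: the candidate's score is the weighted sum of features.
score = sum(weights[k] * v for k, v in features.items())
print(round(score, 2))   # -4.14
```

At decoding (or n-best re-ranking) time, the candidate with the highest such score is selected, so the consensus features directly influence which translation wins.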
Introduction
Alexandrescu and Kirchhoff (2009) proposed a graph-based semi-supervised model to re-rank n-best translation output.
Introduction
In this paper, we attempt to leverage translation consensus among similar (spans of) source sentences in bilingual training data, by a novel graph-based model of translation consensus.
graph-based is mentioned in 30 sentences in this paper.