Index of papers in Proc. ACL 2013 that mention
  • shift-reduce
Liu, Yang
Abstract
We introduce a shift-reduce parsing algorithm for phrase-based string-to-dependency translation.
Abstract
To resolve conflicts in shift-reduce parsing, we propose a maximum entropy model trained on the derivation graph of training data.
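The excerpt above describes scoring conflicting parser actions with a maximum entropy model. As a minimal sketch of that idea (the feature and weight names below are illustrative, not taken from the paper), each candidate action gets probability proportional to exp(w · f):

```python
import math

def maxent_action_probs(features, weights):
    """Maximum-entropy scoring of conflicting shift-reduce actions:
    p(action | context) is proportional to exp(w . f(action, context)).
    `features` maps action -> {feature_name: value}; `weights` maps
    (action, feature_name) -> weight. Hypothetical names, for illustration."""
    scores = {a: sum(weights.get((a, f), 0.0) * v for f, v in feats.items())
              for a, feats in features.items()}
    z = sum(math.exp(s) for s in scores.values())  # partition function
    return {a: math.exp(s) / z for a, s in scores.items()}

# Toy conflict between SHIFT and REDUCE with one shared feature:
probs = maxent_action_probs(
    {"SHIFT": {"stack_size": 1.0}, "REDUCE": {"stack_size": 1.0}},
    {("SHIFT", "stack_size"): 2.0})
```

In the paper the model is trained on a derivation graph of the training data; the sketch only shows the inference-time softmax over action scores.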
Introduction
In this paper, we propose a shift-reduce parsing algorithm for phrase-based string-to-dependency translation.
Introduction
4. exploiting syntactic information: as the shift-reduce parsing algorithm generates target language dependency trees in decoding, dependency language models (Shen et al., 2008; Shen et al., 2010) can be used to encourage linguistically-motivated reordering.
Introduction
2 Shift-Reduce Parsing for Phrase-based String-to-Dependency Translation
shift-reduce is mentioned in 27 sentences in this paper.

Zhu, Muhua and Zhang, Yue and Chen, Wenliang and Zhang, Min and Zhu, Jingbo
Abstract
Shift-reduce dependency parsers give comparable accuracies to their chart-based counterparts, yet the best shift-reduce constituent parsers still lag behind the state-of-the-art.
Abstract
One important reason is the existence of unary nodes in phrase structure trees, which leads to different numbers of shift-reduce actions between different outputs for the same input.
Abstract
We propose a simple yet effective extension to the shift-reduce process, which eliminates size differences between action sequences in beam-search.
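The extension described above pads shorter action sequences so all beam candidates have the same length (the paper names this padding action IDLE; the helper below is a sketch of the padding step only, not the authors' parser):

```python
def pad_actions(seqs):
    """Pad shorter shift-reduce action sequences with IDLE so every
    candidate in the beam has the same length, keeping model scores
    comparable across candidates with different numbers of unary actions."""
    longest = max(len(s) for s in seqs)
    return [s + ["IDLE"] * (longest - len(s)) for s in seqs]

# A derivation with an extra UNARY action is one step longer;
# padding equalizes the two candidates:
padded = pad_actions([["SHIFT"], ["SHIFT", "UNARY"]])
# padded == [["SHIFT", "IDLE"], ["SHIFT", "UNARY"]]
```

Without padding, unary nodes make some derivations use more actions than others for the same input, so beam search would compare scores summed over different numbers of steps.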
Baseline parser
We adopt the parser of Zhang and Clark (2009) for our baseline, which is based on the shift-reduce process of Sagae and Lavie (2005), and employs global perceptron training and beam search.
Baseline parser
2.1 Vanilla Shift-Reduce
Baseline parser
Shift-reduce parsing is based on a left-to-right scan of the input sentence.
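The left-to-right shift-reduce process in the excerpt above can be sketched as a stack-and-buffer loop (an arc-standard-style illustration, hypothetical code rather than the baseline parser itself):

```python
def shift_reduce_parse(words, actions):
    """Replay a shift-reduce derivation: SHIFT moves the next buffer
    word onto the stack; REDUCE-L / REDUCE-R pop the top two stack
    items and attach the left / right one as a dependent of the other."""
    stack, buffer, arcs = [], list(words), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act == "REDUCE-L":            # left item becomes the dependent
            dep, head = stack[-2], stack[-1]
            arcs.append((head, dep))
            del stack[-2]
        elif act == "REDUCE-R":            # right item becomes the dependent
            head, dep = stack[-2], stack[-1]
            arcs.append((head, dep))
            stack.pop()
    return stack, arcs

# "I saw her": the verb heads both arguments.
stack, arcs = shift_reduce_parse(
    ["I", "saw", "her"],
    ["SHIFT", "SHIFT", "REDUCE-L", "SHIFT", "REDUCE-R"])
# stack == ["saw"]; arcs == [("saw", "I"), ("saw", "her")]
```

A real parser chooses the action sequence with a trained model (here, a global perceptron with beam search) instead of replaying a given one.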
Introduction
Transition-based parsers employ a set of shift-reduce actions and perform parsing using a sequence of state transitions.
Introduction
We propose an extension to the shift-reduce process to address this problem, which gives significant improvements to the parsing accuracies.
Semi-supervised Parsing with Large Data
This section discusses how to extract information from unlabeled data or auto-parsed data to further improve shift-reduce parsing accuracies.
Semi-supervised Parsing with Large Data
Based on the information, we propose a set of novel features specifically designed for shift-reduce constituent parsing.
shift-reduce is mentioned in 17 sentences in this paper.
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Introduction
With regard to the task of parsing itself, an important advantage of the character-level syntax trees is that they allow word segmentation, part-of-speech (POS) tagging and parsing to be performed jointly, using an efficient CKY-style or shift-reduce algorithm.
Introduction
Our model is based on the discriminative shift-reduce parser of Zhang and Clark (2009; 2011), which is a state-of-the-art word-based phrase-structure parser for Chinese.
Introduction
We extend their shift-reduce framework, adding more transition actions for word segmentation and POS tagging, and defining novel features that capture character information.
Related Work
Our work is based on the shift-reduce operations of their work, while we introduce additional operations for segmentation and POS tagging.
shift-reduce is mentioned in 4 sentences in this paper.
Setiawan, Hendra and Zhou, Bowen and Xiang, Bing and Shen, Libin
Decoding
The algorithm bears a close resemblance to the shift-reduce algorithm where a stack is used to accumulate (partial) information about a, M_L and M_R for each a ∈ A in the derivation.
Introduction
We implement an efficient shift-reduce algorithm that facilitates the accumulation of partial context in a bottom-up fashion, allowing our model to influence the translation process even in the absence of full context.
Introduction
In Section 6, we describe our shift-reduce algorithm which inte-
shift-reduce is mentioned in 3 sentences in this paper.