Abstract | The resulting shift-reduce discourse parser obtains substantial improvements over the previous state-of-the-art in predicting relations and nuclearity on the RST Treebank.
Introduction | Our method is implemented as a shift-reduce discourse parser (Marcu, 1999; Sagae, 2009). |
Model | 2.1 Shift-reduce discourse parsing |
Model | We construct RST Trees using shift-reduce parsing, as first proposed by Marcu (1999). |
Model | C: total number of classes, corresponding to the possible shift-reduce operations
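The shift-reduce construction of RST trees described above can be sketched as a simple stack-and-queue loop. This is an illustrative sketch, not the paper's implementation: the operation labels, the `choose_action` policy, and the tuple-based tree representation are all assumptions for demonstration.

```python
# Hypothetical sketch of shift-reduce discourse parsing (Marcu, 1999 style).
# The action set and tree encoding are illustrative assumptions.

def parse(edus, choose_action):
    """Build a binary discourse tree from a list of EDUs.

    choose_action(stack, queue) returns one of the C possible operations:
    "shift", or a tuple ("reduce", nuclearity, relation).
    """
    stack, queue = [], list(edus)
    while queue or len(stack) > 1:
        action = choose_action(stack, queue)
        if action == "shift":
            stack.append(queue.pop(0))  # move the next EDU onto the stack
        else:
            _, nuclearity, relation = action
            right = stack.pop()
            left = stack.pop()
            # merge the top two subtrees under a labeled discourse relation
            stack.append((relation, nuclearity, left, right))
    return stack[0]

# A trivial stand-in policy: shift while input remains, then reduce
# with one fixed label (a real parser would classify among C operations).
def greedy(stack, queue):
    return "shift" if queue else ("reduce", "NS", "Elaboration")

tree = parse(["e1", "e2", "e3"], greedy)
```

With the greedy policy this produces a right-branching tree over the three EDUs; a trained classifier over the C operation classes would replace `greedy`.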
Abstract | This paper presents the first dependency model for a shift-reduce CCG parser. |
Introduction | In this paper, we fill a gap in the literature by developing the first dependency model for a shift-reduce CCG parser. |
Introduction | Shift-reduce parsing applies naturally to CCG (Zhang and Clark, 2011), and the left-to-right, incremental nature of the decoding fits with CCG’s cognitive claims. |
Introduction | Results on the standard CCGBank tests show that our parser achieves absolute labeled F-score gains of up to 0.5 over the shift-reduce parser of Zhang and Clark (2011); and up to 1.05 and 0.64 over the normal-form and hybrid models of Clark and Curran (2007), respectively. |
Shift-Reduce with Beam-Search | This section describes how shift-reduce techniques can be applied to CCG, following Zhang and Clark (2011). |
Shift-Reduce with Beam-Search | First we describe the deterministic process a parser would follow when tracing out a single, correct derivation; then we describe how a model of normal-form derivations (more precisely, of the sequence of shift-reduce actions leading to a normal-form derivation) can be used with beam search to build a nondeterministic parser that selects the highest-scoring sequence of actions.
Shift-Reduce with Beam-Search | Note this section only describes a normal-form derivation model for shift-reduce parsing. |
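The beam-search decoding described above can be sketched as follows. This is a simplified illustration, not Zhang and Clark's implementation: the state is reduced to counters, the action set is collapsed to `shift`/`reduce`, and the `score` function stands in for a trained model over action histories.

```python
# Illustrative beam-search decoder over shift-reduce action sequences
# (in the spirit of Zhang and Clark, 2011). State representation,
# action set, and scoring are simplified assumptions.
import heapq

def beam_search(n_words, score, beam_size=4):
    """Return (total_score, actions) for the highest-scoring derivation.

    A state is (total_score, stack_size, words_consumed, actions); a
    derivation is complete when every word is consumed and one item
    remains on the stack. score(actions, action) is the model score
    of appending `action` to the history `actions`.
    """
    beam = [(0.0, 0, 0, ())]
    best = None
    while beam:
        candidates = []
        for total, stack, consumed, actions in beam:
            if consumed == n_words and stack == 1:
                # complete derivation: keep the best one seen so far
                if best is None or total > best[0]:
                    best = (total, actions)
                continue
            next_actions = []
            if consumed < n_words:
                next_actions.append("shift")
            if stack >= 2:
                next_actions.append("reduce")
            for a in next_actions:
                s = stack + 1 if a == "shift" else stack - 1
                c = consumed + 1 if a == "shift" else consumed
                candidates.append((total + score(actions, a), s, c, actions + (a,)))
        # prune to the beam_size highest-scoring partial derivations
        beam = heapq.nlargest(beam_size, candidates)
    return best

# Toy model: every shift scores 1.0, every reduce 0.5.
result = beam_search(3, lambda actions, a: 1.0 if a == "shift" else 0.5)
```

Every complete derivation over three words uses three shifts and two reduces, so under this toy model all derivations tie at 4.0; a trained model would break such ties and make the beam pruning meaningful.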