Index of papers in Proc. ACL 2013 that mention
  • beam search
Choi, Jinho D. and McCallum, Andrew
Abstract
We present a novel approach, called selectional branching, which uses confidence estimates to decide when to employ a beam, providing the accuracy of beam search at speeds close to a greedy transition-based dependency parsing approach.
Abstract
Selectional branching is guaranteed to perform fewer transitions than beam search yet performs just as accurately.
Abstract
With the standard setup, our parser shows an unlabeled attachment score of 92.96% and a parsing speed of 9 milliseconds per sentence, which is faster and more accurate than the current state-of-the-art transition-based parser that uses beam search.
Introduction
Greedy transition-based dependency parsing has been widely deployed because of its speed (Cer et al., 2010); however, state-of-the-art accuracies have been achieved by globally optimized parsers using beam search (Zhang and Clark, 2008; Huang and Sagae, 2010; Zhang and Nivre, 2011; Bohnet and Nivre, 2012).
Introduction
Coupled with dynamic programming, transition-based dependency parsing with beam search can be done very efficiently and gives significant improvement to parsing accuracy.
Introduction
One downside of beam search is that it always uses a fixed beam size, even when a smaller beam would be sufficient for good results.
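The fixed-beam behavior criticized above can be seen in a minimal generic beam search sketch (the `expand` and `score` callbacks and the bit-string toy task below are illustrative, not from the paper):

```python
import heapq

def beam_search(start, expand, score, beam_size, steps):
    """Generic beam search: at every step, expand all hypotheses in the
    beam and retain the beam_size best ones -- always beam_size of them,
    even when fewer would suffice for the same final result."""
    beam = [start]
    for _ in range(steps):
        candidates = [succ for hyp in beam for succ in expand(hyp)]
        # Fixed-width pruning: this is the cost the extract points at.
        beam = heapq.nlargest(beam_size, candidates, key=score)
    return max(beam, key=score)
```

For example, growing bit strings and scoring by the number of 1s, `beam_search("", lambda h: [h + "0", h + "1"], lambda h: h.count("1"), 2, 3)` returns `"111"`.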
Selectional branching
For transition-based parsing, state-of-the-art accuracies have been achieved by parsers optimized on multiple transition sequences using beam search,
Selectional branching
Compared to beam search that always performs b · t transitions, selectional branching is guaranteed to perform fewer transitions given the same beam size because d ≤ b and e > 0 except for d = 1, in which case, no branching happens.
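The transition-count comparison in the extract can be checked with a small arithmetic sketch (d and e are the paper's symbols; their exact definitions are in the paper, and the numbers below are made up for illustration):

```python
def beam_transitions(b, t):
    # Beam search always expands all b beam items at each of t steps.
    return b * t

def selectional_branching_transitions(d, t, e):
    # Bound quoted in the extract: d*t - e transitions, with d <= b
    # and e > 0, so this is strictly below b*t for the same beam size.
    return d * t - e
```

For instance, with b = 8, t = 10, d = 4, e = 6, selectional branching performs at most 34 transitions versus 80 for beam search.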
Selectional branching
higher parsing accuracy than the current state-of-the-art transition-based parser using beam search, and performs about 3 times faster.
beam search is mentioned in 19 sentences in this paper.
Topics mentioned in this paper:
Ma, Ji and Zhu, Jingbo and Xiao, Tong and Yang, Nan
Abstract
In this paper, we combine easy-first dependency parsing and POS tagging algorithms with beam search and structured perceptron.
Abstract
The proposed solution can also be applied to combine beam search and structured perceptron with other systems that exhibit spurious ambiguity.
Bk ← BESTS(x, Bk-1, w)
3 Easy-first with beam search
Bk ← BESTS(x, Bk-1, w)
In this section, we introduce easy-first with beam search in our own notations that will be used throughout the rest of this paper.
Bk ← BESTS(x, Bk-1, w)
Following (Huang et al., 2012), in order to formalize beam search, we also use the argtop^s_{y∈Y} w · φ(y) operation, which returns the top s action sequences in Y according to w · φ(y).
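The argtop operation in the extract can be sketched directly: rank candidates by a linear score w · φ(y) and keep the top s (the weight and feature names below are illustrative, not the paper's feature set):

```python
import heapq

def argtop_s(Y, w, phi, s):
    """Return the top-s action sequences y in Y ranked by the linear
    model score w . phi(y). w is a feature-weight dict; phi maps a
    sequence y to a sparse dict of feature counts."""
    def score(y):
        return sum(w.get(f, 0.0) * v for f, v in phi(y).items())
    return heapq.nlargest(s, Y, key=score)
```

For example, with w = {"good": 1.0, "bad": -1.0} and phi counting letters, `argtop_s(["gg", "gb", "bb"], w, phi, 2)` returns `["gg", "gb"]`.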
Easy-first dependency parsing
Algorithm 1: Easy-first with beam search
Introduction
To enlarge the search space, a natural extension to greedy search is beam search .
Introduction
Recent work also shows that beam search together with perceptron-based global learning (Collins, 2002) enables the use of nonlocal features that help improve parsing performance without overfitting (Zhang and Nivre, 2012).
Introduction
Due to these advantages, beam search and global learning have been applied to many NLP tasks (Collins and
beam search is mentioned in 16 sentences in this paper.
Topics mentioned in this paper:
Wang, Lu and Raghavan, Hema and Castelli, Vittorio and Florian, Radu and Cardie, Claire
Abstract
An innovative beam search decoder is proposed to efficiently find highly probable compressions.
Results
Furthermore, our HEAD-driven beam search method with MULTI-scorer beats all systems on DUC 2006 and all systems on DUC 2007 except the best system in terms of R-2 (p < 0.01).
Results
Four native speakers who are undergraduate students in computer science (none are authors) performed the task. We compare our system based on HEAD-driven beam search with MULTI-scorer to the best systems in DUC 2006 achieving top ROUGE scores (Jagarlamudi et al., 2006) (Best DUC system (ROUGE)) and top linguistic quality scores (Lacatusu et al., 2006) (Best DUC system (LQ)).
Sentence Compression
As the space of possible compressions is exponential in the number of leaves in the parse tree, instead of looking for the globally optimal solution, we use beam search to find a set of highly likely compressions and employ a language model trained on a large corpus for evaluation.
Sentence Compression
A Beam Search Decoder.
Sentence Compression
The beam search decoder (see Algorithm 1) takes as input the sentence’s parse tree T = t0 t1 …
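The compression search described above can be sketched as a beam search over keep/drop decisions. The paper's decoder operates over parse-tree nodes with a trained scorer; the token-level search and the stand-in length-based scorer below are simplifying assumptions:

```python
import heapq

def compress(tokens, score, beam_size):
    """Beam-search decoder sketch for sentence compression: for each
    token decide keep vs. drop, pruning to the beam_size best partial
    compressions. `score` stands in for the language-model/feature
    scorer used in the paper."""
    beam = [()]  # each hypothesis is a tuple of kept tokens
    for tok in tokens:
        candidates = []
        for hyp in beam:
            candidates.append(hyp)           # drop tok
            candidates.append(hyp + (tok,))  # keep tok
        beam = heapq.nlargest(beam_size, candidates, key=score)
    return max(beam, key=score)
```

With a toy scorer rewarding words of length >= 4 and penalizing shorter ones, `compress(("the", "quick", "fox", "ran"), score, 4)` returns `("quick",)`.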
beam search is mentioned in 9 sentences in this paper.
Topics mentioned in this paper:
Wu, Yuanbin and Ng, Hwee Tou
Experiments
We compare our ILP approach with two other systems: the beam search decoder of (Dahlmeier and Ng, 2012a) which achieves the best published performance to date on the HOO 2011 data set, and UI Run1 (Rozovskaya et al., 2011) which achieves the best performance among all participating systems at the HOO 2011 shared task.
Experiments
             P      R      F     |  P      R      F
UI Run1      40.86  11.21  17.59 |  54.61  14.57  23.00
Beam search  30.28  19.17  23.48 |  33.59  20.53  25.48
ILP          20.54  27.93  23.67 |  21.99  29.04  25.03
Experiments
The results of the beam search decoder and UI Run1 are taken from Table 2 of (Dahlmeier and Ng, 2012a).
beam search is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Li, Qi and Ji, Heng and Huang, Liang
Introduction
Therefore we employ beam search in decoding, and train the model using the early-update perceptron variant tailored for beam search (Collins and Roark, 2004; Huang et al., 2012).
Joint Framework for Event Extraction
3.1 Structured perceptron with beam search
Joint Framework for Event Extraction
Figure 2 describes the skeleton of the perceptron training algorithm with beam search.
Joint Framework for Event Extraction
In each step of the beam search, if the prefix of the oracle assignment y falls out of the beam, then the top result in the beam is returned for early update.
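The early-update rule in the extract can be sketched on a toy per-token tagging task. The two-tag setup, the (word, tag) features, and the update details below are illustrative assumptions, not the paper's event-extraction model (a full implementation would also update at sentence end when the final best differs from the gold sequence):

```python
import heapq

def score(w, feats):
    # Linear model: sum the weights of the active features.
    return sum(w.get(f, 0.0) for f in feats)

def train_early_update(sentences, labels, w, beam_size, epochs=3):
    """Structured perceptron with beam search and early update: as soon
    as the gold prefix falls out of the beam, reward the gold prefix,
    penalize the current top hypothesis, and move on."""
    for _ in range(epochs):
        for words, gold in zip(sentences, labels):
            beam = [()]  # partial tag sequences
            for i in range(len(words)):
                cands = [hyp + (tag,) for hyp in beam for tag in ("N", "V")]
                beam = heapq.nlargest(
                    beam_size, cands,
                    key=lambda h: score(w, [(words[j], h[j]) for j in range(len(h))]))
                gold_prefix = tuple(gold[: i + 1])
                if gold_prefix not in beam:
                    best = beam[0]  # current top hypothesis
                    for j in range(i + 1):
                        w[(words[j], gold_prefix[j])] = w.get((words[j], gold_prefix[j]), 0.0) + 1.0
                        w[(words[j], best[j])] = w.get((words[j], best[j]), 0.0) - 1.0
                    break  # early update: abandon the rest of the sentence
    return w
```

After training on the single example ("dogs", "run") with gold tags ("N", "V"), the weights separate ("run", "V") from ("run", "N").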
Related Work
In comparison, our approach is a unified framework based on beam search , which allows us to exploit arbitrary global features efficiently.
beam search is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Socher, Richard and Bauer, John and Manning, Christopher D. and Ng, Andrew Y.
Introduction
The subsequent section will then describe a bottom-up beam search and its approximation for finding the optimal tree.
Introduction
One could use a bottom-up beam search, keeping a k-best list at every cell of the chart, possibly for each syntactic category.
Introduction
This beam search inference procedure is still considerably slower than using only the simplified base PCFG, especially since it has a small state space (see next section for
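The bottom-up chart search with a k-best list per cell can be sketched as follows. The score callbacks stand in for the paper's grammar/model scores, and this sketch ignores syntactic categories, which the extract notes the lists could additionally be split by:

```python
import heapq

def kbest_chart(words, unary_score, binary_score, k):
    """Bottom-up beam search over a parse chart: each cell (i, j) keeps
    only the k highest-scoring derivations for the span words[i:j]."""
    n = len(words)
    chart = {}
    for i in range(n):  # width-1 spans: the leaves
        chart[(i, i + 1)] = [(unary_score(words[i]), words[i])]
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cands = []
            for m in range(i + 1, j):  # all split points
                for ls, lt in chart[(i, m)]:
                    for rs, rt in chart[(m, j)]:
                        cands.append((ls + rs + binary_score(lt, rt), (lt, rt)))
            # k-best pruning per cell: this is the beam.
            chart[(i, j)] = heapq.nlargest(k, cands, key=lambda c: c[0])
    return chart[(0, n)]
```

With three words, a constant leaf score of 1.0, and a zero combination score, the top cell holds the two possible bracketings, each scoring 3.0.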
beam search is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Özbal, Gözde and Pighin, Daniele and Strapparava, Carlo
Architecture of BRAINSUP
A beam search in the space of all possible lexicalizations of a syntactic pattern promotes the words with the highest likelihood of satisfying the user specification.
Architecture of BRAINSUP
CompatiblePatterns(·) finds the most frequent syntactic patterns in P that are compatible with the user specification, as explained in Section 3.1; FillInPattern(·) carries out the beam search, and returns the best solutions generated for each pattern p given U.
Architecture of BRAINSUP
With the compatible patterns selected, we can initiate a beam search in the space of all possible lexicalizations of the patterns, i.e., the space of all sentences that can be generated by respecting the syntactic constraints encoded by each pattern.
beam search is mentioned in 4 sentences in this paper.
Topics mentioned in this paper: