A Sentence Trimmer with CRFs | In the context of sentence compression, a linear-programming-based approach such as that of Clarke and Lapata (2006) certainly deserves consideration. |
A Sentence Trimmer with CRFs | Note that a sentence compression can be represented as an array of binary labels, one marking words to be retained in the compression and the other marking those to be dropped. |
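The binary-label view above can be sketched as follows. This is a minimal illustration under assumed labels (the sentence and label values are hypothetical, not from the paper): label 1 marks a word kept in the compression, label 0 a word dropped.

```python
# Hypothetical example of sentence compression as binary labeling:
# 1 = retain the word, 0 = drop it.
sentence = ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"]
labels   = [1,     0,       0,       1,     1,       1,     0,     0,      1]

# The compression is simply the subsequence of retained words.
compression = [word for word, keep in zip(sentence, labels) if keep == 1]
print(" ".join(compression))  # -> "the fox jumps over dog"
```

Under this encoding, a sequence-labeling model such as a CRF can score entire label arrays jointly rather than deciding each word in isolation.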
Conclusions | This paper introduced a novel approach to sentence compression in Japanese, which combines a syntactically motivated generation model and CRFs, in or- |
Introduction | For better or worse, much of the prior work on sentence compression (Riezler et al., 2003; McDonald, 2006; Turner and Charniak, 2005) turned to a single corpus developed by Knight and Marcu (2002) (K&M, henceforth) for evaluating their approaches. |
Introduction | Despite its limited scale, prior work in sentence compression relied heavily on this particular corpus for establishing results (Turner and Charniak, 2005; McDonald, 2006; Clarke and Lapata, 2006; Galley and McKeown, 2007). |
Introduction | An obvious benefit of using CRFs for sentence compression is that the model provides a general (and principled) probabilistic framework that permits information from various sources to be integrated toward compressing a sentence, a property that K&M's approach does not share. |
The Dependency Path Model | In what follows, we describe in some detail a prior approach to sentence compression in Japanese, which we call the "dependency path model" (DPM). |