Index of papers in Proc. ACL 2010 that mention
  • context information
Qazvinian, Vahed and Radev, Dragomir R.
Abstract
In this paper, we propose a general framework based on probabilistic inference to extract such context information from scientific papers.
Abstract
Our experiments show greater pyramid scores for surveys generated using such context information rather than citation sentences alone.
Conclusion
Our experiments on generating surveys for Question Answering and Dependency Parsing show how surveys generated using such context information along with citation sentences have higher quality than those built using citations alone.
Conclusion
Our future goal is to combine summarization and bibliometric techniques towards building automatic surveys that employ context information as an important part of the generated surveys.
Introduction
We refer to such implicit citations, which contain information about a specific secondary source but do not explicitly cite it, as sentences with context information, or context sentences for short.
Proposed Method
In this section we propose our methodology that enables us to identify the context information of a cited paper.
Proposed Method
To find the sentences from a paper that form the context information of a given cited paper, we build an MRF in which a hidden node x_i and an observed node y_i correspond to each sentence. [a toy sketch of such a construction follows this paper's excerpts]
context information is mentioned in 7 sentences in this paper.
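The Proposed Method excerpts above describe building an MRF with one hidden node x_i and one observed node y_i per sentence. The following is a minimal chain-structured sketch of that kind of construction in Python; the binary labels, the relevance scores, the hand-picked potentials, and the Viterbi inference are all assumptions made for illustration, not the paper's actual formulation.

# Minimal sketch of the kind of MRF described above: one hidden label x_i
# (1 = context sentence, 0 = not) and one observed relevance score y_i per
# sentence.  The chain structure, potentials, and scores are illustrative
# assumptions, not the paper's exact model.

def viterbi_chain(y, unary_weight=1.0, pairwise_weight=0.8):
    """Exact MAP inference on a chain-structured binary MRF.

    y[i] is an observed relevance score in [0, 1] for sentence i (for
    example, a similarity to the cited paper); higher scores favour x_i = 1.
    """
    n = len(y)
    # Unary log-potentials phi[i][label].
    phi = [[unary_weight * (1.0 - y[i]), unary_weight * y[i]] for i in range(n)]
    # Pairwise log-potential rewarding neighbouring sentences that agree.
    psi = [[pairwise_weight, 0.0], [0.0, pairwise_weight]]

    # Standard Viterbi recursion along the chain.
    score, back = [phi[0][:]], []
    for i in range(1, n):
        row, ptr = [], []
        for label in (0, 1):
            cands = [score[-1][prev] + psi[prev][label] for prev in (0, 1)]
            best = 0 if cands[0] >= cands[1] else 1
            row.append(cands[best] + phi[i][label])
            ptr.append(best)
        score.append(row)
        back.append(ptr)

    # Backtrack the best label sequence.
    labels = [0 if score[-1][0] >= score[-1][1] else 1]
    for ptr in reversed(back):
        labels.append(ptr[labels[-1]])
    return list(reversed(labels))

# Toy usage: sentences 2-4 look related to the cited paper.
print(viterbi_chain([0.1, 0.2, 0.9, 0.8, 0.7, 0.1]))  # -> [0, 0, 1, 1, 1, 0]

The pairwise potential is what allows a sentence with a weak individual score to be labeled as context when its neighbours are strongly relevant, which is the usual reason to model the sentences jointly rather than classify each one independently.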
Sun, Xu and Gao, Jianfeng and Micol, Daniel and Quirk, Chris
A Phrase-Based Error Model
Rather than replacing single words in isolation, this model replaces sequences of words with sequences of words, thus incorporating contextual information.
A Phrase-Based Error Model
Notice that when we set L=1, the phrase-based error model is reduced to a word-based error model which assumes that words are transformed independently from C to Q, without taking into account any contextual information. [a toy sketch of this word-based special case follows this paper's excerpts]
Introduction
Compared to traditional error models that account for transformation probabilities between single characters (Kernighan et al., 1990) or sub-word strings (Brill and Moore, 2000), the phrase-based model is more powerful in that it captures some contextual information by retaining inter-term dependencies.
Introduction
We show that this information is crucial to detect the correction of a query term, because unlike in regular written text, any query word can be a valid search term and in many cases the only way for a speller system to make the judgment is to explore its usage according to the contextual information.
Related Work
Typically, a language model (source model) is used to capture contextual information, while an error model (channel model) is considered to be context free in that it does not take into account any contextual information in modeling word transformation probabilities.
Related Work
In this study we argue that it is beneficial to capture contextual information in the error model.
context information is mentioned in 6 sentences in this paper.
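The excerpts above rest on the standard noisy-channel view of query spelling correction, in which a language model (source model) carries the contextual evidence while a word-based error model (channel model, the L = 1 case) transforms words independently. The sketch below illustrates that decomposition; the candidate corrections, the toy probabilities, and the function names are invented for illustration and are not the paper's phrase-based model or data.

import math

# Toy noisy-channel speller: correction C* = argmax_C P(C) * P(Q | C).
# The tiny language model, the candidate set, and the error probabilities
# below are invented for illustration; they are not the paper's data or its
# phrase-based parameters.

LANGUAGE_MODEL = {               # source model P(C): carries the context
    "new york times": 0.6,
    "new work times": 0.1,
}
WORD_ERROR_MODEL = {             # channel model P(q | c), context free (L = 1)
    ("new", "new"): 0.95,
    ("york", "york"): 0.8,
    ("york", "work"): 0.2,       # probability of typing "work" for "york"
    ("work", "work"): 0.9,
    ("times", "times"): 0.95,
}

def word_error_logprob(query, correction):
    """Word-based error model: each query word is generated independently
    from its aligned correction word (the L = 1 special case)."""
    total = 0.0
    for q, c in zip(query.split(), correction.split()):
        total += math.log(WORD_ERROR_MODEL.get((c, q), 1e-6))
    return total

def correct(query, candidates):
    """Rank candidate corrections by log P(C) + log P(Q | C)."""
    scored = [(math.log(LANGUAGE_MODEL.get(c, 1e-9)) + word_error_logprob(query, c), c)
              for c in candidates]
    return max(scored)[1]

print(correct("new work times", ["new york times", "new work times"]))
# -> "new york times": the language model supplies the contextual evidence
#    that the context-free word error model cannot.

Replacing word_error_logprob with a model over multi-word phrases (L > 1) is what lets the channel model itself carry contextual information, which is the step the excerpts argue for.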
Thater, Stefan and Fürstenau, Hagen and Pinkal, Manfred
Conclusion
Another direction for further study will be the generalization of our model to larger syntactic contexts, including more than only the direct neighbors in the dependency graph, ultimately incorporating context information from the whole sentence in a recursive fashion.
Experiments: Ranking Paraphrases
Note that the context information is the same for both words.
Experiments: Ranking Paraphrases
The main difference between verbs on the one hand, and nouns, adjectives, and adverbs on the other hand, is that verbs typically come with a rich context—subject, object, and so on—while non-verbs often have either no dependents at all or only closed class dependents such as determiners which provide only limited contextual information, if any at all.
The model
A more flexible approach than simple filtering, however, is to re-weight those dimensions with context information. [a toy sketch of such re-weighting follows this paper's excerpts]
context information is mentioned in 4 sentences in this paper.
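The last excerpt contrasts filtering vector dimensions with re-weighting them using context information. The sketch below shows that idea on toy co-occurrence vectors; the vectors, the example words, and the compatibility weighting are assumptions for illustration, much simpler than the paper's syntactically structured vector-space model.

# Toy sketch of re-weighting the dimensions of a word vector with context
# information instead of simply filtering them.  The co-occurrence vectors,
# the example words, and the compatibility weighting are invented for
# illustration and are much simpler than the paper's syntax-based model.

VECTORS = {
    # dimension -> co-occurrence weight
    "acquire":   {"company": 2.0, "skill": 1.5, "language": 1.0, "shares": 1.8},
    # a context word observed with the target (e.g. its direct object)
    "knowledge": {"skill": 2.2, "language": 1.7, "company": 0.1, "shares": 0.2},
}

def reweight(target_vec, context_vec):
    """Scale every dimension of the target vector by how strongly the context
    word supports it, rather than dropping unsupported dimensions outright."""
    return {dim: w * context_vec.get(dim, 0.0) for dim, w in target_vec.items()}

contextualized = reweight(VECTORS["acquire"], VECTORS["knowledge"])
print(sorted(contextualized.items(), key=lambda kv: -kv[1]))
# In the context of "knowledge", the 'skill' and 'language' dimensions now
# dominate, while 'company' and 'shares' are down-weighted but kept.

Setting a dimension's context weight to zero recovers plain filtering; keeping small nonzero weights is what makes this a re-weighting rather than a filter.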