Index of papers in Proc. ACL 2014 that mention
  • context information
Sun, Le and Han, Xianpei
Introduction
The features we used include characteristics of the relation instance, phrase properties and context information (see Section 3 for details).
Introduction
3.3 Context Information Feature
Introduction
The context information of a phrase node is critical for identifying the role and the importance of a subtree in the whole relation instance.
context information is mentioned in 10 sentences in this paper.
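The first and third snippets above describe feature groups built around a phrase node and its context. As a rough, hypothetical illustration of what such context features can look like (the window size and feature names below are my assumptions, not the feature set defined by Sun and Han), consider:

```python
# Illustrative sketch only: a hypothetical context-feature extractor for a
# phrase node, in the spirit of the snippets above. The feature names and the
# window size are assumptions, not the features defined in Sun and Han (2014).
def context_features(tokens, span, window=2):
    """Collect simple context features for the phrase covering tokens[span[0]:span[1]]."""
    start, end = span
    left = tokens[max(0, start - window):start]   # words to the left of the phrase
    right = tokens[end:end + window]              # words to the right of the phrase
    return {
        "left_context": " ".join(left) or "<BOS>",
        "right_context": " ".join(right) or "<EOS>",
        "phrase_length": end - start,
    }

# Example: the phrase "the company" inside a relation instance.
print(context_features(["shares", "of", "the", "company", "rose", "sharply"], (2, 4)))
```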
Tang, Duyu and Wei, Furu and Yang, Nan and Zhou, Ming and Liu, Ting and Qin, Bing
Related Work
SSWE outperforms MVSA by exploiting more contextual information in the sentiment predictor function.
Related Work
Among the three sentiment-specific word embeddings, SSWEu captures more context information and yields the best performance.
Related Work
SSWE outperforms MVSA and ReEmb by exploiting more context information of words and sentiment information of sentences, respectively.
context information is mentioned in 4 sentences in this paper.
van Gompel, Maarten and van den Bosch, Antal
Baselines
A context-insensitive yet informed baseline was constructed to assess the impact of L2 context information in translating L1 fragments.
Data preparation
Nevertheless, we hope to show that our automated way of test set generation is sufficient to test the feasibility of our core hypothesis that L1 fragments can be translated to L2 using L2 context information.
Introduction
The main research question is how to disambiguate an L1 word or phrase to its L2 translation based on an L2 context, and whether such cross-lingual contextual approaches provide added value compared to baseline models that are not context-informed, or compared to standard language models.
System
If so, we are done quickly and need not rely on context information.
context information is mentioned in 4 sentences in this paper.
Xiong, Deyi and Zhang, Min
Related Work
Rather than predicting word senses for ambiguous words, the reformulated WSD directly predicts target translations for source words with context information.
Related Work
Lexical selection: Our work is also related to lexical selection in SMT, where appropriate target lexical items for source words are selected by a statistical model with context information (Bangalore et al., 2007; Mauser et al., 2009).
Sense-Based Translation Model
The sense-based translation model estimates the probability that a source word is translated into a target phrase given contextual information, including word senses obtained using the HDP-based WSI as described in the last section.
WSI-Based Broad-Coverage Sense Tagger
A pseudo document is composed of either a bag of neighboring words of a word token, or the Part-of-Speech tags of neighboring words, or other contextual information elements.
context information is mentioned in 4 sentences in this paper.
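The last snippet describes building pseudo documents from a token's neighboring words and their Part-of-Speech tags for HDP-based word sense induction. A minimal sketch of that construction, with an assumed window size and tag format (not the paper's exact configuration):

```python
# Illustrative sketch only: building a "pseudo document" for one word token from
# its neighboring words and their POS tags, as the snippet above describes.
# The window size and the tag format are assumptions, not the paper's setup.
def pseudo_document(tokens, pos_tags, index, window=5):
    """Return a bag of context elements (neighbor words and their POS tags)."""
    lo = max(0, index - window)
    hi = min(len(tokens), index + window + 1)
    doc = []
    for i in range(lo, hi):
        if i == index:
            continue  # skip the target token itself
        doc.append(tokens[i])              # neighboring word
        doc.append("POS=" + pos_tags[i])   # its Part-of-Speech tag
    return doc

# Example: pseudo document for the token "bank" in a short sentence.
sent = ["he", "sat", "on", "the", "bank", "of", "the", "river"]
tags = ["PRP", "VBD", "IN", "DT", "NN", "IN", "DT", "NN"]
print(pseudo_document(sent, tags, sent.index("bank")))
```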
Muller, Philippe and Fabre, Cécile and Adam, Clémentine
Abstract
We first set up a human annotation of semantic links with or without contextual information to show the importance of the textual context in evaluating the relevance of semantic similarity, and to assess the prevalence of actual semantic relations between word tokens.
Evaluation of lexical similarity in context
To verify that this methodology is useful, we carried out a preliminary annotation to contrast judgments on lexical pairs with or without this contextual information.
Introduction
We present the experiments we set up to automatically filter semantic relations in context, with various groups of features that take into account information from the corpus used to build the thesaurus and contextual information related to occurrences of semantic neighbours (Section 3).
context information is mentioned in 3 sentences in this paper.
Wintrode, Jonathan and Khudanpur, Sanjeev
Abstract
We aim to improve spoken term detection performance by incorporating contextual information beyond traditional N-gram language models.
Introduction
We will show that by focusing on contextual information in the form of word repetition within documents, we obtain consistent improvement across five languages in the so-called Base Phase of the IARPA BABEL program.
Motivation
Clearly, topic or context information is relevant to a retrieval-type task, but we need a stable, consistent framework in which to apply it.
context information is mentioned in 3 sentences in this paper.
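The second snippet points to word repetition within documents as the contextual signal. A toy sketch of one way such repetition could adjust detection scores; the boost formula and its parameter are assumptions for illustration, not the method used in this paper:

```python
# Toy illustration only: nudging spoken-term-detection scores using word
# repetition within a document. The boost formula and parameter are my
# assumptions, not the approach of Wintrode and Khudanpur (2014).
from collections import Counter

def boost_scores(hits, alpha=0.1):
    """hits: list of (doc_id, score) detections for one keyword.
    Raise each score slightly when the same keyword is detected
    multiple times in the same document."""
    per_doc = Counter(doc for doc, _ in hits)  # repetitions per document
    return [(doc, min(1.0, score + alpha * (per_doc[doc] - 1)))
            for doc, score in hits]

# Example: the second and third hits share a document, so both get a small boost.
print(boost_scores([("d1", 0.4), ("d2", 0.5), ("d2", 0.3)]))
```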