Index of papers in Proc. ACL that mention
  • sense disambiguation
Li, Linlin and Roth, Benjamin and Sporleder, Caroline
Abstract
This paper presents a probabilistic model for sense disambiguation which chooses the best sense based on the conditional probability of sense paraphrases given a context.
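A minimal sketch of the decision rule this sentence describes, i.e. picking the sense whose paraphrases are most probable given the context; `sense_paraphrases` and `log_prob_paraphrase_given_context` are hypothetical stand-ins, not the authors' actual model.

# Illustrative sketch only: argmax over senses of P(paraphrases(sense) | context).
def disambiguate(target, context, sense_paraphrases, log_prob_paraphrase_given_context):
    """Return the sense key of `target` whose paraphrases best fit `context`."""
    best_sense, best_score = None, float("-inf")
    for sense, paraphrases in sense_paraphrases[target].items():
        # Score a sense by the total log-probability of its paraphrases in this context.
        score = sum(log_prob_paraphrase_given_context(p, context) for p in paraphrases)
        if score > best_score:
            best_sense, best_score = sense, score
    return best_sense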
Abstract
We propose three different instantiations of the model for solving sense disambiguation problems with different degrees of resource availability.
Abstract
The proposed models are tested on three different tasks: coarse-grained word sense disambiguation, fine-grained word sense disambiguation, and detection of literal vs. nonliteral usages of potentially idiomatic expressions.
Experimental Setup
Finally, we test our model on the related sense disambiguation task of distinguishing literal and nonliteral usages of potentially ambiguous expressions such as break the ice.
Experimental Setup
Sense Paraphrases: For word sense disambiguation tasks, the paraphrases of the sense keys are represented by information from WordNet 2.1.
Introduction
Word sense disambiguation (WSD) is the task of automatically determining the correct sense for a target word given the context in which it occurs.
Introduction
Recently, several researchers have experimented with topic models (Brody and Lapata, 2009; Boyd-Graber et al., 2007; Boyd-Graber and Blei, 2007; Cai et al., 2007) for sense disambiguation and induction.
Introduction
Previous approaches using topic models for sense disambiguation either embed topic features in a supervised model (Cai et al., 2007) or rely heavily on the structure of hierarchical lexicons such as WordNet (Boyd-Graber et al., 2007).
Related Work
Recently, a number of systems have been proposed that make use of topic models for sense disambiguation.
The Sense Disambiguation Model
3.2 The Sense Disambiguation Model
sense disambiguation is mentioned in 21 sentences in this paper.
Topics mentioned in this paper:
Yao, Limin and Riedel, Sebastian and McCallum, Andrew
Abstract
Experimental results show our proposed approach discovers dramatically more accurate clusters than models without sense disambiguation, and that incorporating global features, such as the document theme, is crucial.
Conclusion
Experimental results show our approach discovers precise relation clusters and outperforms a generative model approach and a clustering method which does not address sense disambiguation.
Evaluations
Without using sense disambiguation, the performance of hierarchical clustering decreases significantly, losing 17% in precision in the pairwise measure, and 15% in terms of B³.
Evaluations
The clusters produced by HAC (without sense disambiguation) are coherent if all the paths in one relation take a particular sense.
Experiments
For the sense disambiguation model, we set the number of topics (senses) to 50.
Experiments
One sense per path (HAC): This system uses only hierarchical clustering to discover relations, skipping sense disambiguation.
Our Approach
2.1 Sense Disambiguation
Related Work
Selectional preferences discovery (Ritter et al., 2010; Seaghdha, 2010) can help path sense disambiguation; however, we show that using global features performs better than entity type features.
Related Work
And our sense disambiguation model is inspired by this work.
Related Work
Our approach employs generative models for path sense disambiguation, which achieves better performance than directly applying generative models to unsupervised relation discovery.
sense disambiguation is mentioned in 10 sentences in this paper.
Topics mentioned in this paper:
Pilehvar, Mohammad Taher and Jurgens, David and Navigli, Roberto
A Unified Semantic Representation
However, traditional forms of word sense disambiguation are difficult for short texts and single words because little or no contextual information is present to perform the disambiguation task.
A Unified Semantic Representation
alignment-based sense disambiguation that leverages the content of the paired item in order to disambiguate each element.
A Unified Semantic Representation
Leveraging the paired item enables our approach to disambiguate where traditional sense disambiguation methods cannot due to insufficient context.
Experiment 1: Textual Similarity
In addition, the system utilizes techniques such as Explicit Semantic Analysis (Gabrilovich and Markovitch, 2007) and makes use of resources such as Wiktionary and Wikipedia, a lexical substitution system based on supervised word sense disambiguation (Biemann, 2013), and a statistical machine translation system.
Experiment 2: Word Similarity
Our alignment-based sense disambiguation transforms the task of comparing individual words into that of calculating the similarity of the best-matching sense pair across the two words.
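A minimal sketch of the best-matching sense pair comparison this sentence describes; `senses_of` and `sense_sim` are hypothetical stand-ins for a sense inventory lookup and a sense-level similarity measure, not the paper's actual system.

# Illustrative sketch only: sim(w1, w2) = max over sense pairs of sense_sim(s1, s2).
def word_similarity(word1, word2, senses_of, sense_sim):
    """Return the similarity of the closest sense pair, or 0.0 if either word has no senses."""
    scores = [sense_sim(s1, s2) for s1 in senses_of(word1) for s2 in senses_of(word2)]
    return max(scores) if scores else 0.0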
Related Work
However, unlike our approach, their method does not perform sense disambiguation prior to building the representation and therefore potentially suffers from ambiguity.
sense disambiguation is mentioned in 9 sentences in this paper.
Topics mentioned in this paper:
Gormley, Matthew R. and Mitchell, Margaret and Van Durme, Benjamin and Dredze, Mark
Approaches
We optionally include additional variables that perform word sense disambiguation for each predicate.
Experiments
To compare to prior work (i.e., submissions to the CoNLL-2009 Shared Task), we also consider the joint task of semantic role labeling and predicate sense disambiguation.
Experiments
Table 4(b) contrasts our high-resource results for the task of SRL and sense disambiguation with the top systems in the CoNLL-2009 Shared Task, giving further insight into the performance of the simple information gain feature selection technique.
Experiments
Table 5: F1 for SRL approaches (without sense disambiguation) in matched and mismatched train/test settings for CoNLL 2005 span and 2008 head supervision.
sense disambiguation is mentioned in 7 sentences in this paper.
Topics mentioned in this paper:
Tanigaki, Koichi and Shiba, Mitsuteru and Munaka, Tatsuji and Sagisaka, Yoshinori
Abstract
This paper proposes a novel smoothing model with a combinatorial optimization scheme for all-words word sense disambiguation from untagged corpora.
Discussion
This means that the ranks of sense candidates for each word were frequently altered through iteration, which in turn means that new information not available earlier was successively delivered to the sense disambiguation of each word.
Discussion
From these results, we could confirm the expected sense-interdependency effect, whereby the sense disambiguation of one word affected that of other words.
Discussion
In our method, sense disambiguation of a word is guided by its nearby words’ extrapolation (smoothing).
Introduction
Word Sense Disambiguation (WSD) is a task to identify the intended sense of a word based on its context.
sense disambiguation is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Xiong, Deyi and Zhang, Min
Abstract
Our method is significantly different from previous word sense disambiguation reformulated for machine translation in that the latter neglects word senses in nature.
Abstract
Results show that the proposed model substantially outperforms not only the baseline but also the previous reformulated word sense disambiguation.
Experiments
5.5 Comparison to Word Sense Disambiguation
Introduction
Therefore a natural assumption is that word sense disambiguation (WSD) may contribute to statistical machine translation (SMT) by providing appropriate word senses for target translation selection with context features (Carpuat and Wu, 2005).
WSI-Based Broad-Coverage Sense Tagger
The biggest difference from word sense disambiguation is that WSI does not rely on a predefined sense inventory.
sense disambiguation is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Carpuat, Marine and Daume III, Hal and Henry, Katharine and Irvine, Ann and Jagarlamudi, Jagadeesh and Rudinger, Rachel
Introduction
(2002) observed, the domain of the text that a word occurs in is a useful signal for performing word sense disambiguation (e.g.
Introduction
We operate under the framework of phrase sense disambiguation (Carpuat and Wu, 2007), in which we take automatically aligned parallel data in an old domain to generate an initial old-domain sense inventory.
New Sense Indicators
Towards this end, first, we pose the problem as a phrase sense disambiguation (PSD) problem over the known sense inventory.
Related Work
While word senses have been studied extensively in lexical semantics, research has focused on word sense disambiguation, the task of disambiguating words in context given a predefined sense inventory (e.g., Agirre and Edmonds (2006)), and word sense induction, the task of learning sense inventories from text (e.g., Agirre and Soroa (2007)).
Related Work
Chan and Ng (2007) notably show that detecting changes in predominant sense as modeled by domain sense priors can improve sense disambiguation, even after performing adaptation using active learning.
sense disambiguation is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Mitra, Sunny and Mitra, Ritwik and Riedl, Martin and Biemann, Chris and Mukherjee, Animesh and Goyal, Pawan
Abstract
Our approach can be applied for lexicography, as well as for applications like word sense disambiguation or semantic search.
Introduction
Two of the fundamental components of a natural language communication are word sense discovery (Jones, 1986) and word sense disambiguation (Ide and Veronis, 1998).
Related work
Word sense disambiguation as well as word sense discovery have both remained key areas of research right from the very early initiatives in natural language processing research.
Related work
Ide and Veronis (1998) present a very concise survey of the history of ideas used in word sense disambiguation; for a recent survey of the state-of-the-art one can refer to (Navigli, 2009).
sense disambiguation is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Khapra, Mitesh M. and Joshi, Salil and Chatterjee, Arindam and Bhattacharyya, Pushpak
Abstract
Recent work on bilingual Word Sense Disambiguation (WSD) has shown that a resource deprived language (L1) can benefit from the annotation work done in a resource rich language (L2) via parameter projection.
Conclusion
We presented a bilingual bootstrapping algorithm for Word Sense Disambiguation which allows two resource deprived languages to mutually benefit
Parameter Projection
(2009) proposed that the various parameters essential for domain-specific Word Sense Disambiguation can be broadly classified into two categories:
Related Work
Bootstrapping for Word Sense Disambiguation was first discussed in (Yarowsky, 1995).
sense disambiguation is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
Kawahara, Daisuke and Peterson, Daniel W. and Palmer, Martha
Conclusion
As applications of the resulting semantic frames and verb classes, we plan to integrate them into syntactic parsing, semantic role labeling and verb sense disambiguation.
Introduction
Such verb classes have been used in many NLP applications that need to consider semantics in particular, such as word sense disambiguation (Dang, 2004), semantic parsing (Swier and Stevenson, 2005; Shi and Mihalcea, 2005) and discourse parsing (Subba and Di Eugenio, 2009).
Our Approach
For each predicate-argument structure of a verb, we couple the verb and an argument to make a unit for sense disambiguation.
Related Work
They conducted several evaluations including predominant class induction and token-level verb sense disambiguation, but did not evaluate multiple classes output by their models.
sense disambiguation is mentioned in 4 sentences in this paper.
Topics mentioned in this paper:
De Benedictis, Flavio and Faralli, Stefano and Navigli, Roberto
Conclusions
Beyond the immediate usability of its output and its effective use for domain Word Sense Disambiguation (Faralli and Navigli, 2012), we wish to show the benefit of GlossBoot in gloss-driven approaches to ontology learning (Navigli et al., 2011; Velardi et al., 2013) and semantic network enrichment (Navigli and Ponzetto, 2012).
Introduction
Interestingly, electronic glossaries have been shown to be key resources not only for humans, but also in Natural Language Processing (NLP) tasks such as Question Answering (Cui et al., 2007), Word Sense Disambiguation (Duan and Yates, 2010; Faralli and Navigli, 2012) and ontology learning (Navigli et al., 2011; Velardi et al., 2013).
Related Work
and Curran, 2008; McIntosh and Curran, 2009), learning semantic relations (Pantel and Pennacchiotti, 2006), extracting surface text patterns for open-domain question answering (Ravichandran and Hovy, 2002), semantic tagging (Huang and Riloff, 2010) and unsupervised Word Sense Disambiguation (Yarowsky, 1995).
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Pilehvar, Mohammad Taher and Navigli, Roberto
Experiments
The edges obtained from unambiguous entries are essentially sense disambiguated on both sides whereas those obtained from ambiguous terms are a result of our similarity-based disambiguation.
Introduction
Owing to its ability to bring together features like multilinguality and increasing coverage, over the past few years resource alignment has proven beneficial to a wide spectrum of tasks, such as Semantic Parsing (Shi and Mihalcea, 2005), Semantic Role Labeling (Palmer et al., 2010), and Word Sense Disambiguation (Navigli and Ponzetto, 2012).
Resource Alignment
PPR has been previously used in a wide variety of tasks such as definition similarity-based resource alignment (Niemann and Gurevych, 2011), textual semantic similarity (Hughes and Ramage, 2007; Pilehvar et al., 2013), Word Sense Disambiguation (Agirre and Soroa, 2009; Faralli and Navigli, 2012) and semantic text categorization (Navigli et al., 2011).
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Popat, Kashyap and A.R, Balamurali and Bhattacharyya, Pushpak and Haffari, Gholamreza
Clustering for Sentiment Analysis
by using automatic/manual sense disambiguation techniques.
Discussions
The sense disambiguation accuracy of the same would have lowered in a cross-domain setting.
Introduction
WordNets are primarily used to address the problem of word sense disambiguation.
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Lim, Lian Tze and Soon, Lay-Ki and Lim, Tek Yong and Tang, Enya Kong and Ranaivo-Malançon, Bali
Abstract
Current approaches for word sense disambiguation and translation selection typically require lexical resources or large bilingual corpora with rich information fields and annotations, which are often infeasible for under-resourced languages.
Introduction
Word sense disambiguation (WSD) is the task of assigning sense tags to ambiguous lexical items (LIs) in a text.
Introduction
It can also be viewed as a simplified version of the Cross-Lingual Lexical Substitution (Mihalcea et al., 2010) and Cross-Lingual Word Sense Disambiguation (Lefever and Hoste, 2010) tasks, as defined in SemEval-2010.
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Agirre, Eneko and Baldwin, Timothy and Martinez, David
Discussion
The results of the previous section show that the improvements in parsing results are small but significant, for all three word sense disambiguation strategies (gold-standard, 1ST and ASR).
Integrating Semantics into Parsing
This problem of identifying the correct sense of a word in context is known as word sense disambiguation (WSD: Agirre and Edmonds (2006)).
Introduction
use of the most frequent sense, and an unsupervised word sense disambiguation (WSD) system.
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Baldwin, Tyler and Li, Yunyao and Alexe, Bogdan and Stanoi, Ioana R.
Introduction
Several NLP tasks, such as word sense disambiguation, word sense induction, and named entity disambiguation, address this ambiguity problem to varying degrees.
Related Work
the well studied problems of named entity disambiguation (NED) and word sense disambiguation (WSD).
Related Work
Both named entity and word sense disambiguation are extensively studied, and surveys on each are available (Nadeau and Sekine, 2007; Navigli, 2009).
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Zhong, Zhi and Ng, Hwee Tou
Abstract
Previous research has conflicting conclusions on whether word sense disambiguation (WSD) systems can improve information retrieval (IR) performance.
Introduction
Word sense disambiguation (WSD) is the task of identifying the correct meaning of a word in context.
Word Sense Disambiguation
4.1 Word sense disambiguation system
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Shutova, Ekaterina
Automatic Metaphor Recognition
This idea originates from a similarity-based word sense disambiguation method developed by Karov and Edelman (1998).
Metaphor Annotation in Corpora
To reflect two distinct aspects of the phenomenon, metaphor annotation can be split into two stages: identifying metaphorical senses in text (akin to word sense disambiguation) and annotating source–target domain mappings underlying the production of metaphorical expressions.
Metaphor Annotation in Corpora
Such annotation can be viewed as a form of word sense disambiguation with an emphasis on metaphoricity.
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Chambers, Nathanael and Jurafsky, Daniel
Abstract
While pseudo-words originally evaluated word sense disambiguation , they are now commonly used to evaluate selectional preferences.
History of Pseudo-Word Disambiguation
Pseudo-words were introduced simultaneously by two papers studying statistical approaches to word sense disambiguation (WSD).
Introduction
One way to mitigate this problem is with pseudo-words, a method for automatically creating test corpora without human labeling, originally proposed for word sense disambiguation (Gale et al.,
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Davidov, Dmitry and Rappoport, Ari
Abstract
Our NR classification evaluation strictly follows the ACL SemEval-07 Task 4 datasets and protocol, obtaining an f-score of 70.6, as opposed to 64.8 of the best previous work that did not use the manually provided WordNet sense disambiguation tags.
Conclusion
In practical situations, it would not be feasible to provide a large amount of such sense disambiguation tags manually.
Introduction
Furthermore, usage of such resources frequently requires disambiguation and connection of the data to the resource (word sense disambiguation in the case of WordNet).
sense disambiguation is mentioned in 3 sentences in this paper.
Topics mentioned in this paper: