Index of papers in Proc. ACL 2013 that mention
  • lexical semantic
Yih, Wen-tau and Chang, Ming-Wei and Meek, Christopher and Pastusiak, Andrzej
Abstract
Unlike previous work, which primarily leverages syntactic analysis through dependency tree matching, we focus on improving the performance using models of lexical semantic resources.
Abstract
Experiments show that our systems can be consistently and significantly improved with rich lexical semantic information, regardless of the choice of learning algorithms.
Introduction
…nent, lexical semantics.
Introduction
We formulate answer selection as a semantic matching problem with a latent word-alignment structure as in (Chang et al., 2010) and conduct a series of experimental studies on leveraging recently proposed lexical semantic models.
Introduction
First, by incorporating the abundant information from a variety of lexical semantic models, the answer selection system can be enhanced substantially, regardless of the choice of learning algorithms and settings.
Problem Definition
[1] For example, Heilman and Smith (2010) emphasized that “The tree edit model, which does not use lexical semantics knowledge, produced the best result reported to date.”
Problem Definition
In this work, we focus our study on leveraging the low-level semantic cues from recently proposed lexical semantic models.
Related Work
Although lexical semantic information derived from WordNet has been used in some of these approaches, the research has mainly focused on modeling the mapping between the syntactic structures of questions and sentences, produced from syntactic analysis.
Related Work
The potential improvement from enhanced lexical semantic models seems to have been deliberately overlooked.[1]
“lexical semantic” is mentioned in 27 sentences in this paper.
Plank, Barbara and Moschitti, Alessandro
Conclusions and Future Work
We proposed syntactic tree kernels enriched by lexical semantic similarity to tackle the portability of a relation extractor to different domains.
Introduction
In the empirical evaluation on Automatic Content Extraction (ACE) data, we evaluate the impact of convolution tree kernels embedding lexical semantic similarities.
Results
The same also holds for the lexical semantic kernel based on LSA (P_LSA), though only for two out of three domains.
Results
As the two semantically enriched kernels, PET_LSA and PET_WC, seem to capture different information, we use composite kernels (rows 10-11): the baseline kernel (PET) summed with the lexical semantic kernels.
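The excerpt above describes composite kernels built by summing the baseline tree kernel with the lexical semantic kernels. Below is a minimal sketch of that composition idea only, not the paper's implementation: pet_kernel and lsa_kernel are hypothetical toy stand-ins for the actual tree kernel and LSA-based similarity.

```python
# Minimal sketch of kernel composition by summation (hedged: toy stand-in
# kernels, not Plank & Moschitti's actual PET tree kernel or LSA similarity).

def composite_kernel(x, y, kernels, weights=None):
    """Return the (optionally weighted) sum of several kernel functions.

    A non-negative weighted sum of valid kernels is itself a valid kernel,
    which is what makes this kind of composition safe to plug into an SVM.
    """
    weights = weights or [1.0] * len(kernels)
    return sum(w * k(x, y) for w, k in zip(weights, kernels))


# Toy stand-in kernels over sets of tokens, for illustration only.
def pet_kernel(x, y):
    return float(len(set(x) & set(y)))      # pretend structural overlap

def lsa_kernel(x, y):
    return float(len(set(x) | set(y)) > 0)  # pretend semantic similarity

K = composite_kernel({"a", "b"}, {"b", "c"}, [pet_kernel, lsa_kernel])
print(K)  # 2.0 for this toy example
```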
“lexical semantic” is mentioned in 4 sentences in this paper.
Wolfe, Travis and Van Durme, Benjamin and Dredze, Mark and Andrews, Nicholas and Beller, Charley and Callison-Burch, Chris and DeYoung, Jay and Snyder, Justin and Weese, Jonathan and Xu, Tan and Yao, Xuchen
Conclusion
It builds on the development of lexical semantic resources and provides a platform for learning to utilize these resources.
Introduction
As opposed to Roth and Frank, PARMA is designed as a trainable platform for the incorporation of the sort of lexical semantic resources used in the related areas of Recognizing Textual Entailment (RTE) and Question Answering (QA).
PARMA
The focus of PARMA is the integration of a diverse range of features based on existing lexical semantic resources.
“lexical semantic” is mentioned in 3 sentences in this paper.