Index of papers in Proc. ACL 2014 that mention
  • "lexical semantic"
Jansen, Peter and Surdeanu, Mihai and Clark, Peter
Abstract
We propose a robust answer reranking model for non-factoid questions that integrates lexical semantics with discourse information, driven by two representations of discourse: a shallow representation centered around discourse markers, and a deep one based on Rhetorical Structure Theory.
Abstract
We experimentally demonstrate that the discourse structure of non-factoid answers provides information that is complementary to lexical semantic similarity between question and answer, improving performance up to 24% (relative) over a state-of-the-art model that exploits lexical semantic similarity alone.
Results table excerpt: CR + LS + DMM + DPM yields 39.32* (+24%) and 47.86* (+20%).
Lexical semantic features increase performance in all settings, but are far more useful on the open-domain YA corpus.
This disparity is likely due to the difficulty of assembling LS training data at an appropriate level for the biology corpus, in contrast with the relative abundance of large-scale open-domain lexical semantic resources.
For the YA corpus, where lexical semantics showed the most benefit, simply adding …
Experiments
Lexical Semantics: We trained two different RNNLMs for this work.
Introduction
We propose a novel answer reranking (AR) model that combines lexical semantics (LS) with discourse information, driven by two representations of discourse: a shallow representation centered around discourse markers and surface text information, and a deep one based on the Rhetorical Structure Theory (RST) discourse framework (Mann and Thompson, 1988).
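To make this combination concrete, here is a minimal, illustrative reranking sketch. It is not the authors' implementation: the toy vectors, marker list, and weights (`w_ls`, `w_dm`) are invented stand-ins, with the cosine score standing in for the RNNLM-derived lexical semantics and the marker check for the shallow discourse representation.

```python
import numpy as np

# Toy pretrained word vectors; the paper derives its vectors from RNNLMs.
VECTORS = {
    "cell": np.array([0.2, 0.8]),
    "divide": np.array([0.3, 0.7]),
    "because": np.array([0.9, 0.1]),
}
# A few explicit discourse markers for the shallow representation.
DISCOURSE_MARKERS = {"because", "therefore", "however", "for example"}

def avg_vector(tokens):
    """Average the vectors of in-vocabulary tokens (zero vector if none)."""
    vecs = [VECTORS[t] for t in tokens if t in VECTORS]
    return np.mean(vecs, axis=0) if vecs else np.zeros(2)

def ls_similarity(question, answer):
    """Lexical-semantic feature: cosine of the averaged Q and A vectors."""
    q, a = avg_vector(question), avg_vector(answer)
    denom = np.linalg.norm(q) * np.linalg.norm(a)
    return float(q @ a) / denom if denom else 0.0

def has_marker(answer):
    """Shallow discourse feature: does the answer use an explicit marker?"""
    return 1.0 if DISCOURSE_MARKERS & set(answer) else 0.0

def rerank(question, candidates, w_ls=0.7, w_dm=0.3):
    """Order candidate answers by a linear mix of the two feature families."""
    def score(c):
        return w_ls * ls_similarity(question, c) + w_dm * has_marker(c)
    return sorted(candidates, key=score, reverse=True)

print(rerank(["why", "do", "cells", "divide"],
             [["no", "idea"], ["they", "divide", "because", "cells", "grow"]]))
```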
Models and Features
4.3 Lexical Semantics Model
Models and Features
… (2013), we include lexical semantics in our reranking model.
Related Work
Inspired by this previous work and recent work in discourse parsing (Feng and Hirst, 2012), our work is the first to systematically explore structured discourse features driven by several discourse representations, combine discourse with lexical semantic models, and evaluate these representations on thousands of questions using both in-domain and cross-domain experiments.
"lexical semantic" is mentioned in 12 sentences in this paper.
Xu, Liheng and Liu, Kang and Lai, Siwei and Zhao, Jun
Abstract
The lexical semantic clue verifies whether a candidate term is related to the target product, while the contextual semantic clue serves as a soft pattern miner that finds candidates by exploiting the semantics of each word in context, so as to alleviate the data sparsity problem.
Abstract
We build a semantic similarity graph to encode lexical semantic clue, and employ a convolutional neural model to capture contextual semantic clue.
Experiments
LEX only uses lexical semantic clue.
Experiments
Furthermore, LEX achieves better recall than CONT and all syntax-based methods, which indicates that the lexical semantic clue does help mine more infrequent features, as expected.
Introduction
We call it the lexical semantic clue.
Introduction
Then, based on the assumption that terms that are more semantically similar to the seeds are more likely to be product features, a graph which measures semantic similarities between terms is built to capture lexical semantic clue.
The Proposed Method
Then, a semantic similarity graph is created to capture lexical semantic clue, and a Convolutional Neural Network (CNN) (Collobert et al., 2011) is trained in each bootstrapping iteration to encode contextual semantic clue.
The Proposed Method
3.2 Capturing Lexical Semantic Clue in a Semantic Similarity Graph
The Proposed Method
To capture the lexical semantic clue, each word is first converted into a word embedding: a continuous vector in which each dimension's value corresponds to a semantic or grammatical interpretation (Turian et al., 2010).
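As an illustration of the lexical-semantic-clue side only, the sketch below builds pairwise cosine similarities over toy term embeddings and scores each candidate by its strongest edge to a seed term. The embeddings, seed set, threshold, and scoring rule are hypothetical, not the paper's exact graph construction.

```python
import numpy as np

# Toy embeddings, one row per candidate term (values are made up).
TERMS = ["battery", "screen", "charger", "yesterday"]
EMB = np.array([[0.90, 0.10],
                [0.80, 0.20],
                [0.85, 0.15],
                [0.10, 0.90]])
SEEDS = {"battery"}  # terms already known to be product features

def cosine_matrix(m):
    """Pairwise cosine similarities between all term embeddings."""
    unit = m / np.linalg.norm(m, axis=1, keepdims=True)
    return unit @ unit.T

def lexical_clue_scores(threshold=0.95):
    """Score each term by its strongest similarity edge to any seed."""
    sim = cosine_matrix(EMB)
    seed_idx = [TERMS.index(s) for s in SEEDS]
    scores = {}
    for i, term in enumerate(TERMS):
        best = max(sim[i, j] for j in seed_idx)
        scores[term] = float(best) if best >= threshold else 0.0
    return scores

print(lexical_clue_scores())
# 'yesterday' falls below the threshold; the product terms stay connected.
```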
"lexical semantic" is mentioned in 17 sentences in this paper.
Bamman, David and Dyer, Chris and Smith, Noah A.
Conclusion
While our results use geographical information in learning low-dimensional representations, other contextual variables are straightforward to include as well; incorporating effects for time (such as time of day, month of year, and absolute year) may be a powerful tool for revealing periodic and historical influences on lexical semantics.
Introduction
In this paper, we introduce a method that extends vector-space lexical semantic models to learn representations of geographically situated language.
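The core idea, a shared base vector plus a region-specific offset, can be sketched with toy numbers as below. In the paper both components are learned jointly from geotagged text, so the values, the `REGION_OFFSET` table, and the choice of "wicked" (a well-known New England intensifier) are purely illustrative.

```python
import numpy as np

# Shared, geography-independent embedding for each word (toy values).
BASE = {"wicked": np.array([0.5, 0.5, 0.0])}
# Per-region deviations; near zero where usage matches the national norm.
REGION_OFFSET = {
    ("wicked", "MA"): np.array([0.0, -0.3, 0.4]),  # intensifier sense
    ("wicked", "CA"): np.array([0.0, 0.0, 0.0]),
}

def situated_vector(word, region):
    """Geographically situated representation: base plus regional offset."""
    return BASE[word] + REGION_OFFSET.get((word, region), 0.0)

ma, ca = situated_vector("wicked", "MA"), situated_vector("wicked", "CA")
cos = float(ma @ ca) / (np.linalg.norm(ma) * np.linalg.norm(ca))
print(f"'wicked' MA vs. CA cosine: {cos:.2f}")  # below 1.0: usage diverges
```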
Introduction
Vector-space models of lexical semantics have been a popular and effective approach to learning representations of word meaning (Lin, 1998; Turney and Pantel, 2010; Reisinger and Mooney, 2010; Socher et al., 2013; Mikolov et al., 2013, inter alia).
"lexical semantic" is mentioned in 3 sentences in this paper.
Baroni, Marco and Dinu, Georgiana and Kruszewski, Germán
Abstract
In this paper, we perform such an extensive evaluation, on a wide range of lexical semantics tasks and across many parameter settings.
Conclusion
Add to this that, beyond the standard lexical semantics challenges we tested here, predict models are currently being successfully applied in cutting-edge domains such as representing phrases (Mikolov et al., 2013c; Socher et al., 2012) or fusing language and vision in a common semantic space (Frome et al., 2013; Socher et al., 2013).
Introduction
In this paper, we overcome the comparison scarcity problem by providing a direct evaluation of count and predict DSMs across many parameter settings and on a large variety of mostly standard lexical semantics benchmarks.
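Evaluation on such benchmarks typically reduces to correlating a model's cosine similarities with human similarity ratings. A self-contained sketch with made-up word pairs, ratings, and vectors follows; real runs would use benchmark sets such as WordSim-353 or MEN.

```python
import numpy as np
from scipy.stats import spearmanr

# Made-up gold ratings in the style of WordSim/MEN benchmark pairs.
GOLD = [("car", "automobile", 9.5), ("cup", "mug", 8.7), ("car", "tree", 2.0)]
VECS = {"car": np.array([1.00, 0.10]), "automobile": np.array([0.90, 0.20]),
        "tree": np.array([0.10, 1.00]), "cup": np.array([0.60, 0.60]),
        "mug": np.array([0.55, 0.65])}

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

model = [cosine(VECS[w1], VECS[w2]) for w1, w2, _ in GOLD]
human = [rating for _, _, rating in GOLD]
rho, _ = spearmanr(model, human)
print(f"Spearman rho = {rho:.2f}")  # the benchmark score for this DSM
```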
"lexical semantic" is mentioned in 3 sentences in this paper.
Bengoetxea, Kepa and Agirre, Eneko and Nivre, Joakim and Zhang, Yue and Gojenola, Koldo
Introduction
This work presents a set of experiments to investigate the use of lexical semantic information in dependency parsing of English.
Introduction
Whether semantics improves parsing is an interesting research question for both parsing and lexical semantics.
Introduction
Broadly speaking, we can classify the methods for incorporating semantic information into parsers into two groups: systems using static lexical semantic repositories, such as WordNet or similar ontologies (Agirre et al., 2008; Agirre et al., 2011; Fujita et al., 2010), and systems using dynamic semantic clusters automatically acquired from corpora (Koo et al., 2008; Suzuki et al., 2009).
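As a sketch of the first, static-repository approach, one can attach a coarse WordNet semantic class (the lexicographer file, often used as a supersense) to each token and hand it to the parser as an extra feature. The helper below is illustrative, not any of the cited systems, and requires `nltk.download('wordnet')` first.

```python
from nltk.corpus import wordnet as wn  # run nltk.download('wordnet') once

def supersense(word, pos=wn.NOUN):
    """Coarse WordNet class of the word's first sense ('_' if unknown)."""
    synsets = wn.synsets(word, pos=pos)
    return synsets[0].lexname() if synsets else "_"

# Emit a token plus semantic-class column, e.g. for a CoNLL feature field.
for tok in ["economist", "growth", "grammar"]:
    print(tok, supersense(tok), sep="\t")  # e.g. economist -> noun.person
```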
"lexical semantic" is mentioned in 3 sentences in this paper.
Sun, Le and Han, Xianpei
Introduction
Therefore, we enrich each phrase node with features about its lexical pattern, its content information, and its lexical semantics:
Introduction
3) Lexical Semantics.
Introduction
If the node is a preterminal node, we capture its lexical semantics by adding features indicating its WordNet sense information.
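A sketch of that kind of sense-feature extraction for a preterminal node, using NLTK's WordNet interface; the `SENSE=`/`HYPER=` feature template is a hypothetical stand-in for the paper's actual encoding, and `nltk.download('wordnet')` is needed first.

```python
from nltk.corpus import wordnet as wn  # run nltk.download('wordnet') once

def wordnet_sense_features(word, pos=wn.NOUN, max_senses=2):
    """Indicator features naming the word's top senses and one hypernym each."""
    feats = []
    for syn in wn.synsets(word, pos=pos)[:max_senses]:
        feats.append(f"SENSE={syn.name()}")
        feats.extend(f"HYPER={h.name()}" for h in syn.hypernyms()[:1])
    return feats

print(wordnet_sense_features("company"))  # e.g. ['SENSE=company.n.01', ...]
```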
"lexical semantic" is mentioned in 3 sentences in this paper.