Index of papers in Proc. ACL that mention
  • lexical semantic
Agirre, Eneko and Baldwin, Timothy and Martinez, David
Background
This research is focused on applying lexical semantics in parsing and PP attachment tasks.
Background
Lexical semantics in parsing
Background
Other notable examples of the successful incorporation of lexical semantics into parsing, not through word sense information but indirectly via selectional preferences, are Dowding et al.
Conclusions
This simple method allows us to incorporate lexical semantic information into the parser, without having to reimplement a full statistical parser.
Conclusions
The results are highly significant in demonstrating that a simplistic approach to incorporating lexical semantics into a parser significantly improves parser performance.
Discussion
The fact that the improvement is larger for PP attachment than for full parsing is suggestive of PP attachment being a parsing subtask where lexical semantic information is particularly important, supporting the findings of Stetina and Nagao (1997) over a standalone PP attachment task.
Discussion
Our hope is that this paper serves as the bridgehead for a new line of research into the impact of lexical semantics on parsing.
Integrating Semantics into Parsing
With any lexical semantic resource, we have to be careful to choose the appropriate level of granularity for a given task: if we limit ourselves to synsets we will not be able to capture broader generalisations, such as the one between knife and scissors; on the other hand, by grouping words related at a higher level in the hierarchy we could find that we make overly coarse groupings (e.g.
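The granularity trade-off described above can be made concrete with a toy hypernym hierarchy. This is an illustrative sketch only, not the authors' code; the words and hypernym chains below are hand-picked for the example.

```python
# Toy hypernym chains (root-ward, left to right); invented for illustration.
TOY_HYPERNYMS = {
    "knife":    ["knife", "edge_tool", "tool", "instrumentality", "artifact", "entity"],
    "scissors": ["scissors", "edge_tool", "tool", "instrumentality", "artifact", "entity"],
    "hammer":   ["hammer", "tool", "instrumentality", "artifact", "entity"],
    "sonata":   ["sonata", "musical_composition", "music", "entity"],
}

def semantic_class(word, depth):
    """Back off `depth` steps up the hypernym chain (clamped at the root)."""
    chain = TOY_HYPERNYMS[word]
    return chain[min(depth, len(chain) - 1)]

# At the synset level, knife and scissors remain distinct ...
assert semantic_class("knife", 0) != semantic_class("scissors", 0)
# ... one step up they share a class (edge_tool) ...
assert semantic_class("knife", 1) == semantic_class("scissors", 1)
# ... but backing off too far lumps unrelated words under "entity".
assert semantic_class("knife", 5) == semantic_class("sonata", 3)
```

The sweet spot sits between the synset level (too fine to generalise) and the top of the hierarchy (too coarse to discriminate).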
Introduction
Our approach to exploring the impact of lexical semantics on parsing performance is to take two state-of-the-art statistical treebank parsers and pre-process the inputs variously.
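The input pre-processing idea can be sketched as simple token substitution: replace content words with coarse semantic class tags before parsing, so the parser's lexical statistics generalise across related words. This is an assumed minimal sketch, not the authors' pipeline; the `SUPERSENSE` lookup table is hypothetical.

```python
# Hypothetical word-to-semantic-class lookup (stands in for WordNet senses).
SUPERSENSE = {
    "knife": "noun.artifact",
    "scissors": "noun.artifact",
    "cut": "verb.contact",
    "bread": "noun.food",
}

def preprocess(tokens, use_classes=True):
    """Substitute each known token with its semantic class tag."""
    if not use_classes:
        return list(tokens)
    return [SUPERSENSE.get(tok, tok) for tok in tokens]

print(preprocess(["she", "cut", "the", "bread", "with", "a", "knife"]))
```

A parser trained on such pre-processed input would treat "knife" and "scissors" identically, without any change to the parser itself.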
Introduction
Given our simple procedure for incorporating lexical semantics into the parsing process, our hope is that this research will open the door to further gains using more sophisticated parsing models and richer semantic options.
Results
The performance gain obtained here is larger than in parsing, which is in accordance with the findings of Stetina and Nagao that lexical semantics has a considerable effect on PP attachment.
lexical semantic is mentioned in 11 sentences in this paper.
Das, Dipanjan and Smith, Noah A.
Abstract
The model cleanly incorporates both syntax and lexical semantics using quasi-synchronous dependency grammars (Smith and Eisner, 2006).
Conclusion
In this paper, we have presented a probabilistic model of paraphrase incorporating syntax, lexical semantics, and hidden loose alignments between two sentences’ trees.
Experimental Evaluation
We removed the lexical semantics component of the QG, and disallowed the syntactic configurations one by one, to investigate which components of the model contribute to system performance.
Experimental Evaluation
The lexical semantics component is critical, as seen by the drop in accuracy from the table (without this component, pQ behaves almost like the “all p” baseline).
Introduction
Because dependency syntax is still only a crude approximation to semantic structure, we augment the model with a lexical semantics component, based on WordNet (Miller, 1995), that models how words are probabilistically altered in generating a paraphrase.
Introduction
This combination of loose syntax and lexical semantics is similar to the “Jeopardy” model of Wang et al.
QG for Paraphrase Modeling
(2007) in treating the correspondences as latent variables, and in using a WordNet-based lexical semantics model to generate the target words.
QG for Paraphrase Modeling
5 We use log-linear models three times: for the configuration, the lexical semantics class, and the word.
QG for Paraphrase Modeling
WordNet relation(s) The model next chooses a lexical semantics relation between the aligned source word and the yet-to-be-chosen word ti (line 12).
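The two-step generative move described above — first sample a WordNet-style relation for the source word, then sample a target word licensed by that relation — can be sketched as follows. This is a toy illustration, not the paper's model; the relation inventory, probabilities, and word lists are invented.

```python
import random

# Invented relation inventory and word lists for one source word.
RELATED = {
    "buy": {"identity": ["buy"], "synonym": ["purchase"], "hypernym": ["acquire", "get"]},
}
REL_PROBS = {"identity": 0.5, "synonym": 0.3, "hypernym": 0.2}

def generate_target(source, rng):
    """Sample a lexical-semantics relation, then a target word under it."""
    relation = rng.choices(list(REL_PROBS), weights=list(REL_PROBS.values()))[0]
    word = rng.choice(RELATED[source][relation])
    return relation, word

rng = random.Random(0)
print(generate_target("buy", rng))
```

In the actual model these two choices are parameterised by log-linear distributions rather than fixed tables.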
lexical semantic is mentioned in 11 sentences in this paper.
Yih, Wen-tau and Chang, Ming-Wei and Meek, Christopher and Pastusiak, Andrzej
Abstract
Unlike previous work, which primarily leverages syntactic analysis through dependency tree matching, we focus on improving the performance using models of lexical semantic resources.
Abstract
Experiments show that our systems can be consistently and significantly improved with rich lexical semantic information, regardless of the choice of learning algorithms.
Introduction
nent, lexical semantics.
Introduction
We formulate answer selection as a semantic matching problem with a latent word-alignment structure as in (Chang et al., 2010) and conduct a series of experimental studies on leveraging recently proposed lexical semantic models.
Introduction
First, by incorporating the abundant information from a variety of lexical semantic models, the answer selection system can be enhanced substantially, regardless of the choice of learning algorithms and settings.
Problem Definition
For example, Heilman and Smith (2010) emphasized that “The tree edit model, which does not use lexical semantics knowledge, produced the best result reported to date.”
Problem Definition
In this work, we focus our study on leveraging the low-level semantic cues from recently proposed lexical semantic models.
Related Work
Although lexical semantic information derived from WordNet has been used in some of these approaches, the research has mainly focused on modeling the mapping between the syntactic structures of questions and sentences, produced from syntactic analysis.
Related Work
The potential improvement from enhanced lexical semantic models seems to have been deliberately overlooked.
lexical semantic is mentioned in 27 sentences in this paper.
Jansen, Peter and Surdeanu, Mihai and Clark, Peter
Abstract
We propose a robust answer reranking model for non-factoid questions that integrates lexical semantics with discourse information, driven by two representations of discourse: a shallow representation centered around discourse markers, and a deep one based on Rhetorical Structure Theory.
Abstract
We experimentally demonstrate that the discourse structure of non-factoid answers provides information that is complementary to lexical semantic similarity between question and answer, improving performance up to 24% (relative) over a state-of-the-art model that exploits lexical semantic similarity alone.
CR + LS + DMM + DPM 39.32* +24% 47.86* +20%
Lexical semantic features increase performance for all settings, but demonstrate far more utility to the open-domain YA corpus.
CR + LS + DMM + DPM 39.32* +24% 47.86* +20%
This disparity is likely due to the difficulty in assembling LS training data at an appropriate level for the biology corpus, contrasted with the relative abundance of large scale open-domain lexical semantic resources.
CR + LS + DMM + DPM 39.32* +24% 47.86* +20%
For the YA corpus, where lexical semantics showed the most benefit, simply adding
Experiments
Lexical Semantics: We trained two different RNNLMs for this work.
Introduction
We propose a novel answer reranking (AR) model that combines lexical semantics (LS) with discourse information, driven by two representations of discourse: a shallow representation centered around discourse markers and surface text information, and a deep one based on the Rhetorical Structure Theory (RST) discourse framework (Mann and Thompson, 1988).
Models and Features
4.3 Lexical Semantics Model
Models and Features
(2013), we include lexical semantics in our reranking model.
Related Work
Inspired by this previous work and recent work in discourse parsing (Feng and Hirst, 2012), our work is the first to systematically explore structured discourse features driven by several discourse representations, combine discourse with lexical semantic models, and evaluate these representations on thousands of questions using both in-domain and cross-domain experiments.
lexical semantic is mentioned in 12 sentences in this paper.
Xu, Liheng and Liu, Kang and Lai, Siwei and Zhao, Jun
Abstract
Lexical semantic clue verifies whether a candidate term is related to the target product, and contextual semantic clue serves as a soft pattern miner to find candidates, which exploits semantics of each word in context so as to alleviate the data sparsity problem.
Abstract
We build a semantic similarity graph to encode lexical semantic clue, and employ a convolutional neural model to capture contextual semantic clue.
Experiments
LEX only uses lexical semantic clue.
Experiments
Furthermore, LEX gets better recall than CONT and all syntax-based methods, which indicates that lexical semantic clue does aid to mine more infrequent features as expected.
Introduction
We call it lexical semantic clue.
Introduction
Then, based on the assumption that terms that are more semantically similar to the seeds are more likely to be product features, a graph which measures semantic similarities between terms is built to capture lexical semantic clue.
The Proposed Method
Then, a semantic similarity graph is created to capture lexical semantic clue, and a Convolutional Neural Network (CNN) (Collobert et al., 2011) is trained in each bootstrapping iteration to encode contextual semantic clue.
The Proposed Method
3.2 Capturing Lexical Semantic Clue in a Semantic Similarity Graph
The Proposed Method
To capture lexical semantic clue, each word is first converted into a word embedding, which is a continuous vector with each dimension’s value corresponding to a semantic or grammatical interpretation (Turian et al., 2010).
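The semantic similarity graph idea can be sketched directly: nodes are candidate terms, and an edge connects two terms whose embedding cosine similarity exceeds a threshold. This is a hedged illustration, not the paper's implementation; the tiny 3-dimensional embeddings below are made up.

```python
import math

# Invented toy embeddings for three candidate terms.
EMB = {
    "screen":  [0.9, 0.1, 0.0],
    "display": [0.8, 0.2, 0.1],
    "battery": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def similarity_graph(emb, threshold=0.8):
    """Return undirected edges between sufficiently similar terms."""
    words = sorted(emb)
    return {(w1, w2)
            for i, w1 in enumerate(words) for w2 in words[i + 1:]
            if cosine(emb[w1], emb[w2]) >= threshold}

print(similarity_graph(EMB))  # → {('display', 'screen')}
```

Terms close to the seed set in this graph are then treated as likely product features, which is how the graph encodes the lexical semantic clue.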
lexical semantic is mentioned in 17 sentences in this paper.
Bernhard, Delphine and Gurevych, Iryna
Abstract
In this paper, we propose to use as a parallel training dataset the definitions and glosses provided for the same term by different lexical semantic resources.
Abstract
We compare monolingual translation models built from lexical semantic resources with two other kinds of datasets: manually-tagged question reformulations and question-answer pairs.
Conclusion and Future Work
We have presented three datasets for training statistical word translation models for use in answer finding: question-answer pairs, manually-tagged question reformulations and glosses for the same term extracted from several lexical semantic resources.
Conclusion and Future Work
question-answer pairs, and external knowledge, as contained in lexical semantic resources.
Introduction
We use the definitions and glosses provided for the same term by different lexical semantic resources to automatically train the translation models.
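The construction of a parallel training set from glosses can be sketched as pairing up definitions of the same term across resources. This is an assumed minimal sketch, not the authors' pipeline, and the gloss data below is invented.

```python
# Invented gloss dictionaries standing in for two lexical semantic resources.
WIKTIONARY = {"bank": "an institution that keeps money for clients"}
WORDNET = {"bank": "a financial institution that accepts deposits",
           "tree": "a tall perennial woody plant"}

def build_parallel_corpus(*resources):
    """Pair glosses defined for the same term across all resource pairs."""
    pairs = []
    for i, src in enumerate(resources):
        for tgt in resources[i + 1:]:
            for term in sorted(src.keys() & tgt.keys()):
                pairs.append((src[term], tgt[term]))
    return pairs

print(build_parallel_corpus(WIKTIONARY, WORDNET))
```

Each gloss pair is a near-paraphrase of the same concept, so the pairs can feed a standard statistical word translation model in place of bilingual sentence pairs.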
Introduction
This approach has been very recently made possible by the emergence of new kinds of lexical semantic and encyclopedic resources such as Wikipedia and Wiktionary.
Parallel Datasets
3.2 Lexical Semantic Resources
Parallel Datasets
Glosses and definitions for the same lexeme in different lexical semantic and encyclopedic resources can actually be considered as near-paraphrases, since they define the same terms and hence have
Related Work
We henceforth propose a new approach for building monolingual translation models relying on domain-independent lexical semantic resources.
Related Work
Knowledge-based measures rely on lexical semantic resources such as WordNet and comprise path length based measures (Rada et al., 1989) and concept vector based measures (Qiu and Frei, 1993).
lexical semantic is mentioned in 10 sentences in this paper.
Chang, Kai-min K. and Cherkassky, Vladimir L. and Mitchell, Tom M. and Just, Marcel Adam
Brain Imaging Experiments on Adjective-Noun Comprehension
4.1 Lexical Semantic Representation
Brain Imaging Experiments on Adjective-Noun Comprehension
The lexical semantic representation for strong and dog.
Introduction
How humans represent meanings of individual words and how lexical semantic knowledge is combined to form complex concepts are issues fundamental to the study of human knowledge.
Introduction
Given these early successes in using fMRI to discriminate categorical information and to model lexical semantic representations of individual words, it is interesting to ask whether a similar approach can be used to study the representation of adjective-noun phrases.
Introduction
In section 4, we discuss a vector-based approach to modeling the lexical semantic knowledge using word occurrence measures in a text corpus.
lexical semantic is mentioned in 5 sentences in this paper.
Reiter, Nils and Frank, Anette
Introduction
Lexical semantic factors, such as the semantic type of the clause predicate (5.c,e), or “well-established” kinds (5.g) may favour a generic reading, but such lexical factors are difficult to capture in a rule-based setting.
Introduction
In the following, we will structure this feature space along two dimensions, distinguishing NP- and sentence-level factors as well as syntactic and semantic (including lexical semantic) factors.
Introduction
Semantic features include semantic features abstracted from syntax, such as tense and aspect or type of modification, but also lexical semantic features such as word sense classes, sense granularity or verbal predicates.
lexical semantic is mentioned in 5 sentences in this paper.
Plank, Barbara and Moschitti, Alessandro
Conclusions and Future Work
We proposed syntactic tree kernels enriched by lexical semantic similarity to tackle the portability of a relation extractor to different domains.
Introduction
In the empirical evaluation on Automatic Content Extraction (ACE) data, we evaluate the impact of convolution tree kernels embedding lexical semantic similarities.
Results
The same also holds for the lexical semantic kernel based on LSA (P_LSA); however, only for two out of three domains.
Results
As the two semantically enriched kernels, PET_LSA and PET_WC, seem to capture different information, we use composite kernels (rows 10-11): the baseline kernel (PET) summed with the lexical semantic kernels.
lexical semantic is mentioned in 4 sentences in this paper.
Mohler, Michael and Bunescu, Razvan and Mihalcea, Rada
Abstract
We combine several graph alignment features with lexical semantic similarity measures using machine learning techniques and show that the student answers can be more accurately graded than if the semantic measures were used in isolation.
Answer Grading System
3.3 Lexical Semantic Similarity
Answer Grading System
We combine the alignment scores φG(Ai, As) with the scores φB(Ai, As) from the lexical semantic similarity measures into a single feature vector φ(Ai, As) = [φG(Ai, As) | φB(Ai, As)].
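The feature combination described above is plain vector concatenation; a minimal sketch with made-up scores (graph-alignment features followed by lexical-similarity features for one answer pair):

```python
def combine(phi_g, phi_b):
    """Concatenate alignment and lexical-similarity features into one vector."""
    return list(phi_g) + list(phi_b)

phi_g = [0.42, 0.17]        # hypothetical graph alignment scores
phi_b = [0.80, 0.65, 0.71]  # hypothetical WordNet/LSA similarity scores
print(combine(phi_g, phi_b))  # → [0.42, 0.17, 0.8, 0.65, 0.71]
```

The combined vector is then handed to the machine learning component, which weighs alignment and similarity evidence jointly.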
lexical semantic is mentioned in 3 sentences in this paper.
Wolfe, Travis and Van Durme, Benjamin and Dredze, Mark and Andrews, Nicholas and Beller, Charley and Callison-Burch, Chris and DeYoung, Jay and Snyder, Justin and Weese, Jonathan and Xu, Tan and Yao, Xuchen
Conclusion
It builds on the development of lexical semantic resources and provides a platform for learning to utilize these resources.
Introduction
As opposed to Roth and Frank, PARMA is designed as a trainable platform for the incorporation of the sort of lexical semantic resources used in the related areas of Recognizing Textual Entailment (RTE) and Question Answering (QA).
PARMA
The focus of PARMA is the integration of a diverse range of features based on existing lexical semantic resources.
lexical semantic is mentioned in 3 sentences in this paper.
Bamman, David and Dyer, Chris and Smith, Noah A.
Conclusion
While our results use geographical information in learning low-dimensional representations, other contextual variables are straightforward to include as well; incorporating effects for time — such as time of day, month of year and absolute year — may be a powerful tool for revealing periodic and historical influences on lexical semantics.
Introduction
In this paper, we introduce a method that extends vector-space lexical semantic models to learn representations of geographically situated language.
Introduction
Vector-space models of lexical semantics have been a popular and effective approach to learning representations of word meaning (Lin, 1998; Turney and Pantel, 2010; Reisinger and Mooney, 2010; Socher et al., 2013; Mikolov et al., 2013, inter alia).
lexical semantic is mentioned in 3 sentences in this paper.
Baroni, Marco and Dinu, Georgiana and Kruszewski, Germán
Abstract
In this paper, we perform such an extensive evaluation, on a wide range of lexical semantics tasks and across many parameter settings.
Conclusion
Add to this that, beyond the standard lexical semantics challenges we tested here, predict models are currently being successfully applied in cutting-edge domains such as representing phrases (Mikolov et al., 2013c; Socher et al., 2012) or fusing language and vision in a common semantic space (Frome et al., 2013; Socher et al., 2013).
Introduction
In this paper, we overcome the comparison scarcity problem by providing a direct evaluation of count and predict DSMs across many parameter settings and on a large variety of mostly standard lexical semantics benchmarks.
lexical semantic is mentioned in 3 sentences in this paper.
Bengoetxea, Kepa and Agirre, Eneko and Nivre, Joakim and Zhang, Yue and Gojenola, Koldo
Introduction
This work presents a set of experiments to investigate the use of lexical semantic information in dependency parsing of English.
Introduction
Whether semantics improve parsing is one interesting research topic both on parsing and lexical semantics.
Introduction
Broadly speaking, we can classify the methods to incorporate semantic information into parsers in two: systems using static lexical semantic repositories, such as WordNet or similar ontologies (Agirre et al., 2008; Agirre et al., 2011; Fujita et al., 2010), and systems using dynamic semantic clusters automatically acquired from corpora (Koo et al., 2008; Suzuki et al., 2009).
lexical semantic is mentioned in 3 sentences in this paper.
Sun, Le and Han, Xianpei
Introduction
Therefore, we enrich each phrase node with features about its lexical pattern, its content information, and its lexical semantics:
Introduction
3) Lexical Semantics.
Introduction
If the node is a preterminal node, we capture its lexical semantics by adding features indicating its WordNet sense information.
lexical semantic is mentioned in 3 sentences in this paper.