Index of papers in Proc. ACL 2008 that mention
  • synset
Fang, Hui
Experiments
Experiment results show that the similarity function based on synset definitions is most effective.
Experiments
First, the similarity function based on synset definitions is the most effective one.
Experiments
As shown in Table 2, the similarity function based on synset definitions, i.e., sdef, is most effective.
Introduction
We find that the most effective way to utilize the information from WordNet is to compute the term similarity based on the overlap of synset definitions.
Term Similarity based on Lexical Resources
Every node in the WordNet is a synset, i.e., a set of synonyms.
Term Similarity based on Lexical Resources
The definition of a synset, which is referred to as gloss, is also provided.
Term Similarity based on Lexical Resources
For a query term, all the synsets in which the term appears can be returned, along with the definition of the synsets.
synset is mentioned in 12 sentences in this paper.
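The excerpts above describe a term similarity computed from the overlap of synset definitions (glosses). Below is a minimal sketch of such a gloss-overlap measure using NLTK's WordNet interface; it is an illustration rather than the paper's exact sdef function, and the gloss pooling, tokenization, and Jaccard scoring are assumptions.

```python
# Minimal sketch of a gloss-overlap term similarity in the spirit of the
# sdef measure described above: pool the definitions (glosses) of all
# synsets containing each term and score two terms by word overlap.
# Requires the WordNet data: nltk.download('wordnet')
from nltk.corpus import wordnet as wn


def gloss_words(term):
    """Union of lowercased words from the glosses of every synset of `term`."""
    words = set()
    for synset in wn.synsets(term):
        words.update(w.lower() for w in synset.definition().split())
    return words


def gloss_overlap_similarity(t1, t2):
    """Jaccard overlap of the pooled gloss vocabularies of two terms."""
    g1, g2 = gloss_words(t1), gloss_words(t2)
    if not g1 or not g2:
        return 0.0
    return len(g1 & g2) / len(g1 | g2)


if __name__ == "__main__":
    print(gloss_overlap_similarity("car", "automobile"))
    print(gloss_overlap_similarity("car", "banana"))
```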
Agirre, Eneko and Baldwin, Timothy and Martinez, David
Discussion
... between the two extremes of full synsets and SFs.
Experimental setting
As mentioned above, words in WordNet are organised into sets of synonyms, called synsets.
Experimental setting
Each synset in turn belongs to a unique semantic file (SF).
Experimental setting
We experiment with both full synsets and SFs as instances of fine-grained and coarse-grained semantic representation, respectively.
Integrating Semantics into Parsing
Our choice for this work was the WordNet 2.1 lexical database, in which synonyms are grouped into synsets, which are then linked via an ISA hierarchy.
Integrating Semantics into Parsing
With any lexical semantic resource, we have to be careful to choose the appropriate level of granularity for a given task: if we limit ourselves to synsets we will not be able to capture broader generalisations, such as the one between knife and scissors;1 on the other hand by grouping words related at a higher level in the hierarchy we could find that we make overly coarse groupings (e.g.
Integrating Semantics into Parsing
1 In WordNet 2.1, knife and scissors are sister synsets, both of which have TOOL as their 4th hypernym.
Results
In this case, synsets slightly outperform SF.
synset is mentioned in 10 sentences in this paper.
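The excerpts above contrast two levels of granularity: full synsets versus the coarser semantic files (SFs) that each synset belongs to. The sketch below shows both for a word using NLTK's WordNet interface, where lexname() returns the lexicographer file corresponding to the SF; note that NLTK bundles WordNet 3.0, so the knife/scissors structure cited for WordNet 2.1 may differ.

```python
# Sketch of the two granularities discussed above: the fine-grained synset
# of a word versus the coarse-grained semantic file (lexicographer file,
# e.g. noun.artifact) it belongs to. NLTK ships WordNet 3.0, not the 2.1
# release used in the paper, so hypernym structure may differ.
from nltk.corpus import wordnet as wn

for word in ("knife", "scissors"):
    synset = wn.synsets(word, pos=wn.NOUN)[0]        # fine-grained sense
    print(word, "->", synset.name(), "| SF:", synset.lexname())
    # Walk up the ISA (hypernym) hierarchy to see coarser groupings.
    print("  hypernym path:", [h.name() for h in synset.hypernym_paths()[0]])
```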
Kennedy, Alistair and Szpakowicz, Stan
Comparison on applications
We consider 10 measures, noted in the table as J&C (Jiang and Conrath, 1997), Resnik (Resnik, 1995), Lin (Lin, 1998), W&P (Wu and Palmer, 1994), L&C (Leacock and Chodorow, 1998), H&SO (Hirst and St-Onge, 1998), Path (counts edges between synsets), Lesk (Banerjee and Pedersen, 2002), and finally Vector and Vector Pair (Patwardhan, 2003).
Comparison on applications
We mean a concept in Roget’s to be either a Class, Section, ..., Semicolon Group, while a concept in WordNet is any synset .
Comparison on applications
Likewise, in WordNet if c were a synset, then each Ci would be a hyponym synset of c.
synset is mentioned in 6 sentences in this paper.
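Several of the WordNet-based measures listed above are available off the shelf in NLTK; the sketch below computes the Path, W&P, and L&C scores for one synset pair. These are NLTK's implementations, not necessarily the exact configurations compared in the paper, and the information-content measures (Resnik, Lin, J&C) would additionally require an IC corpus.

```python
# Sketch of three of the WordNet-based measures listed above, using NLTK's
# built-in implementations on a single synset pair.
from nltk.corpus import wordnet as wn

dog, cat = wn.synset("dog.n.01"), wn.synset("cat.n.01")

print("Path:", dog.path_similarity(cat))   # counts edges between synsets
print("W&P: ", dog.wup_similarity(cat))    # Wu and Palmer (1994)
print("L&C: ", dog.lch_similarity(cat))    # Leacock and Chodorow (1998)
```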