Gazetteer Induction 2.1 Induction by MN Clustering | The last word in the noun phrase is then extracted and becomes the hypernym of the entity described by the article. |
Gazetteer Induction 2.1 Induction by MN Clustering | For example, from the following defining sentence, it extracts “guitarist” as the hypernym for “Jimi Hendrix”. |
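The extraction step described above can be sketched as a small heuristic: find the noun phrase after the copula in the article's first (defining) sentence and take its last word as the hypernym head. This is a minimal illustrative sketch, not the paper's actual parser-based pipeline; the function name and the regex pattern are assumptions.

```python
import re

def extract_hypernym(first_sentence: str):
    """Heuristic sketch (hypothetical helper): capture the noun phrase
    following 'is/was a/an/the' and return its last word as the hypernym."""
    m = re.search(
        r"\b(?:is|was)\s+(?:an?|the)\s+([A-Za-z -]+?)(?:[,.]| who| that| which|$)",
        first_sentence,
    )
    if not m:
        return None
    # The last word of the matched noun phrase is taken as the hypernym head.
    return m.group(1).strip().split()[-1]

print(extract_hypernym("Jimi Hendrix was an American rock guitarist."))
# → guitarist
```

A real system would use POS tagging or parsing to find the noun phrase; the regex here only stands in for that step.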
Gazetteer Induction 2.1 Induction by MN Clustering | # instances: page titles processed, 550,832; articles found, 547,779 (found by redirection, 189,222); first sentences found, 545,577; hypernyms extracted, 482,599. |
Using Gazetteers as Features of NER | They handled “redirections” as well by following redirection links and extracting a hypernym from the article reached. |
Background | (2005) experimented with first-sense and hypernym features from HowNet and CiLin (both WordNets for Chinese) in a generative parse model applied to the Chinese Penn Treebank. |
Background | The combination of word sense and first-level hypernyms produced a significant improvement over their basic model. |
Integrating Semantics into Parsing | In WordNet 2.1, knife and scissors are sister synsets, both of which have TOOL as their 4th hypernym. |
Integrating Semantics into Parsing | Only by mapping them onto their 1st hypernym or higher would we be able to capture the semantic generalisation alluded to above. |
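The generalisation step described above (climbing n hypernym links so that sister concepts meet at a shared ancestor) can be sketched with a toy hierarchy. The chains below are simplified illustrations loosely modeled on the WordNet example in the text, not WordNet's actual graph; the dictionary entries and function name are assumptions.

```python
# Toy hypernym chains (illustrative only; real WordNet chains differ).
HYPERNYM = {
    "knife": "edge_tool",
    "scissors": "compound_lever",
    "edge_tool": "cutter",
    "compound_lever": "cutter",
    "cutter": "cutting_implement",
    "cutting_implement": "tool",
    "tool": "implement",
}

def nth_hypernym(word: str, n: int):
    """Climb n hypernym links; return None if the chain ends early."""
    for _ in range(n):
        word = HYPERNYM.get(word)
        if word is None:
            return None
    return word

# Mapping both words onto their 4th hypernym reveals the shared concept.
print(nth_hypernym("knife", 4), nth_hypernym("scissors", 4))
# → tool tool
```

Mapping every token onto, say, its 4th hypernym collapses distinct leaf synsets into coarser classes, which is the semantic generalisation the parsing features rely on.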
Comparison on applications | The maximum for WordNet is 0.8506, where the mean is 3, or the first hypernym synset. |
Comparison on applications | This suggests that the POS and Head are most important for representing text in Roget’s Thesaurus, while the first hypernym is most important for representing text using WordNet. |
Introduction | These hypernym relations were also applied to solving analogy questions. |