Index of papers in Proc. ACL 2012 that mention
  • word representations
Huang, Eric and Socher, Richard and Manning, Christopher and Ng, Andrew
Abstract
Unsupervised word representations are very useful in NLP tasks both as inputs to learning algorithms and as extra word features in NLP systems.
Conclusion
We presented a new neural network architecture that learns more semantic word representations by using both local and global context in learning.
Experiments
In order to show that our model learns more semantic word representations with global context, we give the nearest neighbors of our single-prototype model versus C&W’s, which only uses local context.
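As an illustration of how such a nearest-neighbor comparison can be produced, here is a minimal sketch that ranks words by cosine similarity in an embedding matrix. The matrix L, its dimensions, and the random initialization are placeholders, not the authors' trained embeddings.

```python
import numpy as np

# Toy setup: a random embedding matrix stands in for trained word vectors.
rng = np.random.default_rng(0)
V, n = 1000, 50                        # vocabulary size, embedding dimension
L = rng.normal(size=(V, n))            # one row per word

def nearest_neighbors(word_id, topn=5):
    """Return the topn words closest to word_id by cosine similarity."""
    unit = L / np.linalg.norm(L, axis=1, keepdims=True)
    sims = unit @ unit[word_id]        # cosine similarity to every word
    order = np.argsort(-sims)          # most similar first
    return [int(i) for i in order if i != word_id][:topn]

print(nearest_neighbors(42))
```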
Global Context-Aware Neural Language Model
Our model jointly learns word representations while learning to discriminate the next word given a short word sequence (local context) and the document (global context) in which the word sequence occurs.
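A minimal sketch of this idea, not the authors' exact architecture: score a local window together with a document-level context vector, and train with a C&W-style margin ranking loss against a window whose last word is corrupted. The dimensions, the single hidden layer, and the uniform document average (the paper weights words when averaging) are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, n, h, window = 1000, 50, 100, 5    # vocab, embedding dim, hidden units, window size

L = rng.normal(scale=0.1, size=(V, n))                 # embedding matrix: one vector per word
W1 = rng.normal(scale=0.1, size=(h, window * n + n))   # maps [local; global] -> hidden
b1 = np.zeros(h)
w2 = rng.normal(scale=0.1, size=h)                     # hidden -> scalar score

def score(window_ids, doc_ids):
    """Score a word window given its document.

    Local context  = concatenated embeddings of the window words.
    Global context = average of the document's word embeddings
    (uniform here; the paper uses a weighted average).
    """
    local = L[window_ids].reshape(-1)
    global_ctx = L[doc_ids].mean(axis=0)
    hidden = np.tanh(W1 @ np.concatenate([local, global_ctx]) + b1)
    return float(w2 @ hidden)

# Margin ranking loss: the true window should outscore a corrupted one
# (last word replaced by a random word) by a margin of 1.
doc = rng.integers(V, size=200)
true_win = doc[:window].copy()
corrupt_win = true_win.copy()
corrupt_win[-1] = rng.integers(V)
loss = max(0.0, 1.0 - score(true_win, doc) + score(corrupt_win, doc))
```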
Global Context-Aware Neural Language Model
Because our goal is to learn useful word representations and not the probability of the next word given previous words (which prohibits looking ahead), our model can utilize the entire document to provide global context.
Global Context-Aware Neural Language Model
The embedding matrix L contains the word representations.
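Concretely, under this view a word's representation is simply its row of L. A toy lookup, where the vocab mapping and its indices are hypothetical:

```python
import numpy as np

V, n = 1000, 50
L = np.random.default_rng(0).normal(scale=0.1, size=(V, n))  # embedding matrix
vocab = {"bank": 17}                  # hypothetical word -> row index mapping
bank_vector = L[vocab["bank"]]        # the representation of "bank", shape (n,)
```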
Introduction
The model learns word representations that better capture the semantics of words, while still keeping syntactic information.
Multi-Prototype Neural Language Model
Finally, each word occurrence in the corpus is relabeled to its associated cluster and is used to train the word representation for that cluster.
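A rough sketch of this relabeling step under simplifying assumptions: each occurrence is represented by the average embedding of its context window, and occurrences are clustered with plain k-means (the paper uses an idf-weighted average and spherical k-means). The function name and the pseudo-word (word, cluster) ids are illustrative, not the authors' code.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
V, n, k = 1000, 50, 3                  # vocab size, embedding dim, prototypes per word
L = rng.normal(scale=0.1, size=(V, n))

def relabel_occurrences(occurrence_windows, word_id, k=k):
    """Cluster a word's occurrence contexts; return, per occurrence,
    the pseudo-word id (word_id, cluster) it is retrained as."""
    ctx = np.stack([L[w].mean(axis=0) for w in occurrence_windows])
    clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(ctx)
    return [(word_id, int(c)) for c in clusters]

# e.g. 20 occurrences of word 42, each with a 10-word context window
windows = [rng.integers(V, size=10) for _ in range(20)]
labels = relabel_occurrences(windows, word_id=42)
```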
Related Work
Two other recent papers (Dhillon et al., 2011; Reddy et al., 2011) present models for constructing word representations that deal with context.
The phrase “word representations” is mentioned in 9 sentences in this paper.