Index of papers in Proc. ACL 2013 that mention
  • vector representation
Socher, Richard and Bauer, John and Manning, Christopher D. and Ng, Andrew Y.
Abstract
Instead, we introduce a Compositional Vector Grammar (CVG), which combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations.
Introduction
Previous RNN-based parsers used the same (tied) weights at all nodes to compute the vector representing a constituent (Socher et al., 2011b).
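The two excerpts above contrast tied-weight composition with the CVG's syntactically untied weights. A minimal sketch of that contrast, in which the dimensionality, the tanh nonlinearity, and the category-indexed weight table are illustrative assumptions rather than the paper's actual parser:

```python
import numpy as np

d = 50                                     # assumed word-vector dimensionality
W_tied = np.random.randn(d, 2 * d) * 0.01  # one composition matrix shared by all nodes

def compose_tied(left, right):
    """Tied-weight composition: the same W at every constituent (Socher et al., 2011b style)."""
    return np.tanh(W_tied @ np.concatenate([left, right]))

# Untied variant: a separate matrix per (left-category, right-category) pair.
W_untied = {("NP", "VP"): np.random.randn(d, 2 * d) * 0.01}

def compose_untied(left, right, left_cat, right_cat):
    """Syntactically untied composition: W chosen by the children's syntactic categories."""
    W = W_untied[(left_cat, right_cat)]
    return np.tanh(W @ np.concatenate([left, right]))

a, b = np.random.randn(d), np.random.randn(d)
p = compose_tied(a, b)                     # vector representing the constituent spanning a and b
```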
Introduction
Therefore we combine syntactic and semantic information by giving the parser access to rich syntactico-semantic information in the form of distributional word vectors and compute compositional semantic vector representations for longer phrases (Costa et al., 2003; Menchetti et al., 2005; Socher et al., 2011b).
Introduction
We will first briefly introduce single word vector representations and then describe the CVG objective function, tree scoring and inference.
vector representation is mentioned in 9 sentences in this paper.
Lazaridou, Angeliki and Marelli, Marco and Zamparelli, Roberto and Baroni, Marco
Experimental setup
Annotation of quality of test vectors: The quality of the corpus-based vectors representing derived test items was determined by collecting human semantic similarity judgments in a crowdsourcing survey.
Experimental setup
The first experiment investigates to what extent composition models can approximate high-quality (HQ) corpus-extracted vectors representing derived forms.
Experimental setup
Lexfunc provides a flexible way to account for affixation, since it models it directly as a function mapping from and onto word vectors, without requiring a vector representation of bound affixes.
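The lexfunc excerpt treats an affix as a function over word vectors rather than as a vector of its own. A minimal sketch of that idea, assuming a linear map fitted by least squares; the toy data and fitting procedure are illustrative, not the paper's setup:

```python
import numpy as np

d = 300                                    # assumed dimensionality of the word vectors
rng = np.random.default_rng(0)

# Toy training data: corpus vectors for (stem, derived) pairs such as (teach, teacher).
stems = rng.standard_normal((20, d))       # rows: stem vectors
derived = rng.standard_normal((20, d))     # rows: corpus-extracted derived-form vectors

# Estimate an affix matrix A so that stems @ A approximates the derived-form vectors.
A, *_ = np.linalg.lstsq(stems, derived, rcond=None)

def apply_affix(stem_vec):
    """Compose a derived-form vector by applying the learned affix function to a stem vector."""
    return stem_vec @ A

predicted = apply_affix(stems[0])          # composed vector for the first derived form
```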
Related work
Although these works exploit vectors representing complex forms, they do not attempt to generate them compositionally.
vector representation is mentioned in 5 sentences in this paper.
Cheung, Jackie Chi Kit and Penn, Gerald
Distributional Semantic Hidden Markov Models
Unlike in most applications of HMMs in text processing, in which the representation of a token is simply its word or lemma identity, tokens in DSHMM are also associated with a vector representation of their meaning in context according to a distributional semantic model (Section 3.1).
Distributional Semantic Hidden Markov Models
All the methods below start from this basic vector representation.
Distributional Semantic Hidden Markov Models
Let event head h be the syntactic head of a number of arguments a1, a2, ..., am, and let v_h, v_a1, v_a2, ..., v_am be their respective vector representations according to the SIMPLE method.
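The DSHMM excerpts associate each token with a vector for its meaning in context. A generic illustration of one way such a contextualized vector could be formed; the blending scheme below is an assumption for illustration, not necessarily the paper's SIMPLE method:

```python
import numpy as np

d = 100
vocab_vectors = {w: np.random.randn(d) for w in ["storm", "hit", "coast", "the"]}

def contextualized_vector(token, context, alpha=0.5):
    """Blend the token's own type vector with the average vector of its context words."""
    own = vocab_vectors[token]
    ctx = np.mean([vocab_vectors[w] for w in context], axis=0)
    return alpha * own + (1 - alpha) * ctx

vec = contextualized_vector("hit", ["storm", "coast"])   # in-context vector for the token "hit"
```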
vector representation is mentioned in 4 sentences in this paper.
Hermann, Karl Moritz and Blunsom, Phil
Background
The recursive application of autoencoders was first introduced in Pollack (1990), whose recursive auto-associative memories learn vector representations over pre-specified recursive data structures.
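The excerpt describes recursive auto-associative memories, which encode two child vectors into a parent vector and are trained to reconstruct the children from it. A minimal sketch of that encode-decode step, with shapes and loss chosen for illustration rather than taken from the paper:

```python
import numpy as np

d = 50
W_enc = np.random.randn(d, 2 * d) * 0.01   # encoder: two children -> one parent
W_dec = np.random.randn(2 * d, d) * 0.01   # decoder: parent -> reconstructed children

def encode(left, right):
    """Compose two child vectors into a parent vector."""
    return np.tanh(W_enc @ np.concatenate([left, right]))

def reconstruction_loss(left, right):
    """Squared error between the children and their reconstruction from the parent."""
    parent = encode(left, right)
    rebuilt = np.tanh(W_dec @ parent)
    return float(np.sum((rebuilt - np.concatenate([left, right])) ** 2))
```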
Learning
The unsupervised method described so far learns a vector representation for each sentence.
Model
Their purpose is to learn semantically meaningful vector representations for sentences and phrases of variable size, while the purpose of this paper is to investigate the use of syntax and linguistic formalisms in such vector-based compositional models.
vector representation is mentioned in 3 sentences in this paper.