Index of papers in Proc. ACL 2010 that mention
  • vector representations
Thater, Stefan and Fürstenau, Hagen and Pinkal, Manfred
Conclusion
We have presented a novel method for adapting the vector representations of words according to their context.
Introduction
Second, the vectors of two syntactically related words, e.g., a target verb "acquire" and its direct object "knowledge", typically have different syntactic environments, which implies that their vector representations encode complementary information and there is no direct way of combining the information encoded in the respective vectors.
Introduction
To solve these problems, we build upon previous work (Thater et al., 2009) and propose to use syntactic second-order vector representations.
Introduction
Second-order vector representations in a bag-of-words setting were first used by Schütze (1998); in a syntactic setting, they also feature in Dligach and Palmer (2008).
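The bag-of-words variant going back to Schütze (1998) can be sketched in a few lines: a word's second-order vector is built from the first-order co-occurrence vectors of the words it co-occurs with. The snippet below is only a minimal illustration of that idea, not any of the cited authors' implementations; the toy corpus and function names are placeholders.

```python
from collections import Counter, defaultdict

def first_order_vectors(sentences, window=2):
    """Count, for each word, the words appearing within +/- window positions."""
    vecs = defaultdict(Counter)
    for sent in sentences:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    vecs[w][sent[j]] += 1
    return vecs

def second_order_vector(word, first_order):
    """Second-order vector: weighted sum of the first-order vectors
    of the words that co-occur with `word`."""
    vec = Counter()
    for context_word, count in first_order[word].items():
        for dim, value in first_order[context_word].items():
            vec[dim] += count * value
    return vec

sentences = [["acquire", "knowledge", "from", "text"],
             ["acquire", "skills", "through", "practice"]]
fo = first_order_vectors(sentences)
print(second_order_vector("acquire", fo).most_common(5))
```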
Related Work
Several approaches to contextualizing vector representations of word meaning have been proposed.
Related Work
By using vector representations of a predicate p and an argument a, Kintsch identifies words
Related Work
Mitchell and Lapata (2008), henceforth M&L, propose a general framework in which meaning representations for complex expressions are computed compositionally by combining the vector representations of the individual words of the complex expression.
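In M&L's framework, the composition function is a parameter; two simple instantiations combine word vectors element-wise, either additively or multiplicatively. A minimal NumPy sketch of those two options (my illustration, not the authors' code; the vectors are toy values):

```python
import numpy as np

def compose(u, v, mode="additive"):
    """Combine two word vectors into a representation of the phrase.
    'additive' and 'multiplicative' are the two simple instantiations
    discussed by Mitchell and Lapata (2008)."""
    if mode == "additive":
        return u + v
    if mode == "multiplicative":
        return u * v  # element-wise product
    raise ValueError(f"unknown composition mode: {mode}")

# Toy vectors for a verb and its object (values are illustrative only).
acquire   = np.array([0.2, 0.7, 0.1, 0.0])
knowledge = np.array([0.5, 0.3, 0.0, 0.4])

print(compose(acquire, knowledge, "additive"))
print(compose(acquire, knowledge, "multiplicative"))
```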
The model
In this section, we present our method of contextualizing semantic vector representations.
The model
Our model employs vector representations for words and expressions containing syntax-specific first- and second-order co-occurrence information.
The model
Both kinds of vector representations are constructed from co-occurrence graphs.
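As a rough illustration of what such a co-occurrence graph might look like, the sketch below derives first-order vectors whose dimensions are (dependency relation, word) pairs read off the edges of a graph built from dependency triples. The triples and relation labels are invented for the example; the details of Thater et al.'s construction are not reproduced here.

```python
from collections import Counter, defaultdict

# Dependency triples (head, relation, dependent); invented toy data.
triples = [
    ("acquire", "obj",  "knowledge"),
    ("acquire", "obj",  "skill"),
    ("gain",    "obj",  "knowledge"),
    ("acquire", "subj", "student"),
]

# First-order syntactic vectors: dimensions are (relation, neighbour) pairs,
# read off the edges of the co-occurrence graph in both directions.
first_order = defaultdict(Counter)
for head, rel, dep in triples:
    first_order[head][(rel, dep)] += 1
    first_order[dep][(rel + "^-1", head)] += 1  # inverse edge

print(first_order["acquire"])
print(first_order["knowledge"])
```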
vector representations is mentioned in 11 sentences in this paper.
Mitchell, Jeff and Lapata, Mirella and Demberg, Vera and Keller, Frank
Integrating Semantic Constraint into Surprisal
The factor A(w_n, h) is essentially based on a comparison between the vector representing the current word w_n and the vector representing the prior history h. Varying the method for constructing word vectors (e.g., using LDA or a simpler semantic space model) and for combining them into a representation of the prior context h (e.g., using additive or multiplicative functions) produces distinct models of semantic composition.
Integrating Semantic Constraint into Surprisal
The calculation of A is then based on a weighted dot product of the vector representing the upcoming word w_n with the vector representing the prior context h:
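The excerpt does not spell out the exact weighting scheme, so the following is only a schematic reading of "a weighted dot product": each dimension's contribution to A is the product of the word-vector and context-vector components, scaled by a per-dimension weight. The weights, vectors, and additive construction of h below are placeholders, not the paper's parameters.

```python
import numpy as np

def semantic_factor(word_vec, context_vec, weights):
    """Schematic A(w_n, h): a weighted dot product of the vector for the
    upcoming word with the vector for the prior context h."""
    return float(np.sum(weights * word_vec * context_vec))

# h built from the preceding words, e.g. by element-wise addition of their
# vectors (the additive composition variant is shown here).
history_vecs = [np.array([0.1, 0.4, 0.2]), np.array([0.3, 0.1, 0.5])]
h = np.sum(history_vecs, axis=0)

w_n = np.array([0.2, 0.6, 0.1])
weights = np.array([1.0, 0.5, 2.0])  # illustrative per-dimension weights

print(semantic_factor(w_n, h, weights))
```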
Models of Processing Difficulty
In this framework, the similarity between two words can be easily quantified, e.g., by measuring the cosine of the angle between the vectors representing them.
Models of Processing Difficulty
Specifically, the simpler space is based on word co-occurrence counts; it constructs the vector representing a given target word, t, by identifying all the tokens of t in a corpus and recording the counts of context words, c, within a specific window.
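The window-based counting is the same first-order construction sketched earlier; what the first quoted sentence adds is the comparison step: similarity is the cosine of the angle between the two count vectors. A minimal sketch over sparse count dictionaries (the counts are toy values, not corpus data):

```python
import math

def cosine(u, v):
    """Cosine of the angle between two sparse count vectors (dict: word -> count)."""
    dot = sum(c * v.get(w, 0) for w, c in u.items())
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Toy window counts for two target words.
dog = {"the": 4, "barked": 2, "loudly": 1}
cat = {"the": 3, "meowed": 2, "loudly": 1}
print(cosine(dog, cat))
```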
vector representations is mentioned in 4 sentences in this paper.
Cheung, Jackie Chi Kit and Penn, Gerald
Introduction
We tabulate the transitions of entities between different syntactic positions (or their nonoccurrence) in sentences, and convert the frequencies of transitions into a feature vector representation of transition probabilities in the document.
Introduction
We solve this problem in a supervised machine learning setting, where the input consists of the feature vector representations of the two versions of the document, and the output is a binary value indicating the document with the original sentence ordering.
Introduction
Transition length — the maximum length of the transitions used in the feature vector representation of a document.
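A schematic version of that feature extraction: record each entity's syntactic role (or absence) per sentence, then turn the counts of role-to-role transitions up to the chosen transition length into probabilities that fill the document's feature vector. The roles, entities, and length-2 limit below are illustrative; this sketch does not reproduce Cheung and Penn's exact feature set.

```python
from collections import Counter
from itertools import product

ROLES = ["S", "O", "X", "-"]  # subject, object, other, absent

def transition_features(grid, length=2):
    """grid: {entity: [role per sentence]}. Returns a dict mapping each
    possible role transition of the given length to its probability."""
    counts = Counter()
    for roles in grid.values():
        for i in range(len(roles) - length + 1):
            counts[tuple(roles[i:i + length])] += 1
    total = sum(counts.values()) or 1
    return {t: counts[t] / total for t in product(ROLES, repeat=length)}

# Toy entity grid for a three-sentence document.
grid = {"court": ["S", "S", "-"], "ruling": ["O", "-", "S"]}
features = transition_features(grid)
print(features[("S", "S")], features[("O", "-")])
```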
vector representations is mentioned in 3 sentences in this paper.