Index of papers in Proc. ACL 2010 that mention
  • semantic space
Mitchell, Jeff and Lapata, Mirella and Demberg, Vera and Keller, Frank
Integrating Semantic Constraint into Surprisal
The factor A(w_n, h) is essentially based on a comparison between the vector representing the current word w_n and the vector representing the prior history h. Varying the method for constructing word vectors (e.g., using LDA or a simpler semantic space model) and the method for combining them into a representation of the prior context h (e.g., using additive or multiplicative functions) produces distinct models of semantic composition.
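For concreteness, here is a minimal sketch of the two composition functions and one plausible instantiation of A(w_n, h); the function names, the cosine choice, and the normalization are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def compose_history(word_vectors, mode="additive"):
    """Fold a sequence of word vectors into one history vector h.

    Sketch of the two composition functions the excerpt mentions;
    names and details are illustrative, not from the paper.
    """
    h = word_vectors[0].copy()
    for v in word_vectors[1:]:
        if mode == "additive":
            h = h + v          # component-wise sum
        elif mode == "multiplicative":
            h = h * v          # component-wise (Hadamard) product
        else:
            raise ValueError(f"unknown composition mode: {mode}")
    return h

def semantic_factor(w_vec, h_vec):
    # One plausible instantiation of A(w_n, h): cosine similarity
    # between the current word's vector and the composed history.
    return np.dot(w_vec, h_vec) / (np.linalg.norm(w_vec) * np.linalg.norm(h_vec))
```

The multiplicative variant emphasizes dimensions on which the history words agree, whereas the additive variant accumulates evidence across all dimensions.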
Introduction
Expectations are represented by a vector of probabilities which reflects the likely location in semantic space of the upcoming word.
Introduction
The model essentially integrates the predictions of an incremental parser (Roark, 2001) with those of a semantic space model (Mitchell and Lapata, 2009).
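One way to picture that integration (a hedged sketch; the paper's exact combination and normalization may differ) is to rescale the parser's incremental word probability by the semantic factor before taking the negative log:

```python
import math

def integrated_surprisal(p_parser, sem_factor, norm):
    # Hedged sketch: rescale the incremental parser's word probability
    # P(w_n | w_1..n-1) by the semantic factor A(w_n, h), renormalize,
    # and take surprisal as the negative log. `norm` stands in for
    # whatever constant makes the rescaled scores a proper
    # distribution; its exact form here is an assumption.
    return -math.log(p_parser * sem_factor / norm)
```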
Method
Following Mitchell and Lapata (2009), we constructed a simple semantic space based on co-occurrence statistics from the BLLIP training set.
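A sketch of such a space, under assumed parameter choices (window size, context vocabulary) rather than the paper's exact configuration:

```python
from collections import Counter

def build_cooccurrence_space(sentences, targets, contexts, window=5):
    """Represent each target word by counts of context words occurring
    within +/-`window` tokens of it. All parameter choices here are
    illustrative; Mitchell and Lapata (2009) specify their own."""
    vectors = {t: Counter() for t in targets}
    for sent in sentences:
        for i, w in enumerate(sent):
            if w not in vectors:
                continue
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i and sent[j] in contexts:
                    vectors[w][sent[j]] += 1
    return vectors
```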
Models of Processing Difficulty
As LSA is one of the best-known semantic space models, it comes as no surprise that it has been used to analyze semantic constraint.
Models of Processing Difficulty
Context is represented by a vector of probabilities which reflects the likely location in semantic space of the upcoming word.
Models of Processing Difficulty
Importantly, composition models are not defined with a specific semantic space in mind; they could easily be adapted to LSA, to simple co-occurrence vectors, or to more sophisticated semantic representations (e.g., Griffiths et al.)
semantic space is mentioned in 16 sentences in this paper.
Rudolph, Sebastian and Giesbrecht, Eugenie
CMSMs Encode Symbolic Approaches
From the perspective of our compositionality framework, those approaches employ a group (or pregroup) (G, ·) as semantical space S, where the group operation (often written as multiplication) is used as the composition operation ⋈.
Compositionality and Matrices
More formally, the underlying idea can be described as follows: given a mapping [[ · ]] : Σ → S from a set of tokens (words) Σ into some semantical space S (the elements of which we will simply call “meanings”), we find a semantic composition operation ⋈ : S* → S mapping sequences of meanings to meanings such that the meaning of a sequence of tokens σ1 … σk can be obtained by applying ⋈ to the meaning sequence [[σ1]] … [[σk]].
Compositionality and Matrices
the semantical space consists of square matrices, and the composition operator ⋈ coincides with matrix multiplication as introduced in Section 2.
Compositionality and Matrices
This way, abstracting from specific initial mental state vectors, our semantic space S can be seen as a function space of mental transformations represented by matrices, whereby matrix multiplication realizes the successive execution of the transformations triggered by the input token sequence.
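A minimal sketch of that idea, with an illustrative token-to-matrix lookup (the paper derives its matrices differently; this only shows the composition mechanism):

```python
import numpy as np

def cmsm_meaning(tokens, token_matrices):
    """Compose a token sequence in a compositional matrix-space model:
    each token maps to a square matrix, and the meaning of the sequence
    is the ordered matrix product (associative, but order-sensitive)."""
    d = next(iter(token_matrices.values())).shape[0]
    meaning = np.eye(d)                 # identity = the empty sequence
    for t in tokens:
        meaning = meaning @ token_matrices[t]
    return meaning

# Applying the composed transformation to an initial mental state vector s:
# s_out = s @ cmsm_meaning(["very", "good"], token_matrices)
```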
semantic space is mentioned in 4 sentences in this paper.