Index of papers in Proc. ACL 2013 that mention
  • distributional semantic
Cheung, Jackie Chi Kit and Penn, Gerald
Abstract
In contrast, vector space models of distributional semantics are trained on large corpora, but are typically applied to domain-general lexical disambiguation tasks.
Abstract
We introduce Distributional Semantic Hidden Markov Models, a novel variant of a hidden Markov model that integrates these two approaches by incorporating contextualized distributional semantic vectors into a generative model as observed emissions.
Distributional Semantic Hidden Markov Models
Unlike in most applications of HMMs in text processing, in which the representation of a token is simply its word or lemma identity, tokens in DSHMM are also associated with a vector representation of their meaning in context according to a distributional semantic model (Section 3.1).
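The emission structure the excerpt describes — each hidden state generating both a word identity and a contextualized semantic vector — can be sketched as follows. This is a minimal illustrative sketch, not the authors' DSHMM: the state name, vocabulary, and all parameter values are made up, and the vector emission is modelled here as a state-specific Gaussian purely for illustration.

```python
import random

def emit(state, word_probs, vec_means, sigma=0.1):
    """Sketch of a DSHMM-style emission: a hidden state emits both a
    word identity (categorical draw) and a semantic vector (here drawn
    from a state-specific Gaussian; all parameters are toy values)."""
    words, probs = zip(*word_probs[state].items())
    word = random.choices(words, weights=probs)[0]
    vector = [random.gauss(m, sigma) for m in vec_means[state]]
    return word, vector

# hypothetical parameters for a single hidden state
word_probs = {"EVENT": {"chase": 0.7, "run": 0.3}}
vec_means = {"EVENT": [0.2, -0.1, 0.5]}
word, vec = emit("EVENT", word_probs, vec_means)
```

The point of the joint emission is that the vector carries corpus-wide distributional knowledge into the generative model, alongside the usual word-identity observation.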
Introduction
By contrast, distributional semantic models are trained on large, domain-general corpora.
Introduction
In this paper, we propose to inject contextualized distributional semantic vectors into generative probabilistic models, in order to combine their complementary strengths for domain modelling.
Introduction
There are a number of potential advantages that distributional semantic models offer.
Related Work
Our work is similar in that we assume much of the same structure within a domain and consequently in the model as well (Section 3), but whereas PROFINDER focuses on finding the “correct” number of frames, events, and slots with a nonparametric method, this work focuses on integrating global knowledge in the form of distributional semantics into a probabilistic model.
“distributional semantic” is mentioned in 19 sentences in this paper.
Lazaridou, Angeliki and Marelli, Marco and Zamparelli, Roberto and Baroni, Marco
Abstract
This is a major cause of data sparseness for corpus-based approaches to lexical semantics, such as distributional semantic models of word meaning.
Abstract
Our results constitute a novel evaluation of the proposed composition methods, in which the full additive model achieves the best performance, and demonstrate the usefulness of a compositional morphology component in distributional semantics.
Composition methods
Distributional semantic models (DSMs), also known as vector-space models, semantic spaces, or by the names of famous incarnations such as Latent Semantic Analysis or Topic Models, approximate the meaning of words with vectors that record their patterns of co-occurrence with corpus context features (often, other words).
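The core idea in this excerpt — representing a word by its pattern of co-occurrence with context words — can be sketched in a few lines. This is a minimal sketch of raw co-occurrence counting only; real DSMs additionally apply weighting (e.g. PMI) and dimensionality reduction, and the toy corpus and window size here are assumptions.

```python
from collections import Counter

def cooccurrence_vectors(corpus, window=2):
    """Build a minimal distributional model: for each word, count how
    often every other word appears within `window` tokens of it."""
    vectors = {}
    for sent in corpus:
        for i, w in enumerate(sent):
            ctx = sent[max(0, i - window):i] + sent[i + 1:i + 1 + window]
            vectors.setdefault(w, Counter()).update(ctx)
    return vectors

corpus = [["cats", "chase", "mice"], ["dogs", "chase", "cats"]]
vecs = cooccurrence_vectors(corpus)
# "cats" and "chase" co-occur in both toy sentences
```

Words with similar rows in this count table occur in similar contexts, which is exactly the assumption the excerpt states.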
Composition methods
Since the very inception of distributional semantics, there have been attempts to compose meanings for sentences and larger passages (Landauer and Dumais, 1997), but interest in compositional DSMs has skyrocketed in the last few years, particularly since the influential work of Mitchell and Lapata (2008; 2009; 2010).
Experimental setup
4.2 Distributional semantic space
Experimental setup
This result is of practical importance for distributional semantics, as it paves the way to address one of the main causes of data sparseness, and it confirms the usefulness of the compositional approach in a new domain.
Experimental setup
We would also like to apply composition to inflectional morphology (that currently lies outside the scope of distributional semantics), to capture the nuances of meaning that, for example, distinguish singular and plural nouns (consider, e.g., the difference between the mass singular tea and the plural teas, which coerces the noun into a count interpretation (Katz and Zamparelli, 2012)).
Introduction
Distributional semantic models (DSMs) in particular represent the meaning of a word by a vector, the dimensions of which encode corpus-extracted co-occurrence statistics, under the assumption that words that are semantically similar will occur in similar contexts (Turney and Pantel, 2010).
Introduction
Compositional distributional semantic models (cDSMs) of word units aim at handling, compositionally, the high productivity of phrases and consequent data sparseness.
Related work
Our system, given re- and build, predicts the (distributional semantic) meaning of rebuild.
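The composition step described here — predicting the vector of a derived word such as rebuild from the vectors of re- and build — can be sketched with the full additive model the first excerpt names, which combines the two input vectors through separate matrices. The vectors and matrices below are toy values, not learned parameters; in the paper's setting they would be estimated from corpus data.

```python
def full_additive(A, B, u, v):
    """Full additive composition: p = A*u + B*v, where A and B are
    matrices applied to the affix and stem vectors (toy values here)."""
    def matvec(M, x):
        return [sum(m * xi for m, xi in zip(row, x)) for row in M]
    return [a + b for a, b in zip(matvec(A, u), matvec(B, v))]

# hypothetical 2-d vectors for the prefix "re-" and the stem "build"
re_vec, build_vec = [1.0, 0.0], [0.5, 2.0]
A = [[0.5, 0.0], [0.0, 0.5]]   # scales the affix's contribution
B = [[1.0, 0.0], [0.0, 1.0]]   # identity: stem passes through
rebuild_vec = full_additive(A, B, re_vec, build_vec)
```

Because A and B can reweight each input independently, the full additive model subsumes simple vector addition (A = B = identity) as a special case.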
Related work
Another emerging line of research uses distributional semantics to model human intuitions about the semantic transparency of morphologically derived or compound expressions and how these impact various lexical processing tasks (Kuperman, 2009; Wang et al., 2012).
“distributional semantic” is mentioned in 11 sentences in this paper.