Index of papers in Proc. ACL 2010 that mention
  • meaning representations
Titov, Ivan and Kozhevnikov, Mikhail
A Model of Semantics
…contradiction is trivial: two meaning representations…
A Model of Semantics
As soon as the meaning representations m* are inferred, we find ourselves in the setup studied in (Liang et al., 2009): the state s is no longer latent and we can run efficient inference on the E-step.
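A minimal sketch of what this reduction amounts to, assuming a generic latent-variable model over (m, a, w) rather than the paper's exact formulation: once m is fixed to the inferred m*, the E-step posterior ranges over alignments only,

    q(a) = P(a \mid m^{*}, w) = \frac{P(m^{*}, a, w)}{\sum_{a'} P(m^{*}, a', w)}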
Inference with Non-Contradictory Documents
In this section we will describe our inference method on a higher conceptual level, not specifying the underlying meaning representation and the probabilistic model.
Inference with Non-Contradictory Documents
…regarded as defining the probability distribution of meaning m and its alignment a with the given text w, P(m, a, w) = P(a, …). The semantics m can be represented either as a logical formula (see, e.g., Poon and Domingos, 2009) or as a set of field values if database records are used as a meaning representation (Liang et al., 2009).
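The right-hand side of the factorization is cut off in this snippet; a generic chain-rule expansion over the three variables named here (an illustrative assumption, not necessarily the paper's actual model) would read

    P(m, a, w) = P(m)\, P(a \mid m)\, P(w \mid m, a)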
Inference with Non-Contradictory Documents
…meanings (m1, …, mK) such that ∧i mi is not satisfiable, and models dependencies between components in the composite meaning representation (e.g., argument values of predicates).
Introduction
The supervision was either given in the form of meaning representations aligned with sentences (Zettlemoyer and Collins, 2005; Ge and Mooney, 2005; Mooney, 2007) or in a somewhat more relaxed form, such as lists of candidate meanings for each sentence (Kate and Mooney, 2007; Chen and Mooney, 2008) or formal representations of the described world state for each text (Liang et al., 2009).
Introduction
However, it is important to note that the phrase “wind from west” may still appear in the texts, but in reference to other time periods, underlining the need for modeling alignment between grouped texts and their latent meaning representation.
meaning representations is mentioned in 10 sentences in this paper.
Mitchell, Jeff and Lapata, Mirella and Demberg, Vera and Keller, Frank
Introduction
The latter creates meaning representations compositionally, and therefore builds semantic expectations for word sequences (e.g., phrases, sentences, even documents) rather than isolated words.
Models of Processing Difficulty
To give a concrete example, Latent Semantic Analysis (LSA, Landauer and Dumais 1997) creates a meaning representation for words by constructing a word-document co-occurrence matrix from a large collection of documents.
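As a rough illustration of the construction this snippet describes — a minimal sketch with a toy corpus and invented names, not the setup used in the paper:

    import numpy as np

    # Toy corpus: each "document" is just a list of tokens (illustrative only).
    docs = [
        ["stocks", "fell", "sharply"],
        ["stocks", "rose", "slightly"],
        ["the", "cat", "slept"],
    ]

    # Word-document co-occurrence matrix: rows are words, columns are documents.
    vocab = sorted({w for d in docs for w in d})
    row = {w: i for i, w in enumerate(vocab)}
    counts = np.zeros((len(vocab), len(docs)))
    for j, d in enumerate(docs):
        for w in d:
            counts[row[w], j] += 1

    # LSA keeps only the top-k dimensions of a singular value decomposition;
    # each word's meaning representation is its row in the reduced space.
    # (Real LSA also applies a weighting scheme such as log-entropy first.)
    U, S, Vt = np.linalg.svd(counts, full_matrices=False)
    k = 2
    word_vectors = U[:, :k] * S[:k]
    print(word_vectors[row["stocks"]])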
Models of Processing Difficulty
Their aim is not so much to model processing difficulty, but to construct vector-based meaning representations that go beyond individual words.
Models of Processing Difficulty
We also examine the influence of the underlying meaning representations by comparing a simple semantic space similar to McDonald (2000) against Latent Dirichlet Allocation (Blei et al., 2003).
meaning representations is mentioned in 4 sentences in this paper.
Thater, Stefan and Fürstenau, Hagen and Pinkal, Manfred
Introduction
We go one step further, however, in that we employ syntactically enriched vector models as the basic meaning representations, assuming a vector space spanned by combinations of dependency relations and words (Lin, 1998).
Related Work
Mitchell and Lapata (2008), henceforth M&L, propose a general framework in which meaning representations for complex expressions are computed compositionally by combining the vector representations of the individual words of the complex expression.
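For concreteness, the additive and multiplicative composition functions with which such a framework is commonly instantiated look roughly as follows — a hedged sketch with made-up vectors, not the exact models evaluated in this paper:

    import numpy as np

    # Hypothetical co-occurrence vectors for the two words of a phrase
    # (the numbers are invented; dimensions would normally be context words).
    acquire = np.array([2.0, 0.0, 1.0, 3.0])
    knowledge = np.array([1.0, 4.0, 0.0, 2.0])

    additive = acquire + knowledge        # p_i = u_i + v_i
    multiplicative = acquire * knowledge  # p_i = u_i * v_i (component-wise)
    print(additive, multiplicative)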
The model
As soon as we want to compute a meaning representation for a phrase like acquire knowledge from the verb acquire together with its direct object knowledge, we are facing the problem that verbs have different syntactic neighbors than nouns, hence their first-order vectors are not easily comparable.
The model
We let the first-order vector with its selectional preference information act as a kind of weighting filter on the second-order vector, and thus refine the meaning representation of the verb.
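A schematic reading of the "weighting filter" idea in this snippet, with invented toy vectors and a plain component-wise product standing in for whatever combination operation the model actually uses:

    import numpy as np

    # Invented vectors over one shared index set (the real model uses
    # dimensions indexed by dependency-relation/word combinations).
    first_order = np.array([1.0, 0.1, 2.0, 0.5])   # carries the selectional preference information
    second_order = np.array([0.2, 1.5, 0.0, 0.8])  # second-order vector to be refined

    # The first-order vector acts as a weighting filter on the second-order
    # vector: dimensions it favours are kept, the rest are damped.
    refined = first_order * second_order
    print(refined)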
meaning representations is mentioned in 4 sentences in this paper.
Abney, Steven and Bird, Steven
Human Language Project
…consensus on parse trees is difficult, obtaining consensus on meaning representations is impossible.
Human Language Project
However, if the language under consideration is anything other than English, then a translation into English (or some other reference language) is for most purposes a perfectly adequate meaning representation.
Human Language Project
Taking sentences in a reference language as the meaning representation, we arrive back at machine translation as the measure of success.
meaning representations is mentioned in 3 sentences in this paper.