Word Maturity: Computational Modeling of Word Knowledge
Kireyev, Kirill and Landauer, Thomas K.

Article Structure

Abstract

While computational estimation of difficulty of words in the lexicon is useful in many educational and assessment applications, the concept of scalar word difficulty and current corpus-based methods for its estimation are inadequate.

Motivation

It is no surprise that, through the stages of language learning, different words are learned at different times and are known to different extents.

Rethinking Word Difficulty

Previous related work in education and psychometrics has been concerned with measuring word difficulty or with classifying words into different difficulty categories.

Topics

Latent Semantic

Appears in 5 sentences as: Latent Semantic (5)
In Word Maturity: Computational Modeling of Word Knowledge
  1. We present a computational algorithm for estimating word maturity, based on modeling language acquisition with Latent Semantic Analysis.
    Page 1, “Abstract”
  2. 3 Modeling Word Meaning Acquisition with Latent Semantic Analysis
    Page 2, “Rethinking Word Difficulty”
  3. 3.1 Latent Semantic Analysis (LSA)
    Page 2, “Rethinking Word Difficulty”
  4. An appealing choice for quantitatively modeling word meanings and their growth over time is Latent Semantic Analysis (LSA), an unsupervised method for representing word and document meaning in a multidimensional vector space.
    Page 2, “Rethinking Word Difficulty”
  5. We have also proposed and evaluated an implementation of this metric using Latent Semantic Analysis.
    Page 8, “Rethinking Word Difficulty”


semantic relatedness

Appears in 3 sentences as: semantic relatedness (1) semantic relationships (1) semantically related (1)
In Word Maturity: Computational Modeling of Word Knowledge
  1. The dimensionality reduction has the effect of smoothing out incidental co-occurrences and preserving significant semantic relationships between words.
    Page 2, “Rethinking Word Difficulty”
  2. The resulting word vectors in U are positioned in such a way that semantically related word vectors point in similar directions or, equivalently, have higher cosine values between them.
    Page 2, “Rethinking Word Difficulty”
  3. In addition to merely measuring semantic relatedness, LSA has been shown to emulate the learning of word meanings from natural language (as evidenced by a broad range of applications, from synonym tests to automated essay grading), at rates that resemble those of human learners (Landauer et al., 1997).
    Page 2, “Rethinking Word Difficulty”
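The LSA pipeline these sentences describe (a term-document matrix, SVD with rank truncation, and cosine comparison of word vectors from U) can be sketched in a few lines. The toy corpus and the choice of k = 2 dimensions below are illustrative assumptions for a minimal demonstration, not the corpus or dimensionality used in the paper.

```python
import numpy as np

# Hypothetical toy corpus; real LSA models are trained on large text collections.
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]
vocab = sorted({w for d in docs for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}

# Term-document count matrix X (rows = words, columns = documents).
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[idx[w], j] += 1

# SVD, then keep only the top k dimensions: this is the dimensionality
# reduction that smooths out incidental co-occurrences.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]  # word vectors in the reduced space

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should point in similar directions,
# i.e. have a higher cosine value between their vectors.
sim = cosine(word_vecs[idx["cat"]], word_vecs[idx["dog"]])
```

In practice the counts are usually reweighted (e.g. log-entropy or tf-idf) before the SVD, and k is in the hundreds; the structure of the computation is the same.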
