Index of papers in Proc. ACL 2012 that mention
  • SVD
Zweig, Geoffrey and Platt, John C. and Meek, Christopher and Burges, Christopher J.C. and Yessenalina, Ainur and Liu, Qiang
Sentence Completion via Latent Semantic Analysis
The method is based on applying singular value decomposition (SVD) to a matrix W representing the occurrence of words in documents.
Sentence Completion via Latent Semantic Analysis
SVD results in an approximation of W by the product of three matrices, one in which each word is represented as a low-dimensional vector, one in which each document is represented as a low-dimensional vector, and a diagonal scaling matrix.
Sentence Completion via Latent Semantic Analysis
An important property of SVD is that the rows of US, which represent the words, behave similarly to the original rows of W, in the sense that the cosine similarity between two rows in US approximates the cosine similarity between the corresponding rows of W.
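As a rough illustration of the procedure these excerpts describe, here is a minimal numpy sketch of LSA; the toy word-document matrix W, the word labels, and the rank k are made-up assumptions for illustration, not data from the paper.

```python
# Minimal LSA sketch (toy matrix W and rank k are illustrative assumptions,
# not data from the paper).
import numpy as np

# W[i, j] = count of word i in document j (toy example).
W = np.array([
    [2, 0, 1, 0],   # "car"
    [1, 0, 2, 0],   # "automobile"
    [0, 3, 0, 1],   # "banana"
    [0, 1, 0, 2],   # "fruit"
], dtype=float)

# SVD: W = U @ np.diag(S) @ Vt.
U, S, Vt = np.linalg.svd(W, full_matrices=False)

# Keep the top-k singular values for a low-rank approximation.
k = 2
word_vecs = U[:, :k] * S[:k]   # rows of US: low-dimensional word vectors
doc_vecs = Vt[:k, :].T         # rows of V: low-dimensional document vectors

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Cosine similarity between rows of US approximates cosine similarity
# between the corresponding rows of W.
print(cosine(word_vecs[0], word_vecs[1]))  # "car" vs. "automobile" in US
print(cosine(W[0], W[1]))                  # same pair in the original W
```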
SVD is mentioned in 5 sentences in this paper.
Cohen, Shay B. and Stratos, Karl and Collins, Michael and Foster, Dean P. and Ungar, Lyle
Estimating the Tensor Model
The following lemma justifies the use of an SVD calculation as one method for finding values for U^a and V^a that satisfy condition 2:
Introduction
These algorithms use spectral methods: that is, algorithms based on eigenvector decompositions of linear systems, in particular singular value decomposition (SVD).
Introduction
The first step is to take an SVD of the training examples, followed by a projection of the training examples down to a low-dimensional space.
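A small sketch of that kind of SVD-plus-projection step, assuming the training examples are simply the rows of a matrix X; the random data and the target dimension m are illustrative and not the paper's actual construction.

```python
# Sketch of an SVD-based projection step (assumed setup: training examples
# are the rows of X; data and target dimension m are illustrative).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))   # 1000 training examples, 50 features

m = 8                             # target (low) dimension
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Project each training example onto the top-m right singular vectors.
projection = Vt[:m, :].T          # 50 x m projection matrix
X_low = X @ projection            # 1000 x m projected training examples
```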
SVD is mentioned in 3 sentences in this paper.