A matrix A can be expressed as
A = U S V^T
U, V - Orthogonal
U - Columns are the Left Singular Vectors
V - Columns are the Right Singular Vectors
A is an m × n matrix
U is an m × n matrix with orthonormal columns
S is an n × n diagonal matrix
V is an n × n orthogonal matrix
Since an m × n matrix with m > n has at most n singular values, the thin SVD above needs only n columns of U rather than a full m × m matrix.
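As a quick check of those shapes, here is a minimal NumPy sketch; the 6 × 4 matrix is just an arbitrary example:

```python
import numpy as np

# Arbitrary small example: a 6 x 4 matrix (m > n)
A = np.random.rand(6, 4)

# full_matrices=False gives the thin SVD: U is m x n, S has n values, Vt is n x n
U, S, Vt = np.linalg.svd(A, full_matrices=False)

print(U.shape, S.shape, Vt.shape)   # (6, 4) (4,) (4, 4)

# Multiplying the three factors back together reconstructs A
A_rebuilt = U @ np.diag(S) @ Vt
print(np.allclose(A, A_rebuilt))    # True
```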
- Dimensionality reduction is done by neglecting small singular values in the diagonal matrix S (see the sketch below)
- This reduction is only possible in the decomposed form; the original matrix A does not expose it directly
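A minimal sketch of that truncation, keeping only the k largest singular values (k = 2 is an arbitrary choice here):

```python
import numpy as np

A = np.random.rand(6, 4)
U, S, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # number of singular values to keep (illustrative choice)
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# A_k has the same shape as A but is only rank k,
# and it is the best rank-k approximation of A in the least-squares sense
print(A_k.shape)                     # (6, 4)
print(np.linalg.matrix_rank(A_k))    # 2
```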
Reference - Link
Eigen Vectors
- Satisfy A v = λ v, where v is an eigenvector and λ is its eigenvalue
- Vectors along these directions are only stretched; they do not change direction
- Useful for high-dimensional data (images, text, vectors of stock data)
- Describe the data with only a few values (see the sketch after this list)
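A small sketch of the A v = λ v relation, using an arbitrary symmetric matrix so the eigenvalues are real:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Each column of eigvecs is an eigenvector; A v should equal lambda * v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```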
How Many Singular Values Should We Retain? - A useful rule of thumb is to retain enough singular values to make up 90% of the energy in Σ (i.e., 90% of the sum of the squared singular values), Link
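One way to apply that 90% rule in code, treating "energy" as the sum of squared singular values (the matrix here is random and purely illustrative):

```python
import numpy as np

A = np.random.rand(100, 50)
S = np.linalg.svd(A, compute_uv=False)   # singular values, largest first

energy = S**2                            # "energy" contributed by each singular value
cum_ratio = np.cumsum(energy) / energy.sum()

# Smallest k such that the first k singular values hold 90% of the energy
k = int(np.searchsorted(cum_ratio, 0.90)) + 1
print(f"Keep {k} singular values to retain 90% of the energy")
```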
SVD - (Application in NLP) - Latent Semantic Analysis Notes
- LSA applies singular value decomposition (SVD) to the term-document matrix (see the sketch after these notes)
- In SVD, a rectangular matrix is decomposed into the product of three other matrices
- One component matrix describes the original row entities as vectors of derived orthogonal factor values
- Another describes the original column entities in the same way
- Third is a diagonal matrix containing scaling values such that when the three components are matrix-multiplied, the original matrix is reconstructed
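A minimal LSA-style sketch, assuming scikit-learn is available; the toy corpus and the choice of 2 components are purely illustrative:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Toy corpus, just to illustrate the pipeline
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "stocks fell as markets reacted to the news",
    "investors sold stocks after the market news",
]

# Rows of X are documents, columns are terms
X = TfidfVectorizer().fit_transform(docs)

# TruncatedSVD performs the SVD-based reduction used by LSA
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = lsa.fit_transform(X)

print(doc_vectors.shape)   # (4, 2): each document as a 2-dimensional latent vector
```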
- The Dirichlet distribution takes one concentration parameter (called alpha in most places) for each topic (or category)
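To make the role of alpha concrete, a small NumPy sketch drawing one sample from a 3-topic Dirichlet (the alpha values are arbitrary; smaller values tend to give sparser mixtures):

```python
import numpy as np

alpha = [0.1, 0.1, 0.1]          # one illustrative alpha per topic

sample = np.random.dirichlet(alpha)
print(sample, sample.sum())      # 3 proportions that sum to 1
```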
Happy Mastering DL!!!