Basics
- A line is one-dimensional (1D), a square is 2D, a cube is 3D
- Fundamentally, shapes are just sets of points
- An N-dimensional space can be represented by an N-dimensional hypercube
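As a quick illustration of shapes as sets of points, here is a minimal Python sketch (assuming NumPy is available; the helper name is made up) that lists the 2^N corner points of a unit N-dimensional hypercube as plain coordinate vectors.

```python
import itertools
import numpy as np

def unit_hypercube_vertices(n):
    """Return the 2**n corner points of the unit hypercube in n dimensions."""
    # Each vertex is an n-dimensional point whose coordinates are 0 or 1.
    return np.array(list(itertools.product([0, 1], repeat=n)))

# A square (2D) has 4 corners, a cube (3D) has 8, a 4D hypercube has 16.
for n in (2, 3, 4):
    print(n, "dimensions ->", len(unit_hypercube_vertices(n)), "vertices")
```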
Feature Extraction
- Converting feature vectors from a higher-dimensional space to a lower-dimensional one (see the sketch below)
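As a minimal sketch of feature extraction (assuming NumPy and scikit-learn are installed; the data here is made up), the example below projects 10-dimensional feature vectors down to 2 dimensions using PCA, which is discussed next.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))    # 100 samples, each a 10-dimensional feature vector

pca = PCA(n_components=2)         # keep the 2 directions with the highest variance
X_reduced = pca.fit_transform(X)  # shape (100, 2)

print(X.shape, "->", X_reduced.shape)
print("Variance explained by each kept component:", pca.explained_variance_ratio_)
```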
PCA (Principal Component Analysis)
- Input is a large number of correlated variables. We perform an orthogonal transformation to convert them into uncorrelated variables, and identify the principal components based on the highest variation
- Orthogonal vectors - their dot product equals zero; the components are perpendicular to each other
- This is achieved using SVD (Singular Value Decomposition)
- SVD internally factorizes the matrix and identifies the eigenvectors
- An eigenvector does not change direction when the linear transformation is applied; it is only scaled by its eigenvalue
- PCA is used to explain the variation in data: find the principal component with the largest variation, then the direction with the next highest variation (orthogonal to the first principal component), and so on
- A rotation or reflection is referred to as an orthogonal transformation
- PCA - keep the components with high variation
- SVD - express the data as a matrix and decompose it (see the sketches after this list)
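To make the eigenvector and orthogonality points concrete, here is a small NumPy sketch (the matrix is made up, standing in for a covariance matrix): applying a symmetric matrix to one of its eigenvectors only rescales it, and eigenvectors belonging to different eigenvalues of a symmetric matrix are perpendicular, so their dot product is zero.

```python
import numpy as np

# A made-up symmetric matrix standing in for a covariance matrix.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(A)   # eigh: eigendecomposition for symmetric matrices
v1, v2 = eigenvectors[:, 0], eigenvectors[:, 1]

# Applying A to an eigenvector only scales it -- the direction is unchanged.
print(A @ v1)                  # same as eigenvalues[0] * v1
print(eigenvalues[0] * v1)

# Eigenvectors of a symmetric matrix are orthogonal: dot product is ~0.
print(np.dot(v1, v2))
```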
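And here is a hedged sketch of PCA carried out directly with SVD in NumPy (random, made-up data; no claim about any particular library's internals): center the data, take the SVD, read the principal directions off the rows of Vt, and keep the ones with the highest variance.

```python
import numpy as np

rng = np.random.default_rng(42)
# Made-up data: second column is strongly correlated with the first, third is noise.
x = rng.normal(size=200)
X = np.column_stack([x,
                     2 * x + rng.normal(scale=0.3, size=200),
                     rng.normal(size=200)])

X_centered = X - X.mean(axis=0)             # PCA works on mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Rows of Vt are the principal directions (orthonormal, i.e. mutually perpendicular).
components = Vt
explained_variance = S**2 / (len(X) - 1)    # variance captured by each component

print("Explained variance (largest first):", explained_variance)
print("Orthogonality check (~identity matrix):")
print(components @ components.T)

# Project onto the top-2 components: the dimensionality-reduction step.
X_reduced = X_centered @ components[:2].T
print(X.shape, "->", X_reduced.shape)
```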
More Reads
Happy Learning!!!