A = U × Σ × Vᵀ — breaking matrices into beautiful pieces
SVD stands for Singular Value Decomposition. It's a mathematical technique that takes any matrix and breaks it into three simpler, structured matrices. Think of it as factoring a matrix — just like how 12 = 3 × 4, any matrix can be written as U × Σ × Vᵀ.
Every SVD breaks matrix A into exactly three components. Each has a precise geometric meaning.
We'll compute the SVD of a 2×2 matrix by hand, one step at a time:

1. Form AᵀA and solve det(AᵀA − λI) = 0. The eigenvalues λᵢ tell us how much "stretch" each direction has; the singular values are σᵢ = √λᵢ.
2. Solve (AᵀA − λᵢI)vᵢ = 0 to find the eigenvectors. These become the columns of V.
3. Recover the left singular vectors as uᵢ = (1/σᵢ) × A × vᵢ.
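The by-hand procedure (eigenvalues of AᵀA, then σᵢ = √λᵢ, then uᵢ = (1/σᵢ)Avᵢ) can be checked in NumPy. A minimal sketch; the example matrix is an assumption chosen for clean eigenvalues:

```python
import numpy as np

# Hypothetical example matrix (any 2x2 works; this one gives whole-number results)
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Step 1: form AᵀA and solve det(AᵀA − λI) = 0 for its eigenvalues
AtA = A.T @ A
lam, V = np.linalg.eigh(AtA)      # eigh returns eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]    # sort descending so λ₁ ≥ λ₂

# Step 2: singular values are σᵢ = √λᵢ
sigma = np.sqrt(lam)              # here: [4., 2.]

# Step 3: left singular vectors uᵢ = (1/σᵢ) A vᵢ (done column-wise)
U = A @ V / sigma

# Verify the decomposition: A = U Σ Vᵀ
assert np.allclose(U @ np.diag(sigma) @ V.T, A)
print(sigma)
```

Because U is built as AVΣ⁻¹, the product UΣVᵀ collapses back to AVVᵀ = A exactly, which is what the final assertion checks.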
The most powerful trick in SVD: keep only the top-k singular values and throw the rest away. You get an approximation that uses far less memory — but preserves most of the information.
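Truncation is one line once you have the factors: slice U, Σ, and Vᵀ down to their first k columns/rows and multiply back. A sketch using a synthetic matrix (the rank-2-plus-noise setup is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# A nearly rank-2 matrix: rank-2 signal plus a little noise
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40)) \
    + 0.01 * rng.standard_normal((50, 40))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep only the top-k singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage: 50*40 = 2000 numbers vs k*(50 + 40 + 1) = 182
err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"relative error: {err:.4f}")
```

The relative error is tiny here because the matrix really is close to rank 2; for a generic matrix the error equals the energy in the discarded singular values.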
SVD is one of the most versatile tools in all of science and engineering. Here are 12 real, buildable applications — from AI art to finance to space exploration.
Ready-to-run examples. Copy, paste, execute.
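As a first runnable example, here is the full decomposition and reconstruction for a small rectangular matrix (the matrix itself is an arbitrary assumption):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0]])     # any 2×3 matrix

U, s, Vt = np.linalg.svd(A)        # full SVD: U is 2×2, Vt is 3×3

# np.linalg.svd returns Σ as a 1-D array; rebuild it as an m×n matrix
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# Verify A = U Σ Vᵀ
assert np.allclose(U @ Sigma @ Vt, A)
print(np.round(s, 3))              # singular values, sorted descending
```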
| Concept | Description |
|---|---|
| A = U×Σ×Vᵀ | Any matrix A decomposes into left vectors, singular values, right vectors |
| U | m×m orthonormal — columns are the left singular vectors, a basis for the output (column) space |
| Σ | m×n diagonal — singular values σ₁ ≥ σ₂ ≥ ... ≥ 0, sorted by importance |
| Vᵀ | n×n orthonormal — rows are the right singular vectors, a basis for the input (row) space |
| σᵢ = √λᵢ | Singular values are square roots of eigenvalues of AᵀA |
| Truncated SVD | Keep top-k singular values for compression and noise removal |
| Explained % | σᵢ² / Σⱼσⱼ² × 100 — percentage of total variance captured by component i |
| SVD vs PCA | PCA = SVD on mean-centered data. Same math, different framing |
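The last two rows of the table can be verified numerically: the PCA eigenvalues of mean-centered data equal σᵢ²/(n−1) from its SVD. A sketch with random data (the dataset shape is an assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))   # hypothetical data: 100 samples, 3 features

Xc = X - X.mean(axis=0)             # mean-center, as PCA requires

# PCA route: eigenvalues of the sample covariance matrix
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

# SVD route: σᵢ² / (n − 1) on the centered data
s = np.linalg.svd(Xc, compute_uv=False)
pca_vals_from_svd = s**2 / (len(Xc) - 1)

assert np.allclose(eigvals, pca_vals_from_svd)

# Explained variance per component: σᵢ² / Σⱼσⱼ² × 100
print(np.round(s**2 / np.sum(s**2) * 100, 1))
```

Same math, different framing: PCA diagonalizes the covariance matrix, while SVD factors the data matrix directly, which is usually the more numerically stable route.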