Singular value decomposition

$A = SV^{-1} = SV^{\top}$

where $V$ is the matrix of decomposition axes and $S$ is the matrix of the lengths of projections.

Any set of vectors ($A$) can be expressed in terms of the lengths of their projections ($S$) onto some set of orthogonal axes ($V$).
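As a quick sanity check, here is a minimal numpy sketch of this claim, assuming the rows of $A$ are the vectors being decomposed; the matrix used here is an arbitrary example:

```python
import numpy as np

# np.linalg.svd returns A = U diag(s) V^T, so the "matrix of projection
# lengths" in the notation above is S = U diag(s).
A = np.array([[3.0, 1.0],
              [1.0, 2.0],
              [0.0, 4.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
S = U @ np.diag(s)   # lengths of projections of A's rows onto the axes
V = Vt.T             # columns of V are the orthogonal decomposition axes

print(np.allclose(A, S @ Vt))   # True: A = S V^T
print(np.allclose(A @ V, S))    # True: projecting A onto V recovers S
```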


First, let’s name things. Recall that any pair of orthogonal vectors in two-dimensional space forms a basis for that space. In our case, let’s call the orthogonal vectors in the input space $v_1$ and $v_2$. After applying a matrix transformation $M$ to these two vectors, we get $Mv_1$ and $Mv_2$. Furthermore, let’s decompose these two transformed vectors into unit vectors, $u_1$ and $u_2$, times their respective magnitudes, $\sigma_1$ and $\sigma_2$. In other words:

$Mv_1 = u_1\sigma_1$

$Mv_2 = u_2\sigma_2$

We can consider $v_1$ and $v_2$ to be the original basis vectors, $u_1$ and $u_2$ the transformed basis vectors, and $\sigma_1$ and $\sigma_2$ the corresponding magnitudes of the new basis vectors.
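A minimal numpy sketch of these two identities, using an arbitrary $2 \times 2$ matrix as $M$; `np.linalg.svd` returns $U$, the singular values, and $V^{\top}$:

```python
import numpy as np

# Verify M v_i = u_i * sigma_i for a 2x2 example (M chosen arbitrarily).
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(M)
v1, v2 = Vt[0], Vt[1]       # right singular vectors (rows of V^T)
u1, u2 = U[:, 0], U[:, 1]   # left singular vectors (columns of U)

print(np.allclose(M @ v1, u1 * s[0]))   # True: M v1 = u1 * sigma1
print(np.allclose(M @ v2, u2 * s[1]))   # True: M v2 = u2 * sigma2
```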

But now that we have things named, we can do a little algebraic manipulation. First, note that any vector $\mathbf{x}$ can be described using the basis vectors $v_1$ and $v_2$ in the following way,

$x = (x \cdot v_1)v_1 + (x \cdot v_2)v_2$

where $a \cdot b$ denotes the dot product between vectors $a$ and $b$. If you’re unsure about the above formulation, consider a similar formulation with the standard basis vectors:

$x = x_1[1, 0] + x_2[0, 1]$

where $x_i$ denotes the $i$-th component of $x$. In Equation 3, we are projecting, via the dot product, $x$ onto $v_1$ and $v_2$ before decomposing the terms using the basis vectors $v_1$ and $v_2$.
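If the expansion still feels abstract, here is a small numpy check of it, using an arbitrary orthonormal basis (a 45-degree rotation of the standard basis) and an arbitrary vector:

```python
import numpy as np

# Check x = (x . v1) v1 + (x . v2) v2 for an orthonormal basis {v1, v2}.
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([-1.0, 1.0]) / np.sqrt(2)

x = np.array([2.0, 5.0])
x_rebuilt = (x @ v1) * v1 + (x @ v2) * v2

print(np.allclose(x, x_rebuilt))   # True
```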

Next, let’s left-multiply both sides of Equation 3 by our matrix transformation M. Since the dot product produces a scalar, we can distribute and commute M as follows:

$Mx = (x \cdot v_1)Mv_1 + (x \cdot v_2)Mv_2$

Next, we can rewrite $Mv_i$ as $u_i\sigma_i$:

$Mx = (x \cdot v_1)u_1\sigma_1 + (x \cdot v_2)u_2\sigma_2$

We’re almost there. Now since the dot product is commutative, we can switch the ordering, e.g. $x \cdot v_1 = x^{\top}v_1 = v_1^{\top}x$. And since each dot product term is a scalar, we can move each one to the end of its respective expression:

$Mx = u_1\sigma_1 v_1^{\top}x + u_2\sigma_2 v_2^{\top}x$

Since this holds for every vector $x$, we can drop $x$ from both sides:

$M = u_1\sigma_1 v_1^{\top} + u_2\sigma_2 v_2^{\top}$
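As a sketch, we can verify this sum-of-outer-products form numerically, reusing the same arbitrary $2 \times 2$ matrix as before:

```python
import numpy as np

# Rebuild M from the outer-product form M = u1 s1 v1^T + u2 s2 v2^T.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(M)
M_rebuilt = s[0] * np.outer(U[:, 0], Vt[0]) + s[1] * np.outer(U[:, 1], Vt[1])

print(np.allclose(M, M_rebuilt))   # True
```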

Given appropriately defined matrices, Equation 4 becomes SVD in its canonical form (Equation 1) for $2 \times 2$ matrices:

$$M = \begin{bmatrix} u_1 & u_2 \end{bmatrix} \begin{bmatrix} \sigma_1 & 0 \\ 0 & \sigma_2 \end{bmatrix} \begin{bmatrix} v_1^{\top} \\ v_2^{\top} \end{bmatrix} = U \Sigma V^{\top}$$

Furthermore, you should have an intuition for what this means. Any matrix transformation can be represented as a diagonal transformation (dilating, reflecting) defined by $\Sigma$ provided the domain and range are properly rotated first.
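A short numpy sketch of this rotate-stretch-rotate reading, applying the three factors one at a time (same arbitrary matrix as above):

```python
import numpy as np

# Apply M to a vector in three stages, M x = U (Sigma (V^T x)):
# rotate/reflect by V^T, scale each axis by Sigma, rotate/reflect by U.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
U, s, Vt = np.linalg.svd(M)

x = np.array([1.0, -2.0])
staged = U @ (np.diag(s) @ (Vt @ x))

print(np.allclose(M @ x, staged))   # True
```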

The vectors $u_i$ are called the left singular vectors, while the vectors $v_i$ are called the right singular vectors. This orienting terminology is a bit confusing because “left” and “right” come from the equation above, while in diagrams of the transformed rectangles and ellipses, the $v_i$ vectors are on the left.


Ref: https://gregorygundersen.com/blog/2018/12/10/svd/