Vector Projections (One Vector onto Another)
Imagine shining a flashlight on a vector v, and it casts a shadow on another vector u. That shadow is the projection of v onto u.
This projection tells us "how much" of v is pointing in the direction of u.
The formula is:
proj_u v = [(v · u) / (u · u)] * u
Step-by-Step Example:
Let's take two vectors:
v = [3, 4] and u = [1, 0] (a unit vector pointing along the x-axis).
Step 1: Compute the dot product: v · u = 3 × 1 + 4 × 0 = 3
Step 2: Compute u · u = 1 × 1 + 0 × 0 = 1
Step 3: Plug into the formula:
proj_u v = (3 / 1) * [1, 0] = [3, 0]
So the projection of v onto u is [3, 0]; it lies entirely along the x-axis.
This helps us break a vector into two parts: one along a chosen direction and one perpendicular to it.
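To make this concrete, here is a minimal NumPy sketch (the helper name project_onto is just illustrative) that reproduces the example above and also splits v into its parallel and perpendicular parts:

```python
import numpy as np

def project_onto(v, u):
    """Project v onto u using [(v . u) / (u . u)] * u. Illustrative helper, not a library API."""
    v = np.asarray(v, dtype=float)
    u = np.asarray(u, dtype=float)
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])

p = project_onto(v, u)   # part of v along u
perp = v - p             # part of v perpendicular to u

print(p)                 # [3. 0.]
print(perp)              # [0. 4.]
print(np.dot(perp, u))   # 0.0 -> the leftover part is orthogonal to u
```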
Orthogonal Projections onto Subspaces
Now let's go one level up: projecting a vector not just onto another vector, but onto a whole subspace (like a plane or a space spanned by multiple vectors).
The idea is the same: we want to find the "shadow" of a vector b onto the subspace formed by the columns of a matrix A.
The formula is:
p = A(AᵀA)⁻¹Aᵀb
where:
- b is the vector being projected
- A contains the vectors defining the subspace (as columns)
- p is the projection of b onto the subspace
What it means:
We are finding the closest vector p to b that lies in the subspace (spanned by A). The difference between b and p is completely perpendicular (orthogonal) to the subspace.
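Here is a small NumPy sketch of the same formula, assuming the columns of A are linearly independent so that AᵀA is invertible; the particular A and b are just example values:

```python
import numpy as np

# Columns of A span a plane in R^3 (assumed linearly independent).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])

# p = A (A^T A)^-1 A^T b; solve() is used instead of forming an explicit inverse.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # these are also the least-squares coefficients
p = A @ x_hat                               # projection of b onto the column space of A

print(p)               # [1. 2. 0.]
print(b - p)           # [0. 0. 3.]
print(A.T @ (b - p))   # [0. 0.] -> the residual is orthogonal to every column of A
```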
This is very useful in real-world problems where we want the best approximate solution, as in linear regression (the least squares method).
Why Are Projections Important?
- In least squares, we use projections to find the best approximate solution to overdetermined systems (more equations than unknowns).
- In graphics and gaming, projections help simulate 3D views on a 2D screen (like shadows and camera views).
- In machine learning, projections reduce dimensionality, for example via PCA (Principal Component Analysis).
- In the Gram-Schmidt process, projections are used to construct orthonormal bases (see the sketch after this list).
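As a rough illustration of that last point, here is a sketch of classical Gram-Schmidt (not a numerically robust implementation): each new vector is made orthogonal to the vectors already kept by subtracting its projections onto them, then normalized.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of vectors by repeatedly subtracting projections (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        for q in basis:
            w = w - np.dot(w, q) * q   # subtract the projection of w onto q (q has unit length)
        norm = np.linalg.norm(w)
        if norm > 1e-12:               # skip vectors that are (nearly) dependent on the current basis
            basis.append(w / norm)
    return basis

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
for q in Q:
    print(np.round(q, 3))
# Each output vector has length 1, and every pair has dot product ~0.
```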
Projections are a building block for understanding many advanced topics in linear algebra, data science, and computer science!