Projections

Published on May 12, 2025 by Aman K Sahu

Vector Projections (One Vector onto Another)

Imagine shining a flashlight on a vector v so that it casts a shadow onto another vector u. That shadow is the projection of v onto u.

This projection tells us "how much" of v is pointing in the direction of u.

👉 The formula is:
proj_u(v) = [(v · u) / (u · u)] * u

๐Ÿ” Step-by-Step Example:

Let's take two vectors:
v = [3, 4], and u = [1, 0] (this is a unit vector pointing along the x-axis).

Step 1: Compute the dot product v · u = 3 × 1 + 4 × 0 = 3
Step 2: Compute u · u = 1 × 1 + 0 × 0 = 1
Step 3: Plug into the formula:
proj_u(v) = (3 / 1) * [1, 0] = [3, 0]

✅ So the projection of v onto u is [3, 0]. It lies entirely along the x-axis.

💡 This helps us break a vector into two parts: one along a given direction and one perpendicular to it.
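
If you want to check this decomposition numerically, here is a minimal NumPy sketch (the helper name project_onto is just illustrative; the vectors match the example above):

```python
import numpy as np

def project_onto(v, u):
    """Projection of v onto u: ((v . u) / (u . u)) * u."""
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])

p = project_onto(v, u)   # part of v along u
perp = v - p             # part of v perpendicular to u

print(p)                 # [3. 0.]
print(perp)              # [0. 4.]
print(np.dot(perp, u))   # 0.0, so the leftover part really is perpendicular to u
```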

Orthogonal Projections onto Subspaces

Now let's go one level up: projecting a vector not just onto another vector, but onto a whole subspace (like a plane or space spanned by multiple vectors).

The idea is the same: we want to find the "shadow" of a vector b onto the subspace formed by the columns of a matrix A.

๐Ÿ“ The formula is:
p = A(AᵀA)⁻¹Aᵀb
where:

  • b is the vector being projected
  • A contains the vectors defining the subspace (as columns)
  • p is the projection of b onto the subspace

๐Ÿ” What it means:

We are finding the closest vector p to b that lies in the subspace spanned by the columns of A. The difference between b and p is perpendicular (orthogonal) to the subspace.
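
Here is a rough NumPy sketch of this formula (A and b are made-up example values; in practice you solve the normal equations rather than invert AᵀA explicitly):

```python
import numpy as np

# Columns of A span the subspace (here, a plane inside R^3).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 6.0])

# p = A (A^T A)^(-1) A^T b, computed by solving (A^T A) x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat

e = b - p            # error between b and its projection
print(p)             # [2. 3. 5.]
print(A.T @ e)       # ~[0. 0.]: the error is orthogonal to the subspace
```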

💡 This is super useful in real-life problems where we want the best "approximate" solution, like in linear regression (the least squares method).
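
To make that connection concrete, the short sketch below (reusing the same illustrative A and b) checks that np.linalg.lstsq lands on exactly the same projection:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 6.0])

# Projection of b onto the column space of A ...
p = A @ np.linalg.solve(A.T @ A, A.T @ b)

# ... and the least squares solution of A x = b give the same point.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x_ls, p))   # True
```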

Why Are Projections Important?

  • 📉 In least squares, we use projections to find the best approximate solution to overdetermined systems (more equations than unknowns).
  • 🎮 In graphics and gaming, projections help simulate 3D views on a 2D screen (like shadows and camera views).
  • 📊 In machine learning, projections reduce dimensions, for example with PCA (Principal Component Analysis).
  • 🧠 In the Gram-Schmidt process, projections help construct orthonormal bases (see the sketch after this list).

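To illustrate that last point, here is a minimal Gram-Schmidt sketch (the function name gram_schmidt and the input vectors are just illustrative) that repeatedly subtracts projections to produce an orthonormal basis:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(w, q) * q        # subtract the projection onto q
        basis.append(w / np.linalg.norm(w))
    return basis

q1, q2 = gram_schmidt([np.array([3.0, 4.0]), np.array([1.0, 0.0])])
print(np.dot(q1, q2))                          # ~0: the basis vectors are orthogonal
print(np.linalg.norm(q1), np.linalg.norm(q2))  # 1.0 1.0: each has unit length
```
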
🔗 Projections are a building block for understanding many advanced topics in linear algebra, data science, and computer science!
