Did you know that a set of vectors that are all orthogonal to each other is called an orthogonal set?
This means that every pair of distinct vectors in the set is perpendicular, and an orthogonal set of nonzero vectors is automatically linearly independent.
In other words, if \(S=\left\{\overrightarrow{a_{1}}, \overrightarrow{a_{2}}, \ldots, \overrightarrow{a_{n}}\right\}\) is an orthogonal set of nonzero vectors in \(\mathbb{R}^{n}\), then \(S\) is linearly independent and is therefore a basis for the subspace it spans. Conversely, an orthogonal basis for a subspace is, in particular, an orthogonal set.
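To see this fact in action, here is a minimal sketch (the vectors are my own illustrative choices, not from the lesson) that checks pairwise orthogonality with dot products and confirms linear independence via the rank of the matrix whose columns are the vectors:

```python
import numpy as np

# Three nonzero vectors in R^3, chosen so each distinct pair is orthogonal.
a1 = np.array([1.0, 0.0, 1.0])
a2 = np.array([1.0, 0.0, -1.0])
a3 = np.array([0.0, 1.0, 0.0])

# Every distinct pair has dot product zero, so {a1, a2, a3} is an orthogonal set.
print(a1 @ a2, a1 @ a3, a2 @ a3)  # 0.0 0.0 0.0

# Stack the vectors as columns; full column rank means linear independence,
# so the set is a basis for the subspace it spans (here, all of R^3).
S = np.column_stack([a1, a2, a3])
print(np.linalg.matrix_rank(S))  # 3
```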
That’s great, but why do we care? What’s so important about an orthogonal basis?
Understanding Projections
Because a key concept built on orthogonality is projection.
What pops into your head when you hear the word “projection”?
Maybe you think of a film or movie displayed on a screen. Or perhaps you imagine a weather forecast, where projections of future conditions help you make plans.
Well, in mathematics, a projection is a mapping of a set, and it acts like a shadow. The projection of a vector is understood to be the shadow cast or projected from one vector onto another.
Visualizing Projections
Using the image below, we can see the projection (shadow) of vector a onto vector \(b\).
Alright, so visually, we can see that a projection is a “shadow,” but more specifically, it is a perpendicular shadow.
Vector Decomposition with Orthogonal Projections
Let’s take vector \(\vec{y}\) and decompose it to better understand what is happening. All this means is that we can rewrite a vector as the sum of two vectors, where one is a scalar multiple of a given vector and the other is orthogonal to that vector.
What?
Given a nonzero vector \(\vec{u}\) in \(\mathbb{R}^{n}\) and any vector \(\vec{y}\) in \(\mathbb{R}^{n}\), we can write \(\vec{y}=\hat{y}+\vec{z}\), where \(\vec{z}=\vec{y}-\hat{y}\) is called the component of \(\vec{y}\) orthogonal to \(\vec{u}\), and \(\hat{y}\), pronounced "y-hat," is called the orthogonal projection of \(\vec{y}\) onto \(\vec{u}\).
In other words, if \(L=\operatorname{Span}\{\vec{u}\}\) is a line, then \(\hat{y}=\operatorname{proj}_{L} \vec{y}=\left(\frac{\vec{y} \cdot \vec{u}}{\vec{u} \cdot \vec{u}}\right) \vec{u}\)
Example of Vector Decomposition
For example, suppose \(\vec{y}=\left[\begin{array}{l}6 \\ 5\end{array}\right]\) and \(\vec{u}=\left[\begin{array}{l}3 \\ 1\end{array}\right]\). Find the orthogonal projection of \(\vec{y}\) onto \(\vec{u}\), and write \(\vec{y}\) as the sum of two orthogonal vectors, one in \(\operatorname{Span}\{\vec{u}\}\) and one orthogonal to \(\vec{u}\).
First, we will compute the dot products.
\begin{aligned}
\vec{y} \cdot \vec{u} & =\left[\begin{array}{l}
6 \\
5
\end{array}\right] \cdot\left[\begin{array}{l}
3 \\
1
\end{array}\right] \\
& =(6)(3)+(5)(1) \\
& =23 \\
\vec{u} \cdot \vec{u} & =\left[\begin{array}{l}
3 \\
1
\end{array}\right] \cdot\left[\begin{array}{l}
3 \\
1
\end{array}\right] \\
& =(3)(3)+(1)(1) \\
& =10
\end{aligned}
Next, we will substitute our values into our formula to find the orthogonal projection of \(\vec{y}\) onto \(\vec{u}\).
\begin{aligned}
\hat{y} & =\left(\frac{\vec{y} \cdot \vec{u}}{\vec{u} \cdot \vec{u}}\right) \vec{u} \\
& =\frac{23}{10}\left[\begin{array}{l}
3 \\
1
\end{array}\right] \\
& =\left[\begin{array}{c}
69 / 10 \\
23 / 10
\end{array}\right]
\end{aligned}
Now we will find the component of \(\vec{y}\) orthogonal to \(\vec{u}\).
\begin{aligned}
\vec{z} & =\vec{y}-\hat{y} \\
& =\left[\begin{array}{c}
6 \\
5
\end{array}\right]-\left[\begin{array}{c}
69 / 10 \\
23 / 10
\end{array}\right] \\
& =\left[\begin{array}{c}
-9 / 10 \\
27 / 10
\end{array}\right]
\end{aligned}
And finally, we will write the decomposition of our vector as follows.
\begin{aligned}
\vec{y} & =\hat{y}+\vec{z} \\
\left[\begin{array}{l}
6 \\
5
\end{array}\right] & =\left[\begin{array}{c}
69 / 10 \\
23 / 10
\end{array}\right]+\left[\begin{array}{c}
-9 / 10 \\
27 / 10
\end{array}\right]
\end{aligned}
See? It’s easy!
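The whole worked example can be reproduced in a few lines of NumPy, which also lets us sanity-check that \(\vec{z}\) really is orthogonal to \(\vec{u}\) and that the two pieces add back up to \(\vec{y}\):

```python
import numpy as np

# The example's vectors: project y = (6, 5) onto u = (3, 1).
y = np.array([6.0, 5.0])
u = np.array([3.0, 1.0])

# Orthogonal projection formula: y_hat = ((y . u) / (u . u)) u
y_hat = (y @ u) / (u @ u) * u    # (23/10) * (3, 1) = (69/10, 23/10)
z = y - y_hat                    # component of y orthogonal to u

print(y_hat)  # approximately [6.9, 2.3], i.e. [69/10, 23/10]
print(z)      # approximately [-0.9, 2.7], i.e. [-9/10, 27/10]

# Sanity checks: z is orthogonal to u, and y_hat + z recovers y.
print(np.isclose(z @ u, 0.0))    # True
print(np.allclose(y_hat + z, y)) # True
```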
Orthonormal Sets
And this brings us to our next big idea – orthonormal sets.
A set of vectors in which every vector has a norm (length) of one is called a normalized set. And if a set of vectors is both orthogonal and normalized, it is called an orthonormal set, because it is a collection of perpendicular unit vectors.
Additionally, matrices whose columns form an orthonormal set play a critical role in numerical algorithms and computations.
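As a quick sketch (using vectors of my own choosing), we can normalize an orthogonal pair and verify the defining property of a matrix \(Q\) with orthonormal columns, namely \(Q^{T} Q=I\):

```python
import numpy as np

# An orthogonal pair in R^2: (3)(-1) + (1)(3) = 0.
u1 = np.array([3.0, 1.0])
u2 = np.array([-1.0, 3.0])

# Normalize each vector (divide by its norm) to get unit vectors.
q1 = u1 / np.linalg.norm(u1)
q2 = u2 / np.linalg.norm(u2)

# A matrix whose columns are orthonormal satisfies Q^T Q = I.
Q = np.column_stack([q1, q2])
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```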
Next Steps
In this lesson, you will:
- Dive into the world of orthogonal sets
- Explore orthogonal projections
- Learn about orthonormal bases
- Discover important theorems and facts along the way
Jump right in and start exploring!