ELIAS EBNER

Linear Algebra


Parallel Vectors

In this lesson, I would like to build an intuitive understanding of parallel vectors and then get into a more rigorous explanation.

We immediately observe that a vector parallel to another is just a scaled version of that vector.

You can look at my lesson about scalar multiplication to find out more about β€œscaling” vectors.

With this knowledge, we can already see that if we can find some scalar $s$ such that

$$\vec{v} = s\vec{w},$$

it must be the case that $\vec{v}$ and $\vec{w}$ are parallel.
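This scalar-multiple condition is also easy to check numerically. Here is a small Python sketch; the helper `is_parallel` and its tolerance are my own illustrative choices, not part of the lesson:

```python
def is_parallel(v, w, tol=1e-9):
    """Check whether w = s * v for some scalar s (vectors given as tuples)."""
    # The zero vector is a scalar multiple (with s = 0) of every vector.
    if all(abs(vi) < tol for vi in v):
        return True
    # Solve w_i = s * v_i using the first nonzero component of v ...
    s = next(wi / vi for vi, wi in zip(v, w) if abs(vi) > tol)
    # ... and verify that the same s works for every component.
    return all(abs(wi - s * vi) < tol for vi, wi in zip(v, w))

print(is_parallel((1, 2), (3, 6)))   # True:  (3, 6) = 3 * (1, 2)
print(is_parallel((1, 2), (3, 5)))   # False: no single scalar works
```

Note that nothing here is specific to two dimensions; the same check works for tuples of any length.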

If you are interested, I also have an explanation which goes a little deeper.

Algebraic Approach

At the time of writing, I haven’t yet created a course about trigonometry, so I don’t have any lessons to refer you to. For now, I will just assume some basic knowledge of trigonometry.

Consider the following setup: $\vec{v}$ and $\vec{w}$ both point in the same direction, making some angle $\alpha$ with the horizontal axis. From trigonometry, we can say that

$$\begin{cases} v_1 = \lVert \vec{v} \rVert \, \cos\alpha \\ v_2 = \lVert \vec{v} \rVert \, \sin\alpha, \end{cases}$$

and the same for $\vec{w}$:

$$\begin{cases} w_1 = \lVert \vec{w} \rVert \, \cos\alpha \\ w_2 = \lVert \vec{w} \rVert \, \sin\alpha. \end{cases}$$

If you don’t know what $\lVert \vec{x} \rVert$ means, you can read through my lesson on the length of a vector.

From the equations above, we know that

$$\begin{cases} \cos\alpha = \dfrac{v_1}{\lVert \vec{v} \rVert} \\[1em] \sin\alpha = \dfrac{v_2}{\lVert \vec{v} \rVert}, \end{cases}$$

and we can substitute these into the other two equations:

$$\begin{cases} w_1 = \lVert \vec{w} \rVert \, \cos\alpha \\ w_2 = \lVert \vec{w} \rVert \, \sin\alpha \end{cases}
\;\Longrightarrow\;
\begin{cases} w_1 = \lVert \vec{w} \rVert \dfrac{v_1}{\lVert \vec{v} \rVert} \\[1em] w_2 = \lVert \vec{w} \rVert \dfrac{v_2}{\lVert \vec{v} \rVert} \end{cases}
\;\Longrightarrow\;
\begin{cases} w_1 = \dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert} v_1 \\[1em] w_2 = \dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert} v_2. \end{cases}$$

Therefore, according to the definition of scalar multiplication, our vector $\vec{w}$ is really just

$$\begin{align*} \vec{w} &= \begin{bmatrix} \dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert} v_1 \\[1em] \dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert} v_2 \end{bmatrix} \\ &= \dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert} \vec{v}. \end{align*}$$

If we view $\dfrac{\lVert \vec{w} \rVert}{\lVert \vec{v} \rVert}$ as some scalar $s$, this means that two vectors $\vec{v}$ and $\vec{w}$ are parallel if and only if

$$\vec{w} = s\vec{v},$$

or equivalently

$$\vec{v} = \dfrac{1}{s}\vec{w},$$

where $\dfrac{1}{s}$ is still a scalar.
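We can also watch this scalar appear numerically: for two vectors that point the same way, the ratio of their lengths recovers $s$. A quick sketch with vectors of my own choosing (the same-direction assumption from the derivation matters here, since $\lVert \vec{w} \rVert / \lVert \vec{v} \rVert$ is always positive):

```python
import math

def norm(u):
    """Euclidean length of a vector given as a tuple."""
    return math.sqrt(sum(x * x for x in u))

v = (3.0, 4.0)
w = (7.5, 10.0)                  # w points the same way as v

s = norm(w) / norm(v)            # the scalar from the derivation: ||w|| / ||v||
print(s)                         # 2.5
print(tuple(s * x for x in v))   # (7.5, 10.0), which is exactly w
```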

Does that look familiar?

This is not a rigorous proof, just a quick algebraic demonstration. In fact, the idea that one vector is parallel to another exactly when it is a scaled version of the other is how parallelism is defined in linear algebra.

This also extends to higher dimensions.

Zero Vector

According to this definition, the zero vector is parallel (as well as orthogonal) to all other vectors, since for every vector $\vec{v}$

$$\vec{0} = 0\vec{v}.$$