Lecture 2: Operations with Matrices and Vectors

Instructions

• This section covers the concepts listed below. You can click the button to go directly to that topic.
• After each video, there are notes for the material in the video.
• When you have finished the material below, you can go to the next section or return to the main page.
Motivation from System of Equations

To see the full video page and find related videos, click the following link.

Video Notes

In the last lecture, we started with the equation:
\begin{align*}
c_1\begin{pmatrix}1\\1\\1\end{pmatrix}
+ c_2\begin{pmatrix}1\\0\\-1\end{pmatrix}
+ c_3\begin{pmatrix}0\\1\\-1\end{pmatrix}
=
\begin{pmatrix}
2\\5\\2
\end{pmatrix}.
\end{align*}
Forming linear combinations is a fundamental operation on vectors, and a special notation has been developed to express it more compactly and to allow us to study properties of the vectors being combined. This is matrix–vector multiplication. We express the left-hand side of the above equation as the matrix–vector product:
\begin{align*}
\begin{pmatrix}
1 & 1 & 0\\
1 & 0 & 1\\
1 & -1 & -1
\end{pmatrix}
\begin{pmatrix}
c_1\\c_2\\c_3
\end{pmatrix}.
\end{align*}
Observe that the columns of the matrix are the vectors we are taking the linear combination of.

Matrices

Note: This "Matrices" section does not have a video, but only gives a few basic definitions.

An $$m\times n$$ matrix is a "box" of numbers with $$m$$ rows and $$n$$ columns. For example, the matrix below is a $$2\times 2$$ matrix:
\begin{align*}
\begin{pmatrix}
1 & -4 \\ 5 & 0
\end{pmatrix}.
\end{align*}
And the matrix below is a $$3\times 3$$ matrix:
\begin{align*}
\begin{pmatrix}
1 & -4 & 6\\ 5 & 0 & 6\\ 2 & 1 & -1
\end{pmatrix}.
\end{align*}
Matrices can be "rectangular" (e.g. the one below is $$2\times 3$$):
\begin{align*}
\begin{pmatrix}
1 & -2 & 8\\ 0 & 1 & 0
\end{pmatrix}
\end{align*}
But in ODEs it is square matrices that arise, so those are the ones we will deal with here.

A better way to think about an $$n\times n$$ matrix (at least for our purposes) is to think of it as $$n$$ vectors in $$\mathbb{R}^n$$. So, the matrix:
\begin{align*}
\begin{pmatrix}
1 & -4 & 6\\ 5 & 0 & 6\\ 2 & 1 & -1
\end{pmatrix}
\end{align*}
can be thought of as an arrangement of the vectors:
\begin{align*}
\begin{pmatrix}1\\5\\2\end{pmatrix},
\begin{pmatrix}-4\\0\\1\end{pmatrix},
\begin{pmatrix}6\\6\\-1\end{pmatrix}.
\end{align*}
Note: order matters.

Similarly, if:
\begin{align*}
\boldsymbol{v}_1 = \begin{pmatrix}1\\5\end{pmatrix}
\hspace{.5in}
\boldsymbol{v}_2 = \begin{pmatrix}-4\\0\end{pmatrix}
\end{align*}
then the matrix $$\begin{pmatrix}\boldsymbol{v}_1 & \boldsymbol{v}_2\end{pmatrix}$$ means:
\begin{align*}
\begin{pmatrix}
1 & -4\\5 & 0
\end{pmatrix}.
\end{align*}
Returning to the equation in the beginning, we write the equation as the matrix–vector equation:
\begin{align*}
\begin{pmatrix}
1 & 1 & 0\\
1 & 0 & 1\\
1 & -1 & -1
\end{pmatrix}
\begin{pmatrix}
c_1\\c_2\\c_3
\end{pmatrix}
=\begin{pmatrix}2\\5\\2\end{pmatrix}.
\end{align*}
Matrix-Vector Multiplication

To see the full video page and find related videos, click the following link.

Video Notes

In general, if $$\boldsymbol{w}=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}$$ is a vector in $$\mathbb{R}^n$$ and $$M=\begin{pmatrix}\boldsymbol{v}_1 & \cdots & \boldsymbol{v}_n\end{pmatrix}$$ is an $$n\times n$$ matrix, then:
\begin{align*}
M\boldsymbol{w}
= a_1 \boldsymbol{v}_1 + \cdots + a_n \boldsymbol{v}_n.
\end{align*}
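The definition above translates directly into code. Here is a minimal Python sketch (the function name `mat_vec` and the list-of-columns representation are our own choices for illustration, not part of the lecture):

```python
def mat_vec(columns, w):
    """Multiply a matrix by a vector, with the matrix stored as a list
    of its columns: M w = a_1 v_1 + ... + a_n v_n."""
    n = len(columns[0])
    result = [0] * n
    for a, col in zip(w, columns):
        for i in range(n):
            result[i] += a * col[i]
    return result

# Columns (1, 4) and (1, 1) form the matrix ((1, 1), (4, 1));
# applying it to (2, 4) takes 2 times the first column plus 4 times the second.
print(mat_vec([[1, 4], [1, 1]], [2, 4]))  # [6, 12]
```

The same function handles any square size, since it just forms the linear combination of the columns.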
Example. Compute
\begin{align*}
\begin{pmatrix}
1 & 1\\ 4 & 1
\end{pmatrix}
\begin{pmatrix}2\\4\end{pmatrix}.
\end{align*}
Solution.
\begin{align*}
\begin{pmatrix}
1 & 1\\ 4 & 1
\end{pmatrix}
\begin{pmatrix}2\\4\end{pmatrix}
&= 2 \begin{pmatrix}1\\4\end{pmatrix}
+ 4 \begin{pmatrix}1\\1\end{pmatrix}\\[4mm]
&= \begin{pmatrix}2\\8\end{pmatrix}
+ \begin{pmatrix}4\\4\end{pmatrix} \\[4mm]
&= \begin{pmatrix}6\\12\end{pmatrix}.
\end{align*}
Example. Compute
\begin{align*}
\begin{pmatrix}
1 & 5 & -1 & 2\\
1 & -1 & 1 & 5\\
2 & 1 & 1 & 1\\
2 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix}
1\\0\\1\\5
\end{pmatrix}.
\end{align*}
Solution.
\begin{align*}
\begin{pmatrix}
1 & 5 & -1 & 2\\
1 & -1 & 1 & 5\\
2 & 1 & 1 & 1\\
2 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix}
1\\0\\1\\5
\end{pmatrix}
&=
1\begin{pmatrix}1\\1\\2\\2\end{pmatrix}
+ 0\begin{pmatrix}5\\-1\\1\\0\end{pmatrix}
+ 1\begin{pmatrix}-1\\1\\1\\0\end{pmatrix}
+ 5\begin{pmatrix}2\\5\\1\\1\end{pmatrix}
\\[4mm]&=\begin{pmatrix}1\\1\\2\\2\end{pmatrix}
+ \begin{pmatrix}-1\\1\\1\\0\end{pmatrix}
+ \begin{pmatrix}10\\25\\5\\5\end{pmatrix}
\\[4mm]&= \begin{pmatrix}10\\27\\8\\7\end{pmatrix}.
\end{align*}
Eigenvectors

To see the full video page and find related videos, click the following link.

Video Notes

If $$M$$ is an $$n\times n$$ matrix, then a number $$\lambda$$ is called an eigenvalue of $$M$$ if there is a non–zero vector $$\boldsymbol{v}$$ that satisfies:
\begin{align*}
M\boldsymbol{v} = \lambda \boldsymbol{v}.
\end{align*}
The vector $$\boldsymbol{v}$$ is called an eigenvector of $$M$$ corresponding to $$\lambda$$.

Example. Find eigenvalues and eigenvectors of the matrix:
\begin{align*}
\begin{pmatrix}
0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0
\end{pmatrix}.
\end{align*}
We have not yet learned how to find eigenvalues and eigenvectors, but we can verify candidates directly. Observe:
\begin{align*}
\begin{pmatrix}
0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0
\end{pmatrix}
\begin{pmatrix}1\\1\\1\end{pmatrix}
=\begin{pmatrix}2\\2\\2\end{pmatrix}
= 2\begin{pmatrix}1\\1\\1\end{pmatrix}.
\end{align*}
So $$\begin{pmatrix}1\\1\\1\end{pmatrix}$$ is an eigenvector with $$2$$ as the eigenvalue.
\begin{align*}
\begin{pmatrix}
0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0
\end{pmatrix}
\begin{pmatrix}1\\0\\-1\end{pmatrix}
=\begin{pmatrix}-1\\0\\1\end{pmatrix}
= -1\begin{pmatrix}1\\0\\-1\end{pmatrix}.
\end{align*}
So $$\begin{pmatrix}1\\0\\-1\end{pmatrix}$$ is an eigenvector with $$-1$$ as the eigenvalue.
\begin{align*}
\begin{pmatrix}
0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0
\end{pmatrix}
\begin{pmatrix}0\\1\\-1\end{pmatrix}
=\begin{pmatrix}0\\-1\\1\end{pmatrix}
= -1\begin{pmatrix}0\\1\\-1\end{pmatrix}.
\end{align*}
So $$\begin{pmatrix}0\\1\\-1\end{pmatrix}$$ is an eigenvector with $$-1$$ as the eigenvalue.
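These checks are easy to automate. The sketch below (our own illustration; the helper name `is_eigenvector` is not from the lecture) verifies $$M\boldsymbol{v}=\lambda\boldsymbol{v}$$ directly for the three candidates above:

```python
def mat_vec(columns, w):
    # Matrix-vector product with the matrix stored as a list of columns.
    n = len(columns[0])
    out = [0] * n
    for a, col in zip(w, columns):
        for i in range(n):
            out[i] += a * col[i]
    return out

def is_eigenvector(columns, v, lam):
    """True when v is nonzero and M v equals lam * v exactly."""
    nonzero = any(x != 0 for x in v)
    return nonzero and mat_vec(columns, v) == [lam * x for x in v]

# The matrix above is symmetric, so its rows and columns agree.
M = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(is_eigenvector(M, [1, 1, 1], 2))    # True
print(is_eigenvector(M, [1, 0, -1], -1))  # True
print(is_eigenvector(M, [0, 1, -1], -1))  # True
```

Integer entries keep the comparison exact, so no floating-point tolerance is needed here.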
Algebra and Calculus with Vectors and Matrices

To see the full video page and find related videos, click the following link.

Video Notes

If two matrices $$V$$ and $$W$$ have the same dimensions, we can define their sum in the following way. If $$V = \begin{pmatrix}\boldsymbol{v}_1 & \cdots & \boldsymbol{v}_n\end{pmatrix}$$ and $$W = \begin{pmatrix}\boldsymbol{w}_1 & \cdots & \boldsymbol{w}_n\end{pmatrix}$$, then their sum is:

\begin{align*}
V+W
= \begin{pmatrix}\boldsymbol{v}_1 + \boldsymbol{w}_1 & \cdots & \boldsymbol{v}_n + \boldsymbol{w}_n \end{pmatrix}.
\end{align*}
That is, we just add the corresponding entries.

Example. Compute
\begin{align*}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}
+
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}.
\end{align*}

Solution.
\begin{align*}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}
+
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}
=
\begin{pmatrix}
3 & 2\\ 1 & 12
\end{pmatrix}.
\end{align*}
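As a quick sketch (our own illustration, storing each matrix as a list of its columns), entrywise addition looks like:

```python
def mat_add(V, W):
    """Entrywise sum of two same-size matrices stored as lists of columns."""
    return [[a + b for a, b in zip(col_v, col_w)]
            for col_v, col_w in zip(V, W)]

# The example above: columns (1, -1), (2, 7) plus columns (2, 2), (0, 5).
print(mat_add([[1, -1], [2, 7]], [[2, 2], [0, 5]]))  # [[3, 1], [2, 12]]
```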
In addition to defining multiplication of a matrix and a vector, we can also define multiplication of two matrices. Let $$V = \begin{pmatrix}\boldsymbol{v}_1 & \cdots & \boldsymbol{v}_n\end{pmatrix}$$ and $$W = \begin{pmatrix}\boldsymbol{w}_1 & \cdots & \boldsymbol{w}_n\end{pmatrix}$$, where all the columns are vectors in $$\mathbb{R}^n$$ (note: the product of two matrices with different dimensions is also important, but since we're only working with square matrices in this class, that's all we will look at for now). Then the product $$VW$$ is defined by:
\begin{align*}
VW
=\begin{pmatrix}
V\boldsymbol{w}_1 & \cdots & V\boldsymbol{w}_n
\end{pmatrix}.
\end{align*}
That is, $$V$$ multiplies each column of $$W$$.
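In code, the definition says exactly this: apply $$V$$ to each column of $$W$$. A short Python sketch (the names `mat_vec` and `mat_mul` and the list-of-columns storage are our own):

```python
def mat_vec(columns, w):
    # M w as the linear combination of M's columns with weights from w.
    n = len(columns[0])
    out = [0] * n
    for a, col in zip(w, columns):
        for i in range(n):
            out[i] += a * col[i]
    return out

def mat_mul(V, W):
    """The j-th column of VW is V applied to the j-th column of W."""
    return [mat_vec(V, col) for col in W]

V = [[1, -1], [2, 7]]  # columns (1, -1) and (2, 7)
W = [[2, 2], [0, 5]]   # columns (2, 2) and (0, 5)
print(mat_mul(V, W))   # [[6, 12], [10, 35]]
print(mat_mul(W, V))   # [[2, -3], [4, 39]]
```

Note that the two orders already give different answers for this pair of matrices.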
Example. Compute
\begin{align*}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}.
\end{align*}
Solution. Writing $$V$$ for the first matrix:
\begin{align*}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}
=
\begin{pmatrix}V\begin{pmatrix}2\\2\end{pmatrix} & V\begin{pmatrix}0\\5\end{pmatrix}\end{pmatrix}
= \begin{pmatrix}
6 & 10\\ 12 & 35
\end{pmatrix}.
\end{align*}
Example. Compute
\begin{align*}
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}.
\end{align*}
Solution. Writing $$W$$ for the first matrix:
\begin{align*}
\begin{pmatrix}
2 & 0 \\ 2 & 5
\end{pmatrix}
\begin{pmatrix}
1 & 2\\ -1 & 7
\end{pmatrix}
= \begin{pmatrix}W\begin{pmatrix}1\\-1\end{pmatrix} & W\begin{pmatrix}2\\7\end{pmatrix} \end{pmatrix}
= \begin{pmatrix}2 & 4\\ -3 & 39\end{pmatrix}.
\end{align*}
Note that this differs from the result of the previous example: matrix multiplication is not commutative, so in general $$VW \neq WV$$.

To see the full video page and find related videos, click the following link.

Video Notes

Matrix–vector and matrix–matrix multiplication satisfy many familiar algebraic properties that scalar–scalar multiplication satisfies.

First, working with matrix–vector multiplication, we have the following. Let $$M$$ be an $$n\times n$$ matrix, let $$\boldsymbol{v}$$ and $$\boldsymbol{w}$$ be vectors in $$\mathbb{R}^n$$, and let $$c$$ be a scalar. Then:
\begin{align*}
M(c\boldsymbol{v}) &= cM\boldsymbol{v}\\
M(\boldsymbol{v} + \boldsymbol{w}) &= M\boldsymbol{v} + M\boldsymbol{w}.
\end{align*}

To see the full video page and find related videos, click the following link.

Video Notes

If $$M$$, $$N$$, and $$R$$ are $$n\times n$$ matrices, $$\boldsymbol{v}$$ and $$\boldsymbol{w}$$ are in $$\mathbb{R}^n$$, and $$c$$ and $$d$$ are scalars, then:
\begin{align*}
(M + N) \boldsymbol{v} &= M\boldsymbol{v} + N\boldsymbol{v}\\
M(\boldsymbol{v} + \boldsymbol{w}) &= M\boldsymbol{v} + M\boldsymbol{w}\\
M(N\boldsymbol{v}) &= (MN)\boldsymbol{v} = MN\boldsymbol{v}\\
(cM)\boldsymbol{v} &= c(M\boldsymbol{v})\\
M+N &= N+M\\
(M+N)R &= MR + NR\\
M(N+R) &= MN + MR\\
M(NR) &= (MN)R\\
(c+d)M &= cM + dM\\
c(M+N) &= cM + cN.
\end{align*}
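These identities can be spot-checked numerically. The sketch below (all helper names are our own; integer entries keep the arithmetic exact) verifies a few of them on a concrete example:

```python
def mat_vec(cols, w):
    # M w as the linear combination of M's columns.
    n = len(cols[0])
    out = [0] * n
    for a, col in zip(w, cols):
        for i in range(n):
            out[i] += a * col[i]
    return out

def mat_mul(V, W):
    return [mat_vec(V, col) for col in W]

def mat_add(V, W):
    return [[a + b for a, b in zip(cv, cw)] for cv, cw in zip(V, W)]

def vec_add(v, w):
    return [a + b for a, b in zip(v, w)]

M = [[1, 4], [1, 1]]   # matrices stored as lists of columns
N = [[2, 2], [0, 5]]
v, w = [2, 4], [-1, 3]

# (M + N)v = Mv + Nv
assert mat_vec(mat_add(M, N), v) == vec_add(mat_vec(M, v), mat_vec(N, v))
# M(v + w) = Mv + Mw
assert mat_vec(M, vec_add(v, w)) == vec_add(mat_vec(M, v), mat_vec(M, w))
# M(Nv) = (MN)v
assert mat_vec(mat_mul(M, N), v) == mat_vec(M, mat_vec(N, v))
# M + N = N + M
assert mat_add(M, N) == mat_add(N, M)
print("all four identities hold on this example")
```

A check on one example is not a proof, of course, but it is a useful way to catch errors in hand computations.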

To see the full video page and find related videos, click the following link.

Video Notes

There are two "special" scalars: $$0$$ and $$1$$. These are special because they are what we call "identities" for
addition and multiplication, respectively. This means that:
\begin{align*}
a+0 &= a\\
a\times 1 &= a
\end{align*}
for all real $$a$$. In addition, $$0$$ has a special property relating to multiplication:
\begin{align*}
0a = 0
\end{align*}
for all real $$a$$. There are similarly two special $$n\times n$$ matrices. One is called the identity, $$I$$ and one is called the
zero matrix, $$\boldsymbol{0}$$. They are:
\begin{align*}
I = \begin{pmatrix}
1 & 0 & \cdots & 0\\
0 & 1 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots\\
0 & 0 & \cdots & 1
\end{pmatrix}
\hspace{.5in}
\boldsymbol{0} = \begin{pmatrix}
0 & 0 & \cdots & 0\\
0 & 0 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots\\
0 & 0 & \cdots & 0
\end{pmatrix}
\end{align*}
$$I$$ is the multiplicative identity and satisfies
\begin{align*}
I\boldsymbol{v} = \boldsymbol{v},
\end{align*}
for any $$\boldsymbol{v}$$ in $$\mathbb{R}^n$$. And for all $$n\times n$$ matrices $$M$$:
\begin{align*}
MI = IM = M.
\end{align*}
To understand the identity, begin with the $$I\boldsymbol{v}$$ claim. If $$\boldsymbol{v}=\begin{pmatrix}x_1\\\vdots\\x_n\end{pmatrix}$$ then:
\begin{align*}
I\boldsymbol{v}
= x_1\begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix}
+ x_2\begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix}
+ \cdots
+ x_n\begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix}
= \begin{pmatrix}x_1\\x_2\\\vdots\\x_n\end{pmatrix}
=\boldsymbol{v}.
\end{align*}
And then since matrix–matrix multiplication is defined in terms of matrix–vector multiplication, this implies the
claim $$IM = M$$.

And $$\boldsymbol{0}$$ is the additive identity and satisfies:
\begin{align*}
M + \boldsymbol{0} = \boldsymbol{0} + M = M
\end{align*}
for all $$n\times n$$ matrices $$M$$. In addition, it satisfies:
\begin{align*}
\boldsymbol{0}\boldsymbol{v} = \boldsymbol{0}
\end{align*}
for all $$\boldsymbol{v}$$ in $$\mathbb{R}^n$$ and
\begin{align*}
\boldsymbol{0}M = M\boldsymbol{0} = \boldsymbol{0}
\end{align*}
for all $$n\times n$$ matrices $$M$$.
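Both special matrices are easy to build and check in code. A small sketch (the helper names are our own choices):

```python
def mat_vec(columns, w):
    # M w as the linear combination of M's columns.
    n = len(columns[0])
    out = [0] * n
    for a, col in zip(w, columns):
        for i in range(n):
            out[i] += a * col[i]
    return out

def identity(n):
    """The n x n identity: its columns are the standard basis vectors."""
    return [[1 if i == j else 0 for i in range(n)] for j in range(n)]

def zero(n):
    """The n x n zero matrix."""
    return [[0] * n for _ in range(n)]

v = [7, -2, 5]
print(mat_vec(identity(3), v) == v)      # True: I v = v
print(mat_vec(zero(3), v) == [0, 0, 0])  # True: 0 v = 0
```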

To see the full video page and find related videos, click the following link.

Video Notes

Often, we will be dealing with matrices and vectors whose entries are functions of some variable, say $$t$$. Then it makes sense to define integrals and derivatives of such objects. If
\begin{align*}
\boldsymbol{x}(t)
= \begin{pmatrix}x_1(t)\\\vdots\\x_n(t)\end{pmatrix}
\end{align*}
is a vector–valued function, then we define its derivative as follows:
\begin{align*}
\boldsymbol{x}'(t)
=\lim_{h\to 0}\frac{1}{h}(\boldsymbol{x}(t+h) - \boldsymbol{x}(t))
= \begin{pmatrix}\lim_{h\to 0}\frac{x_1(t+h) - x_1(t)}{h}\\\vdots\\\lim_{h\to 0}\frac{x_n(t+h) - x_n(t)}{h}\end{pmatrix}
= \begin{pmatrix}x_1'(t)\\\vdots\\x_n'(t)\end{pmatrix}.
\end{align*}
And its integral is defined by:
\begin{align*}
\int_{t=a}^{b}\boldsymbol{x}(t)dt
=\begin{pmatrix}\int_{t=a}^{b}x_1(t)dt\\\vdots\\\int_{t=a}^{b}x_n(t)dt\end{pmatrix}.
\end{align*}

Similarly, if $$\boldsymbol{M}(t) = \begin{pmatrix}\boldsymbol{v}_1(t)&\cdots&\boldsymbol{v}_n(t)\end{pmatrix}$$ is a matrix–valued function then:
\begin{align*}
\boldsymbol{M}'(t)
&= \begin{pmatrix}\boldsymbol{v}_1'(t)&\cdots&\boldsymbol{v}_n'(t)\end{pmatrix}\\[4mm]
\int_{t=a}^{b}\boldsymbol{M}(t)dt
&=
\begin{pmatrix}\int_{t=a}^{b}\boldsymbol{v}_1(t)dt&\cdots&\int_{t=a}^{b}\boldsymbol{v}_n(t)dt\end{pmatrix}.
\end{align*}

Example. Compute $$\int_{t=0}^{\pi}\boldsymbol{v}(t)dt$$ and  $$\boldsymbol{M}'(t)$$ if:
\begin{align*}
\boldsymbol{v}(t) = \begin{pmatrix} \cos t\\ \sin t\end{pmatrix}
\hspace{1in}
\boldsymbol{M}(t) = \begin{pmatrix} t & 1\\ t & \sin t\end{pmatrix}.
\end{align*}

Solution. For the integral of the vector:
\begin{align*}
\int_{t=0}^{\pi}\boldsymbol{v}(t)dt
=\begin{pmatrix}\int_{t=0}^{\pi}\cos t\,dt\\ \int_{t=0}^{\pi}\sin t\,dt\end{pmatrix}
= \begin{pmatrix}0 \\ 2\end{pmatrix}.
\end{align*}

For the derivative of the matrix:
\begin{align*}
\boldsymbol{M}'(t)
=\begin{pmatrix} 1 & 0 \\ 1 & \cos t\end{pmatrix}.
\end{align*}
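The componentwise definitions can be checked numerically. Below is a rough Python sketch (midpoint rule and central differences; the function names and step sizes are our own choices) confirming the integral above and differentiating $$\boldsymbol{v}(t)$$ at $$t=0$$:

```python
import math

def integrate(f, a, b, steps=100_000):
    """Componentwise midpoint-rule integral of a vector-valued f(t)."""
    h = (b - a) / steps
    total = [0.0] * len(f(a))
    for k in range(steps):
        val = f(a + (k + 0.5) * h)
        for i, x in enumerate(val):
            total[i] += x * h
    return total

def derivative(f, t, h=1e-6):
    """Componentwise central-difference derivative of f at t."""
    return [(p - m) / (2 * h) for p, m in zip(f(t + h), f(t - h))]

v = lambda t: [math.cos(t), math.sin(t)]

result = integrate(v, 0.0, math.pi)
print(abs(result[0]) < 1e-6, abs(result[1] - 2.0) < 1e-6)  # True True

# v'(0) should be (-sin 0, cos 0) = (0, 1).
d0 = derivative(v, 0.0)
print(abs(d0[0]) < 1e-6, abs(d0[1] - 1.0) < 1e-6)          # True True
```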

To see the full video page and find related videos, click the following link.

Video Notes

There are also similar differentiation rules for matrix–vector multiplication. If $$c(t)$$ is a scalar–valued function and $$\boldsymbol{v}(t)=\begin{pmatrix}x_1(t)\\\vdots\\x_n(t)\end{pmatrix}$$ is a vector–valued function then the product rule for scalar–valued functions gives:
\begin{align*}
(c(t)\boldsymbol{v}(t))'
= \begin{pmatrix}(c(t)x_1(t))'\\\vdots\\(c(t)x_n(t))'\end{pmatrix}
= \begin{pmatrix}c'(t)x_1(t) + c(t)x_1'(t)\\\vdots\\c'(t)x_n(t) + c(t)x_n'(t)\end{pmatrix}
= c'(t)\boldsymbol{v}(t) + c(t)\boldsymbol{v}'(t).
\end{align*}
If $$M(t)=\begin{pmatrix}\boldsymbol{w}_1(t)&\cdots &\boldsymbol{w}_n(t)\end{pmatrix}$$ is an $$n\times n$$ matrix–valued function and $$\boldsymbol{v}(t)=\begin{pmatrix}x_1(t)\\\vdots\\x_n(t)\end{pmatrix}$$ is an $$\mathbb{R}^n$$ vector–valued function, then we can use this identity to arrive at:
\begin{align*}
(M(t)\boldsymbol{v}(t))'
&=\left(x_1(t) \boldsymbol{w}_1(t) + \cdots +x_n(t)\boldsymbol{w}_n(t)\right)'
\\&=\left(x_1(t)\boldsymbol{w}_1(t)\right)' + \cdots + \left(x_n(t)\boldsymbol{w}_n(t)\right)'
\\&=x_1'(t)\boldsymbol{w}_1(t) + x_1(t)\boldsymbol{w}_1'(t) + \cdots + x_n'(t)\boldsymbol{w}_n(t) + x_n(t)\boldsymbol{w}_n'(t)
\\&= M(t)\boldsymbol{v}'(t) + M'(t) \boldsymbol{v}(t).
\end{align*}
Finally, the product rule for matrix–vector multiplication can be extended to matrix–matrix multiplication. If $$N(t) = \begin{pmatrix}\boldsymbol{u}_1(t)&\cdots&\boldsymbol{u}_n(t)\end{pmatrix}$$ then:
\begin{align*}
(M(t)N(t))'
&=\begin{pmatrix}(M(t)\boldsymbol{u}_1(t))'&\cdots &(M(t)\boldsymbol{u}_n(t))'\end{pmatrix}
\\&=\begin{pmatrix}M'(t)\boldsymbol{u}_1(t) + M(t)\boldsymbol{u}_1'(t)&\cdots&M'(t)\boldsymbol{u}_n(t) + M(t)\boldsymbol{u}_n'(t)\end{pmatrix}
\\&=M'(t)N(t) + M(t)N'(t).
\end{align*}
Similar computations show that if $$M$$ and $$N$$ are $$n\times n$$ matrix–valued functions and $$\boldsymbol{v}$$ is a vector–valued function, then:
\begin{align*}
\left((M(t) + N(t))\boldsymbol{v}(t)\right)' = (M(t)\boldsymbol{v}(t))' + (N(t)\boldsymbol{v}(t))'
\end{align*}
and
\begin{align*}
(M(t) + N(t))' = M'(t) + N'(t).
\end{align*}
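The matrix product rule can also be sanity-checked numerically. In the sketch below (entirely our own construction; $$N(t)$$ is an arbitrary test matrix, not from the lecture), matrices are stored as lists of columns and derivatives are taken by central differences:

```python
import math

def mat_vec(cols, w):
    # M w as the linear combination of M's columns.
    n = len(cols[0])
    out = [0] * n
    for a, col in zip(w, cols):
        for i in range(n):
            out[i] += a * col[i]
    return out

def mat_mul(V, W):
    return [mat_vec(V, col) for col in W]

def mat_add(V, W):
    return [[a + b for a, b in zip(cv, cw)] for cv, cw in zip(V, W)]

def d(F, t, h=1e-6):
    """Entrywise central-difference derivative of a matrix-valued F(t)."""
    Fp, Fm = F(t + h), F(t - h)
    return [[(p - m) / (2 * h) for p, m in zip(cp, cm)]
            for cp, cm in zip(Fp, Fm)]

M = lambda t: [[t, t], [1.0, math.sin(t)]]      # the example matrix M(t) above
N = lambda t: [[math.exp(t), 0.0], [t, t * t]]  # arbitrary test matrix
MN = lambda t: mat_mul(M(t), N(t))

t = 0.7
lhs = d(MN, t)                                                 # (MN)'
rhs = mat_add(mat_mul(d(M, t), N(t)), mat_mul(M(t), d(N, t)))  # M'N + MN'
max_err = max(abs(a - b) for ca, cb in zip(lhs, rhs)
              for a, b in zip(ca, cb))
print(max_err < 1e-4)  # True
```

Both sides agree up to the accuracy of the finite differences, as the product rule predicts.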