Sean Fitzpatrick
University of Lethbridge
Let \(A = \bbm 3 \amp -1 \amp 2\\-1\amp 4\amp 0\ebm, \vec{v} = \bbm 1\\0\\-3\ebm, \vec{w}=\bbm -2\\1\\4\ebm\text{.}\)
Confirm that \(A(2\vec{v}+\vec{w}) = 2(A\vec{v})+A\vec{w}\text{.}\)
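One way to check, following the definitions above: first \(2\vec{v}+\vec{w} = \bbm 0\\1\\-2\ebm\text{,}\) so \(A(2\vec{v}+\vec{w}) = \bbm 3(0)-1(1)+2(-2)\\-1(0)+4(1)+0(-2)\ebm = \bbm -5\\4\ebm\text{.}\) On the other hand, \(A\vec{v} = \bbm -3\\-1\ebm\) and \(A\vec{w} = \bbm 1\\6\ebm\text{,}\) so \(2(A\vec{v})+A\vec{w} = \bbm -6\\-2\ebm + \bbm 1\\6\ebm = \bbm -5\\4\ebm\) as well.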
An \(m\times n\) matrix is a rectangular array of numbers with \(m\) rows and \(n\) columns.
A row vector is a \(1\times k\) matrix \(R = \bbm a_1\amp a_2\amp \cdots \amp a_k\ebm\) consisting of a single row.
A column vector is a \(k\times 1\) matrix \(C = \bbm b_1\\b_2\\ \vdots\\b_k\ebm\) consisting of a single column.
The product \(RC\) is a dot product: \(RC = a_1b_1+a_2b_2+\cdots + a_kb_k\text{.}\)
(\(R\) and \(C\) must have the same number of entries)
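For instance, with a row and column chosen just for illustration: if \(R = \bbm 2\amp -1\amp 3\ebm\) and \(C = \bbm 4\\0\\-2\ebm\text{,}\) then \(RC = 2(4)+(-1)(0)+3(-2) = 2\text{.}\)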
To compute \(A\vec{x}\) where \(A\) is \(m\times n\) and \(\vec{x}\) is \(n\times 1\text{,}\) multiply each row in \(A\) by \(\vec{x}\text{.}\)
This works exactly like you'd expect. Given matrices \(A = [a_{ij}]\) and \(B=[b_{ij}]\) of the same size, define \(A+B = [a_{ij}+b_{ij}]\) and, for a scalar \(k\text{,}\) \(kA = [ka_{ij}]\text{:}\) addition and scalar multiplication are performed entry by entry.
Aside: what do we mean by “=”? Two matrices are equal when they have the same size and all corresponding entries are equal.
Let \(A = \bbm 2\amp -3\amp 4\\5\amp 0\amp -1\ebm\text{,}\) \(B = \bbm -7\amp 2\\5\amp -1\ebm\text{,}\) \(C = \bbm 0\amp 5\amp -4\\3\amp -2\amp 1\ebm\text{.}\)
Compute: \(A+C, 3B, 2A-3C\text{.}\)
What can you say about \(A+B\) and \(C+A\text{?}\)
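As a check on the first two, working entry by entry: \(A+C = \bbm 2\amp 2\amp 0\\8\amp -2\amp 0\ebm\) and \(3B = \bbm -21\amp 6\\15\amp -3\ebm\text{.}\) Note that \(A+B\) is not defined, since \(A\) is \(2\times 3\) while \(B\) is \(2\times 2\text{,}\) and \(C+A\) is the same as \(A+C\text{.}\)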
Let \(A\text{,}\) \(B\text{,}\) and \(C\) be matrices of the same size, and let \(k\) and \(l\) be scalars. Then:
\(A+B = B+A\)
\((A+B)+C = A+(B+C)\)
\(A+0 = A\) and \(A+(-A) = 0\text{,}\) where \(0\) denotes the zero matrix of the same size
\(k(A+B) = kA+kB\) and \((k+l)A = kA+lA\)
Given \(A = \bbm 2\amp -3\\0\amp 4\ebm, B = \bbm 1\amp -5\\4\amp -3\ebm, C = \bbm 2\amp 0\\3\amp -1\ebm\text{,}\) compute
With \(A,B,C\) as above, find \(X\) such that
So far we know:
How to multiply a row by a column.
To multiply \(A\) by \(\vec{x}\text{,}\) multiply each row of \(A\) by \(\vec{x}\text{.}\) Arrange results in a column vector.
Calculate \(A\vec{x}\text{,}\) where \(A = \bbm 1\amp -4\\3\amp 2\\5\amp 0\ebm\text{,}\) \(\vec{x}=\bbm 2\\3\ebm\)
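Multiplying each row of \(A\) by \(\vec{x}\text{:}\) \(A\vec{x} = \bbm 1(2)+(-4)(3)\\3(2)+2(3)\\5(2)+0(3)\ebm = \bbm -10\\12\\10\ebm\text{.}\)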
We now want to compute \(AB\text{,}\) where \(A\) and \(B\) are matrices.
We still do “row times column”, using rows from \(A\text{,}\) columns from \(B\text{.}\)
We need each row of \(A\) to have the same length as each column of \(B\text{:}\) if \(A\) is \(m\times n\text{,}\) then \(B\) must be \(n\times p\text{.}\)
The product \(AB\) is size \(m\times p\text{.}\) Its \((i,j)\)-entry is the (dot) product of row \(i\) from \(A\) and column \(j\) from \(B\text{.}\)
Note: write \(B = \bbm \vec{b}_1 \amp \vec{b}_2\amp \cdots \amp \vec{b}_p\ebm\text{,}\) where \(\vec{b}_j\) is column \(j\) of \(B\text{.}\) Then \(AB = \bbm A\vec{b}_1 \amp A\vec{b}_2\amp \cdots \amp A\vec{b}_p\ebm\text{.}\)
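For instance, with two small matrices chosen just for illustration, \(\bbm 1\amp 2\\0\amp -1\ebm\bbm 3\amp 1\\2\amp 0\ebm = \bbm 1(3)+2(2)\amp 1(1)+2(0)\\0(3)+(-1)(2)\amp 0(1)+(-1)(0)\ebm = \bbm 7\amp 1\\-2\amp 0\ebm\text{.}\)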
Compute, if possible:
\(I_n\) denotes the \(n\times n\) identity matrix: its \((i,j)\) entries with \(i=j\) (the diagonal entries) all equal \(1\text{,}\) and all other entries are \(0\text{.}\)
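For example, \(I_2 = \bbm 1\amp 0\\0\amp 1\ebm\) and \(I_3 = \bbm 1\amp 0\amp 0\\0\amp 1\amp 0\\0\amp 0\amp 1\ebm\text{.}\)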
Assuming each product below is defined,
\(A(BC)=(AB)C\)
\(A(B+C)=AB+AC\)
\((A+B)C = AC+BC\)
\(A(kB)=(kA)B = k(AB)\) for any scalar \(k\)
\(I_mA = A\) and \(AI_n = A\)
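As a quick numerical check of the second property, using the matrices \(A\text{,}\) \(B\text{,}\) \(C\) from the earlier exercise: \(B+C = \bbm 3\amp -5\\7\amp -4\ebm\text{,}\) so \(A(B+C) = \bbm -15\amp 2\\28\amp -16\ebm\text{,}\) while \(AB = \bbm -10\amp -1\\16\amp -12\ebm\) and \(AC = \bbm -5\amp 3\\12\amp -4\ebm\) sum to the same matrix.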
For real numbers \(a\) and \(b\text{,}\) if \(a\neq 0\) and \(ax=b\text{,}\) we know \(x=\frac{b}{a}\text{.}\)
If \(A\) is a matrix, \(\vec{b}\) is a vector, and \(A\vec{x}=\vec{b}\text{,}\) can we similarly solve for \(\vec{x}\text{?}\)
Short answer: no. Longer answer: sometimes. Sort of.
A square (\(n\times n\)) matrix \(A\) is invertible if \(AB=I_n=BA\) for some \(n\times n\) matrix \(B\text{.}\)
We call \(B\) the inverse of \(A\text{,}\) and write \(B=A^{-1}\text{.}\)
For \(A = \bbm 2\amp -1\\-1\amp 1\ebm\text{,}\) \(D = \bbm 1\amp 1\\1\amp 2\ebm\text{,}\) we saw that \(AD = I_2\) and \(DA=I_2\text{.}\) So \(D=A^{-1}\text{.}\)
Suppose \(A\bbm x\\y\ebm = \bbm 7\\-5\ebm\text{.}\) How can we use \(D\) to solve for \(\vec{x}=\bbm x\\y\ebm\text{?}\)
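One approach, sketched: multiply both sides on the left by \(D = A^{-1}\text{.}\) Since \(DA = I_2\text{,}\) this gives \(\vec{x} = D\bbm 7\\-5\ebm = \bbm 1\amp 1\\1\amp 2\ebm\bbm 7\\-5\ebm = \bbm 2\\-3\ebm\text{,}\) and indeed \(A\bbm 2\\-3\ebm = \bbm 7\\-5\ebm\text{.}\)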