The last section was devoted to what might be called the arithmetic of matrices. We learned the basic operations of adding, multiplying, scaling, and transposing matrices. In this section we tackle the algebra of matrices. We will investigate the properties enjoyed (and not enjoyed) by our matrix operations, and will show how to use these operations to solve matrix equations.
As you learn about matrix algebra, always keep in mind your old friend, real number algebra. For the most part these two algebraic systems closely resemble one another, as Theorem 2.2.1 below makes clear. However, there are two crucial points where they differ (see Theorem 2.2.8): two important properties of real number algebra that do not hold for matrices. The consequences of these two simple aberrations are far-reaching and imbue matrix algebra with a fascinating richness in comparison to real number algebra.
Theorem 2.2.1. Properties of matrix addition, multiplication and scalar multiplication.
The following properties hold for all matrices \(A, B, C\) and scalars \(a,
b, c\in \R\) for which the given expression makes sense.
Addition commutative law.
\(\displaystyle A+B=B+A\)
Addition associative law.
\(\displaystyle A+(B+C)=(A+B)+C\)
Multiplication associative law.
\(\displaystyle A(BC)=(AB)C\)
Left-distributive law.
\(\displaystyle A(B+C)=AB+AC\)
Right-distributive law.
\(\displaystyle (B+C)A=BA+CA\)
Scaling distributive law.
\(\displaystyle a(B+C)=aB+aC\)
Another scaling distributive law.
\(\displaystyle (a+b)C=aC+bC\)
Scaling associative law.
\(\displaystyle a(bC)=(ab)C\)
Scaling commutative law.
\(a(BC)=(aB)C=B(aC)\text{.}\)
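Laws like these are easy to spot-check numerically. The following is a minimal sketch in Python using the NumPy library (our choice of tool; the text itself prescribes no software), verifying the multiplication associative law and the left-distributive law on randomly generated matrices, up to floating-point roundoff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random matrices with compatible dimensions.
A = rng.random((2, 3))
B = rng.random((3, 4))
C = rng.random((3, 4))  # same size as B, so A(B + C) makes sense
D = rng.random((4, 5))  # so the products B D, A(BD), (AB)D make sense

# Multiplication associative law: A(BD) = (AB)D.
print(np.allclose(A @ (B @ D), (A @ B) @ D))  # True

# Left-distributive law: A(B + C) = AB + AC.
print(np.allclose(A @ (B + C), A @ B + A @ C))  # True
```

Such a check is of course no substitute for a proof: it confirms the law only for the particular matrices generated, which is why the entrywise argument below is needed.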
How does one actually prove one of these properties? These are all matrix equalities of the form \(X=Y\text{,}\) so according to the matrix equality definition we must show (1) that the matrices \(X\) and \(Y\) have the same dimension, and (2) that \((X)_{ij}=(Y)_{ij}\) for all \((i,j)\text{.}\) The proof below illustrates this technique for the multiplication associative law of Theorem 2.2.1.
We prove only the multiplication associative law. Let \(A=[a_{ij}]_{m\times r}\text{,}\) \(B=[b_{ij}]_{r\times s}\text{,}\) \(C=[c_{ij}]_{s\times n}\text{.}\) To show \(A(BC)=(AB)C\text{,}\) first observe that both \(A(BC)\) and \((AB)C\) are \(m\times n\) matrices. Next, given any \((i,j)\) with \(1\leq i\leq m\text{,}\) \(1\leq j\leq n\text{,}\) we have
\begin{align*}
(A(BC))_{ij} &= \sum_{k=1}^{r} a_{ik}(BC)_{kj} = \sum_{k=1}^{r} a_{ik}\left(\sum_{\ell=1}^{s} b_{k\ell}c_{\ell j}\right)\\
&= \sum_{\ell=1}^{s}\left(\sum_{k=1}^{r} a_{ik}b_{k\ell}\right)c_{\ell j} = \sum_{\ell=1}^{s} (AB)_{i\ell}\,c_{\ell j} = ((AB)C)_{ij}\text{.}
\end{align*}
This proves that all entries of the two matrices are equal, and hence \(A(BC)=(AB)C\text{.}\)
As in real number algebra, we can identify some special matrices that act as additive identities and multiplicative identities; and every matrix has an additive inverse. What we mean here is spelled out in detail in Theorem 2.2.4.
Definition 2.2.2. Additive inverse of a matrix.
Given an \(m\times n\) matrix \(A=[a_{ij}]\text{,}\) its additive inverse \(-A\) is defined as
\begin{equation*}
-A=[-a_{ij}]\text{,}
\end{equation*}
the \(m\times n\) matrix obtained by negating each entry of \(A\text{.}\) Equivalently, \(-A=(-1)A\text{.}\)

Definition 2.2.3. Identity matrix.

The \(n\times n\) identity matrix \(I_n\) is the square matrix with ones on the main diagonal and zeros elsewhere: \((I_n)_{ij}=1\) if \(i=j\text{,}\) and \((I_n)_{ij}=0\) if \(i\ne j\text{.}\)

When the size \(n\) of the identity matrix is not important, we will often denote it simply as \(I\text{.}\)
Theorem 2.2.4. Additive identities, additive inverses, and multiplicative identities.
Additive identities.
The \(m\times n\) zero matrix \(\boldzero_{m\times n}\) is an additive identity for \(m\times n\) matrices in the following sense: for any \(m\times n\) matrix \(A\) we have
\begin{equation*}
A+\boldzero_{m\times n}=\boldzero_{m\times n}+A=A\text{.}
\end{equation*}

Additive inverses.

The additive inverse \(-A\) of an \(m\times n\) matrix \(A\) satisfies
\begin{equation*}
A+(-A)=(-A)+A=\boldzero_{m\times n}\text{.}
\end{equation*}

Multiplicative identities.

The \(n\times n\) identity matrix \(I_n\) is a multiplicative identity for \(n\times n\) matrices in the following sense: for any \(n\times n\) matrix \(A\) we have
\begin{equation*}
AI_n=I_nA=A\text{.}
\end{equation*}
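Here is a quick NumPy spot-check of these identities and inverses (again, the choice of library is ours, not the text's):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
Z = np.zeros((2, 2))   # the zero matrix: an additive identity
I = np.eye(2)          # the identity matrix: a multiplicative identity

print(np.array_equal(A + Z, A))     # True: A + 0 = A
print(np.array_equal(A + (-A), Z))  # True: A + (-A) = 0
print(np.array_equal(I @ A, A))     # True: I A = A
print(np.array_equal(A @ I, A))     # True: A I = A
```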
Corollary 2.2.5. Additive cancellation.

Let \(A\text{,}\) \(B\text{,}\) \(C\) be matrices of the same dimension. Then \(A+B=A+C\) if and only if \(B=C\text{.}\)

As simple as this claim might seem, remember that we are dealing with a completely new algebraic system here. We will prove both implications of the “if and only if” statement separately.
This direction is obvious: if \(B\) and \(C\) are equal matrices, then they remain equal when we add \(A\) to each of them. For the other direction, suppose \(A+B=A+C\text{.}\) Adding \(-A\) to both sides and using the addition associative law together with Theorem 2.2.4, we get
\begin{align*}
-A+(A+B) &= -A+(A+C)\\
(-A+A)+B &= (-A+A)+C\\
\boldzero+B &= \boldzero+C\\
B &= C\text{.}
\end{align*}
Remark 2.2.6.
The algebraic importance of Corollary 2.2.5 is that we can perform additive cancellation in matrix equations just as we do in real number algebra. For example, we can solve the matrix equation \(A+B=3A\) for \(B\) as follows:
\begin{align*}
A+B=3A &\iff -A+(A+B)=-A+3A\\
&\iff (-A+A)+B=2A\\
&\iff B=2A\text{.}
\end{align*}
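As a sanity check, one can verify numerically that \(B=2A\) solves the equation for any particular choice of \(A\text{;}\) the sketch below uses NumPy and an arbitrary matrix of our own choosing.

```python
import numpy as np

A = np.array([[1.0, -2.0], [0.5, 3.0]])  # any matrix will do
B = 2 * A

print(np.allclose(A + B, 3 * A))  # True: B = 2A solves A + B = 3A
```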
Though we can perform additive cancellation in matrix algebra, we cannot always perform multiplicative cancellation. For example, consider the matrices
Check for yourself that \(AB=AC\text{,}\) and yet \(B\ne C\text{.}\) In other words, we cannot always “cancel” \(A\) from the matrix equation \(AB=AC\text{.}\)
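The text's particular matrices are not reproduced above; the following NumPy sketch uses one concrete choice of our own that exhibits the same failure.

```python
import numpy as np

A = np.array([[1, 1], [1, 1]])
B = np.array([[1, 0], [0, 1]])
C = np.array([[0, 1], [1, 0]])

print(np.array_equal(A @ B, A @ C))  # True:  AB = AC
print(np.array_equal(B, C))          # False: and yet B != C
```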
The example in our warning above is but one instance of the general failure of the principle of multiplicative cancellation in matrix algebra. This in turn is a consequence of the following theorem, which identifies the two crucial places where matrix algebra differs significantly from real number algebra.
Theorem 2.2.8. Matrix algebra abnormalities.
Matrix multiplication is not commutative.
For two \(n\times n\) matrices \(A\) and \(B\text{,}\) we do not necessarily have \(AB=BA\text{.}\)
Products of nonzero matrices may be equal to zero.
If the product of two matrices is the zero matrix, we cannot conclude that one of the matrices is the zero matrix. In logical notation:
\begin{equation*}
AB=\boldzero \quad\not\Rightarrow\quad A=\boldzero \text{ or } B=\boldzero\text{.}
\end{equation*}
This is a good place to point out that to prove an identity does not hold, it suffices to provide a single counterexample to that effect. We do so for each failed identity of Theorem 2.2.8 in turn. There is no significance to the particular counterexamples chosen here, and indeed there are infinitely many counterexamples to choose from in both cases.
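For readers who want to experiment, the sketch below checks one possible pair of counterexamples in NumPy (our own choices, not necessarily those used in the text).

```python
import numpy as np

# Abnormality 1: matrix multiplication is not commutative.
A = np.array([[0, 1], [0, 0]])
B = np.array([[0, 0], [1, 0]])
print(A @ B)  # [[1 0], [0 0]]
print(B @ A)  # [[0 0], [0 1]]  -- so AB != BA

# Abnormality 2: a product of nonzero matrices can be zero.
print(A @ A)  # [[0 0], [0 0]], although A is not the zero matrix
```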
Mark well this important abnormality of matrix algebra. Confronted with a real number equation of the form \(ab=ac\text{,}\) we have a deeply ingrained impulse to declare that either \(a=0\) or \(b=c\text{.}\) (If we’re sloppy we may forget about that first possibility.) The corresponding maneuver for the matrix equation \(AB=AC\) is simply not available to us, unless we know something more about \(A\text{.}\)
We end our foray into matrix algebra with some properties articulating how matrix transposition interacts with matrix addition, multiplication and scalar multiplication.
Theorem 2.2.11. Properties of matrix transposition.
The following properties hold for all matrices \(A, B\) and scalars \(c\in \R\) for which the given expression makes sense.

Transpose of a sum.
\(\displaystyle (A+B)^T=A^T+B^T\)

Transpose of a scalar multiple.
\(\displaystyle (cA)^T=cA^T\)

Transpose of a product.
\(\displaystyle (AB)^T=B^TA^T\)

Transpose of a transpose.
\(\displaystyle (A^T)^T=A\)
We prove only the first statement. First observe that if \(A\) is \(m\times n\text{,}\) then so are \(B\) and \(A+B\text{.}\) Then \((A+B)^T\) is \(n\times m\) by Definition 2.1.29. Similarly, we see that \(A^T+B^T\) is \(n\times m\text{.}\)
Next, given any \((i,j)\) with \(1\leq i\leq n\text{,}\) \(1\leq j\leq m\text{,}\) we have
\begin{align*}
((A+B)^T)_{ij} &= (A+B)_{ji} = (A)_{ji}+(B)_{ji}\\
&= (A^T)_{ij}+(B^T)_{ij} = (A^T+B^T)_{ij}\text{.}
\end{align*}
Since the \(ij\)-entries of both matrices are equal for each \((i,j)\text{,}\) it follows that \((A+B)^T=A^T+B^T\text{.}\)
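These transpose properties, too, can be spot-checked numerically; the sketch below uses NumPy on randomly chosen matrices (our illustration, not part of the text).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((2, 3))
B = rng.random((2, 3))
C = rng.random((3, 4))

print(np.allclose((A + B).T, A.T + B.T))  # True: (A + B)^T = A^T + B^T
print(np.allclose((A @ C).T, C.T @ A.T))  # True: (AC)^T = C^T A^T
print(np.allclose((2 * A).T, 2 * A.T))    # True: (cA)^T = c A^T
```

Note the reversal of order in the product rule: \((AC)^T\) equals \(C^TA^T\text{,}\) not \(A^TC^T\text{,}\) which in general is not even defined.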
Video examples: proving matrix equalities.
Exercises
WeBWorK Exercises
1.
Determine which of the following statements are true and which are false.
If \(\displaystyle A\) and \(\displaystyle B\) are matrices with sizes such that \(\displaystyle AB\) is square, then \(\displaystyle A\) and \(\displaystyle B\) must be square.
If \(\displaystyle A\) and \(\displaystyle B\) are square matrices of the same size, then \(\displaystyle (AB)^T = B^TA^T\text{.}\)
If \(\displaystyle A \in \mathcal{M}_{n,m}\) and \(\displaystyle \mathbf{b} \in \mathbb{R}^m\) is a column vector, then \(\displaystyle A\mathbf{b}\) is a linear combination of the columns of \(\displaystyle A\text{.}\)
If \(\displaystyle A\) is a square matrix for which \(\displaystyle A^2 = I\text{,}\) then \(\displaystyle A = I\) or \(\displaystyle A = -I\text{.}\)
If \(\displaystyle A\) and \(\displaystyle B\) are matrices such that \(\displaystyle AB = O\) and \(\displaystyle A \neq O\text{,}\) then \(\displaystyle B = O\text{.}\)
2.
Let \(A\) be a 5 by 8 matrix. Then \(-A\) is a ____ by ____ matrix, and \(A^T\) is a ____ by ____ matrix.
We need both \(A\) and \(B\) to be \(m\times n\) for the expression to make sense. It is easy to see that \(A+B\) and \(B+A\) are both \(m\times n\) matrices. We must show \((A+B)_{ij}=(B+A)_{ij}\) for all \(1\leq i\leq m\text{,}\) \(1\leq j\leq n\text{.}\) We have
\begin{equation*}
(A+B)_{ij}=(A)_{ij}+(B)_{ij}=(B)_{ij}+(A)_{ij}=(B+A)_{ij}\text{,}
\end{equation*}
where the middle equality uses the commutativity of real number addition.
Let \(A\) be an \(n\times n\) matrix. We define its square \(A^2\) as \(A^2=AA\text{.}\)
In real number algebra we know that \(a^2=0\implies a=0\text{.}\) By contrast, show that there are infinitely many \(2\times 2\) matrices \(A\) satisfying \(A^2=\boldzero_{2\times 2}\text{.}\)
Optional: can you describe in a parametric manner the set of all matrices \(A\) satisfying \(A^2=\boldzero_{2\times 2}\text{?}\)
In real number algebra we know that \(a^2=a\implies a=0 \text{ or } a=1\text{.}\) By contrast, show that there are infinitely many \(2\times 2\) matrices \(A\) satisfying \(A^2=A\text{.}\)
In real number algebra we have the identity \((x+y)^2=x^2+2xy+y^2\text{.}\) Show that two \(n\times n\) matrices \(A\text{,}\) \(B\) satisfy
\begin{equation*}
(A+B)^2=A^2+2AB+B^2
\end{equation*}
if and only if \(AB=BA\text{.}\)
For (a), set \(A=\abcdmatrix{a}{b}{c}{d}\text{,}\) compute \(A^2\text{,}\) set this matrix equal to \(\boldzero_{2\times 2}\text{,}\) and try to find some solutions to the corresponding (nonlinear) system of four equations in the unknowns \(a,b,c,d\text{.}\)
Similar hint for (b), only now set \(A^2=A\text{.}\)
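If you want to test a candidate answer numerically, a NumPy sketch along the following lines will do; the matrices shown are merely two possible answers (our own) among the infinitely many that exist.

```python
import numpy as np

# One nonzero solution to A^2 = 0 (a "nilpotent" matrix);
# scaling the upper-right entry gives infinitely many others.
A = np.array([[0, 1], [0, 0]])
print(A @ A)  # [[0 0], [0 0]]

# One solution to A^2 = A other than 0 and I (an "idempotent" matrix).
P = np.array([[1, 1], [0, 0]])
print(np.array_equal(P @ P, P))  # True
```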