Previously we have seen some invertibility criteria for linear maps. Thanks to the theorem Linear map determined by the image of a basis, these also provide invertibility criteria for matrices. We will add another criterion, in terms of the rank.
Let #n# be a natural number. For each #(n\times n)#-matrix #A# the following statements are equivalent:
- The rank of #A# is #n#
- The rows of #A# are independent
- The columns of #A# are independent
- The reduced echelon form of #A# is the identity matrix
- The matrix #A# is invertible
#1\Rightarrow 2# and #1\Rightarrow 3#: If the rank of #A# is equal to #n#, then both the #n# rows and the #n# columns span an #n#-dimensional space, and this means they are independent.
#2\Rightarrow 1# and #3\Rightarrow 1#: If the #n# rows (or columns) are independent, then the rows (or columns) span an #n#-dimensional space, and the rank of the matrix is equal to #n#. Moreover, the column space (or row space) is then also #n#-dimensional and hence the columns (or rows) are independent.
Hence, statements 1, 2 and 3 are equivalent.
#1\Leftrightarrow 4#: The matrix #A# and the reduced echelon form of #A# have the same rank. From the structure of the reduced echelon form, we can see right away that this form has rank #n# if and only if it is the identity matrix.
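The correspondence between full rank and an identity reduced echelon form can also be checked symbolically. The following sketch uses SymPy's `Matrix.rref` on a hypothetical #2\times2# example (not taken from the text above):

```python
from sympy import Matrix, eye

# Hypothetical example; Matrix.rref() returns the reduced row echelon
# form together with the indices of the pivot columns.
A = Matrix([[2, 1],
            [1, 1]])

rref, pivots = A.rref()

# Rank n (here n = 2) corresponds exactly to the reduced echelon
# form being the identity matrix:
assert A.rank() == 2
assert rref == eye(2)
```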
#1\Leftrightarrow 5#: By the theorem Linear map determined by the image of a basis, #L_A# is invertible if and only if #\im{L_A} =\mathbb{R}^n#, that is, if and only if the rank of #A# is equal to #n#.
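As a numerical aside, statements 1, 3, and 5 can be checked side by side with NumPy; the #3\times3# matrix below is a hypothetical example, not one from the text:

```python
import numpy as np

# Hypothetical 3x3 example matrix; it happens to be invertible.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
n = A.shape[0]

# Statement 1: the rank of A equals n.
full_rank = np.linalg.matrix_rank(A) == n

# Statement 3: the n columns are independent; for n vectors in R^n
# this is again the statement that the rank is n.
cols_independent = np.linalg.matrix_rank(A.T) == n

# Statement 5: A is invertible; np.linalg.inv raises LinAlgError
# for a singular matrix.
try:
    np.linalg.inv(A)
    invertible = True
except np.linalg.LinAlgError:
    invertible = False

print(full_rank, cols_independent, invertible)
```

All three flags agree, as the theorem predicts.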
The given list can be supplemented as follows:
- #A# is injective
- #A# is surjective
- #\ker{A}=\{\vec{0}\}#
- #\im{A}=\mathbb{R}^n#
- #\det(A)\neq0#
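The supplementary statements can likewise be probed numerically. The sketch below uses NumPy on a hypothetical singular example and tests statements 8, 9, and 10 via the rank; all three fail, as expected for a non-invertible matrix:

```python
import numpy as np

# Hypothetical singular example: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
n = A.shape[0]

rank = np.linalg.matrix_rank(A)

# Statement 8: ker A = {0}  <=>  dim ker A = 0  <=>  rank = n
# (by the rank-nullity theorem).
kernel_trivial = (n - rank) == 0

# Statement 9: im A = R^n  <=>  dim im A = n  <=>  rank = n.
image_full = rank == n

# Statement 10: det(A) != 0 (up to floating-point tolerance).
det_nonzero = not np.isclose(np.linalg.det(A), 0.0)

print(kernel_trivial, image_full, det_nonzero)
```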
According to the theorem Invertibility with the same dimensions for domain and codomain, statements 6 and 7 are equivalent to statement 5.
According to the Criteria for injectivity and surjectivity, statements 6 and 7 are equivalent to statements 8 and 9, respectively. Moreover, statements 8 and 9 are equivalent to \[\dim{\ker{A}}=0\qquad\text{and}\qquad\dim{\im{A}}=n,\] respectively, in accordance with the rank-nullity theorem. According to Rank is dimension column space, the last equality is equivalent to #\text{rank}(A)=n#, which confirms the equivalence of statements 1 and 9 directly.
The equivalence of statements 10 and 5 will be proven later in Invertibility in terms of determinant. There we will also prove \[\det\left(A^{-1}\right)=\frac{1}{\det(A)}\] provided that #A^{-1}# exists. Together with statement 10, it then follows that every statement in the list for #A# is equivalent to the corresponding statement for #A^{-1}#, if the inverse exists.
Later, in Determinant of transpose and product, we will prove #\det(A)=\det(A^\top)#. Together with statement 10, it then follows that every statement in the list for #A# is equivalent to the corresponding statement for #A^\top#.
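The two determinant identities quoted here can be illustrated numerically; the following NumPy sketch uses a hypothetical invertible #2\times2# matrix:

```python
import numpy as np

# Hypothetical invertible example matrix.
A = np.array([[3.0, 1.0],
              [2.0, 2.0]])

d = np.linalg.det(A)   # det(A) = 3*2 - 1*2 = 4, so A is invertible

# det(A^{-1}) = 1 / det(A):
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / d)

# det(A) = det(A^T):
assert np.isclose(d, np.linalg.det(A.T))
```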
According to the dependence criterion, statements 2 and 3 are equivalent to the statements that there does not exist a non-trivial relation between the rows and the columns of #A#, respectively.
A #(1\times1)#-matrix #A=\matrix{a}# satisfies the conditions if and only if #a\ne0#.
Let #A=\matrix{a&b\\ c & d}#. The columns of #A# are linearly dependent if and only if there exists a non-trivial linear combination resulting in the zero vector, hence if and only if there are scalars #\lambda# and #\mu#, not both equal to #0#, such that
\[\lambda\rv{a,c}+\mu\rv{b,d} = \rv{0,0}\]
This is equivalent to #\lambda \cdot a=- \mu\cdot b# and #\lambda \cdot c=- \mu\cdot d#. Multiplying the first equation by #d# and the second by #b# gives \[\lambda \cdot a \cdot d = -\mu\cdot b \cdot d = \lambda\cdot c\cdot b\] so #\lambda=0# or #a\cdot d-b\cdot c = 0#. Similarly, #\mu=0# or #a\cdot d-b\cdot c = 0#. Since #\lambda# and #\mu# are not both equal to #0#, it follows that #a\cdot d-b\cdot c = 0#. Conversely, if #a\cdot d-b\cdot c = 0#, then one of the non-trivial choices #\rv{\lambda,\mu}=\rv{d,-c}#, #\rv{\lambda,\mu}=\rv{b,-a}#, or (in case #A# is the zero matrix) #\rv{\lambda,\mu}=\rv{1,0}# yields the zero vector.
Conclusion: the columns of #A# are linearly independent if and only if #a\cdot d-b\cdot c \ne 0#. This condition is the same for #A^\top# as for #A#. This explains why the rank of #A# is equal to #2# if and only if the rank of #A^\top# is equal to #2#, and it shows once again that statements 2 and 3 are each equivalent to statement 1.
Statement 5 can also be illustrated by means of the following easily verified formula:
\[\text{For }B = \matrix{d&-b\\ -c&a}\text{ we have } A\,B = (a\cdot d -b\cdot c)\cdot I_2\]
If #a\cdot d -b\cdot c\ne0#, then #\frac{1}{a\cdot d -b\cdot c}\cdot B# is the inverse of #A#. Now assume that #a\cdot d -b\cdot c = 0#. If #A# is the zero matrix, then #A# is not invertible. If #A# is not the zero matrix, then some column of #B# is unequal to #\vec{0}#; since #A\,B# is then the zero matrix, that column belongs to the kernel of #A#, so #A# is not invertible. We conclude that #A# is invertible if and only if #a\cdot d -b\cdot c\ne0#.
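The formula above is easy to verify numerically; the following NumPy sketch uses hypothetical values for #a#, #b#, #c#, #d#:

```python
import numpy as np

# Hypothetical 2x2 example with a*d - b*c = 3*4 - 1*2 = 10 != 0.
a, b, c, d = 3.0, 1.0, 2.0, 4.0
A = np.array([[a, b],
              [c, d]])
B = np.array([[d, -b],
              [-c, a]])

det = a * d - b * c

# The identity A B = (a*d - b*c) I_2 from the text:
assert np.allclose(A @ B, det * np.eye(2))

# Hence, since det != 0 here, B / det is the inverse of A:
A_inv = B / det
assert np.allclose(A @ A_inv, np.eye(2))
```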
The expression #a\cdot d -b\cdot c# is known as the determinant of #A#; we will return to it later.
Is the following matrix invertible?
\[
\matrix{
1 &1 & -5 \\
1 &17 &-13 \\
-4 &-36 &36}
\]
No
We approach this just as we would when inverting a matrix: we augment the matrix with the identity matrix and apply Gaussian elimination:
\[
\begin{aligned}
\left(
\begin{array}{ccc|ccc}
1&1&-5&1 &0 &0 \\
1&17&-13&0 &1 &0\\
-4&-36&36&0 &0 &1\\
\end{array}
\right)&\sim
\left(
\begin{array}{ccc|ccc}
1 &1&-5&1 & 0 & 0\\
0 &16&-8&-1& 1 & 0 \\
0 &-32&16&4& 0 & 1 \\
\end{array}
\right)
&{\color{blue}{\begin{array}{ccc}
\mbox{}\\
R_2 \to R_2 - R_1\\
R_3 \to R_3 +4R_1
\end{array}}}\\
&\sim
\left(
\begin{array}{ccc|ccc}
1 &1&-5&1 & 0 & 0\\
0 &16&-8&-1& 1 & 0 \\
0 &0 &0 &2&2& 1 \\
\end{array}
\right)
&{\color{blue}{\begin{array}{ccc}
\mbox{}\\
\mbox{}\\
R_3 \to R_3 +2R_2
\end{array}}}
\end{aligned}
\]A zero row has appeared, so the rank of the matrix is less than #3#. This means that the matrix is not invertible. Hence, the answer is: No.
We have only reduced with rows, but in order to determine whether #A# has an inverse, we could also have reduced #A# with columns. Furthermore, we could have omitted the part behind the vertical bar.
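The conclusion of the exercise can be double-checked numerically; NumPy's `matrix_rank` confirms that the elimination above left only two independent rows:

```python
import numpy as np

# The matrix from the exercise.
A = np.array([[ 1.0,   1.0,  -5.0],
              [ 1.0,  17.0, -13.0],
              [-4.0, -36.0,  36.0]])

# Elimination produced a zero row, so the rank is at most 2.
rank = np.linalg.matrix_rank(A)
print(rank)   # 2, which is less than 3, so A is not invertible
```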