MIT 18.06 - Lecture 3
Multiple Perspectives on Matrix Multiplication
Assume matrix $C$ is the product of an $m \times n$ matrix $A$ and an $n \times p$ matrix $B$; in other words, $C = AB$.
Row Times Column
$c_{ij}$ equals the dot product of row $i$ of matrix $A$ and column $j$ of matrix $B$. In other words:
$$
c_{ij} = \sum_{k = 1} ^ {n} a_{ik} \cdot b_{kj}
$$
It can be computed directly in $O(n^3)$ time, as the sketch below shows.
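As a quick illustration, here is a minimal Python sketch of this entry-by-entry definition (the function name and the list-of-lists representation are illustrative choices, not from the lecture):

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B, entry by entry."""
    m, n, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(m)]
    for i in range(m):          # row i of A
        for j in range(p):      # column j of B
            for k in range(n):  # c_ij = sum_k a_ik * b_kj
                C[i][j] += A[i][k] * B[k][j]
    return C

# Example: matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```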
Row Transformation ($A$ is the transformation matrix)
The rows of $C$ are combinations of the rows of $B$: row $i$ of $C$ equals row $i$ of $A$ times $B$.
Column Transformation ($B$ is the transformation matrix)
The columns of $C$ are combinations of the columns of $A$: column $j$ of $C$ equals $A$ times column $j$ of $B$. A sketch checking both views follows.
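A small NumPy check of both pictures (a sketch; the matrices are arbitrary examples, and NumPy is assumed to be available):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = A @ B

# Row picture: row i of C is (row i of A) times B,
# i.e. a combination of the rows of B.
assert np.array_equal(C[0, :], A[0, :] @ B)

# Column picture: column j of C is A times (column j of B),
# i.e. a combination of the columns of A.
assert np.array_equal(C[:, 0], A @ B[:, 0])
```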
Column Times Row
The product of an $m \times 1$ vector and a $1 \times p$ vector is an $m \times p$ matrix.
$$
\left[\begin{array}{c}
2 \\ 3 \\ 4
\end{array}\right]
\left[\begin{array}{cc}
1 & 6
\end{array}\right] =
\left[\begin{array}{cc}
2 & 12 \\ 3 & 18 \\ 4 & 24
\end{array}\right]
$$
The columns of this matrix are multiples of the column of $A$.
The rows of this matrix are multiples of the row of $B$. In general, $AB$ is the sum of the outer products of the columns of $A$ with the corresponding rows of $B$, as the sketch below demonstrates.
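A sketch verifying this column-times-row decomposition with NumPy (the example matrices are mine, not from the lecture):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3 x 2
B = np.array([[7, 8, 9], [10, 11, 12]])  # 2 x 3

# AB equals the sum over k of (column k of A)(row k of B),
# each term being a rank-one 3 x 3 matrix.
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.array_equal(outer_sum, A @ B)
```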
Block Matrix Multiplication
$$
\left[\begin{array}{cc}
A_1 & A_2 \\
A_3 & A_4
\end{array}\right]
\left[\begin{array}{cc}
B_1 & B_2 \\
B_3 & B_4
\end{array}\right] =
\left[\begin{array}{cc}
C_1 & C_2 \\
C_3 & C_4
\end{array}\right]
$$
$A_i$, $B_j$, and $C_k$ are matrices: the $A_i$ are blocks of a bigger matrix, and the $B_j$ are blocks cut from $B$ with a partition compatible with that of $A$.
What is amazing is that the blocks still obey the rules of matrix multiplication!
In other words, $C_1 = A_1B_1 + A_2B_3$, as the sketch below verifies.
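To make this concrete, here is a sketch that partitions two $4 \times 4$ matrices into $2 \times 2$ blocks and checks the block formula (random matrices, NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 10, (4, 4))
B = rng.integers(0, 10, (4, 4))

# Cut both matrices into 2 x 2 blocks using the same partition.
A1, A2 = A[:2, :2], A[:2, 2:]
A3, A4 = A[2:, :2], A[2:, 2:]
B1, B2 = B[:2, :2], B[:2, 2:]
B3, B4 = B[2:, :2], B[2:, 2:]

# The top-left block of AB follows the usual multiplication rule.
C = A @ B
assert np.array_equal(C[:2, :2], A1 @ B1 + A2 @ B3)
```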
Matrix Inverses
How should we understand the inverse of a matrix?
To understand the inverse matrix more easily, we use the elimination matrices mentioned in Lecture 2.
Assume an elimination matrix $E_{21} = \left[\begin{array}{ccc}
1 & 0 & 0 \\
-3 & 1 & 0 \\
0 & 0 & 1
\end{array}\right]$; the second row of $E_{21}$ means subtracting $3$ times row $1$ from row $2$.
To “undo” this operation we must add $3$ times row $1$ to row $2$ using the inverse matrix:
$$
E_{21}^{-1} = \left[\begin{array}{ccc}
1 & 0 & 0 \\
3 & 1 & 0 \\
0 & 0 & 1
\end{array}\right]
$$
It is obvious that $E_{21}^{-1}E_{21} = E_{21}E_{21}^{-1} = \left[\begin{array}{ccc}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1
\end{array}\right] = I$.
Definition of Invertible / Nonsingular Matrices
If $A$ is a square matrix, the most important question you can ask about it is whether it has an inverse $A^{−1}$. If it does, then $A^{−1}A = I = AA^{−1}$ and we say that $A$ is invertible or nonsingular.
For a singular matrix $A$, meaning $A$ has no inverse $A^{-1}$, it can be shown that there is always some nonzero vector $x$ with $Ax = 0$.
This can be proved by contradiction. Suppose such a nonzero $x$ exists and $A$ has an inverse $A^{-1}$. Then $Ax = 0$ is equivalent to $A^{-1}Ax = A^{-1} \cdot 0$, which means $x = 0$. But the precondition is that $x$ is a nonzero vector. This contradiction shows that a matrix with a nonzero solution of $Ax = 0$ cannot have an inverse. An example follows.
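A concrete sketch (the singular matrix and null vector are arbitrary examples of mine):

```python
import numpy as np

# A singular matrix: its second row is twice its first.
A = np.array([[1, 2],
              [2, 4]])

# The nonzero vector x = (2, -1) satisfies Ax = 0,
# so A cannot have an inverse.
x = np.array([2, -1])
assert np.array_equal(A @ x, np.zeros(2, dtype=int))
```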
Computing the Inverse Matrix: Gauss-Jordan Elimination
We combine Gaussian elimination with block matrices to get Gauss-Jordan elimination. The main idea is:
$$
E[ \ A \ | \ I \ ] = [ \ I \ | \ E \ ]
$$
$E$ is the product of all the elimination matrices applied while transforming $A$ into $I$ by Gaussian elimination.
The correctness of this method follows from $EA = I$: by the definition of the inverse matrix, $E = A^{-1}$. A minimal sketch follows.
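A minimal Gauss-Jordan sketch along these lines (a simplified implementation assuming $A$ is invertible; the function name and test matrix are mine):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])  # build [ A | I ]
    for i in range(n):
        # Swap in the row with the largest pivot in column i.
        pivot = i + np.argmax(np.abs(aug[i:, i]))
        aug[[i, pivot]] = aug[[pivot, i]]
        aug[i] /= aug[i, i]                   # scale the pivot row so the pivot is 1
        for j in range(n):
            if j != i:
                aug[j] -= aug[j, i] * aug[i]  # eliminate column i in every other row
    return aug[:, n:]                         # right half of [ I | A^-1 ]

A = np.array([[2.0, 1.0], [5.0, 3.0]])
assert np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2))
```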