MIT 18.06 - Lecture 9
Linear independence
We say vectors $x_1, x_2, \cdots, x_n$ are linearly independent (or just independent) if $c_1x_1 + c_2x_2 + \cdots + c_nx_n = 0$ only when $c_1, c_2, \cdots, c_n$ are all 0.
Thinking of $Ax$ as a linear combination of the column vectors of A, we see that the column vectors of $A$ are independent exactly when the nullspace of $A$ contains only the zero vector.
In other words, the only solution to $Ax = 0$ is $x = 0$.
If the columns of $A$ are independent then all columns are pivot columns, the rank of $A$ is $n$, and there are no free variables.
If the columns of $A$ are dependent then the rank of $A$ is less than $n$ and there are free variables.
Any collection of vectors that includes the zero vector must be linearly dependent.
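This rank test is easy to try numerically. A small sketch using NumPy (its `matrix_rank` computes a numerical rank via the SVD; the matrices here are made up for illustration):

```python
import numpy as np

# The columns of A are independent exactly when rank(A) = n,
# the number of columns -- i.e. no free variables.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 5.0]])   # columns are independent

B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])   # second column = 2 * first column

print(np.linalg.matrix_rank(A) == A.shape[1])  # True:  rank 2 = n
print(np.linalg.matrix_rank(B) == B.shape[1])  # False: rank 1 < n
```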
Spanning a space
Vectors $v_1, v_2, \cdots v_k$ span a space when the space consists of all combinations of those vectors. For example, the column vectors of $A$ span the column space of $A$.
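Membership in a span can be checked with the same rank idea: $b$ is a combination of the columns of $A$ exactly when appending $b$ as an extra column does not raise the rank. A sketch (vectors chosen for illustration):

```python
import numpy as np

# Columns of A span a plane in R^3 (the xy-plane here).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
b = np.array([[3.0], [4.0], [0.0]])   # lies in the span
c = np.array([[0.0], [0.0], [1.0]])   # does not lie in the span

def in_span(A, v):
    # v is in the column space iff [A | v] has the same rank as A.
    return np.linalg.matrix_rank(np.hstack([A, v])) == np.linalg.matrix_rank(A)

print(in_span(A, b))  # True
print(in_span(A, c))  # False
```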
Basis and Dimension
Basis: A basis for a vector space is a sequence of vectors $v_1, v_2, \cdots, v_d$ with two properties:
- $v_1, v_2, \cdots v_d$ are independent.
- $v_1, v_2, \cdots v_d$ span the vector space.
In general, $n$ vectors in $R^n$ form a basis if they are the column vectors of an invertible matrix.
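The invertibility test for a basis is direct to sketch: put the candidate vectors in the columns of a square matrix and check that it has full rank (equivalently, a nonzero determinant). The three vectors below are chosen for illustration:

```python
import numpy as np

# Candidate basis for R^3: columns of V.
V = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# V is invertible iff its rank is n iff det(V) != 0,
# so its columns form a basis for R^3.
print(np.linalg.matrix_rank(V) == 3)      # True
print(abs(np.linalg.det(V)) > 1e-12)      # True (det = 2)
```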
Dimension: Given a space, every basis for that space has the same number of vectors; that number is the dimension of the space.
So there are exactly $n$ vectors in every basis for $R^n$.