Linear Combination, Span, and Linear Independence

Linear Combinations

A linear combination of the vectors $v_1, v_2, \dots, v_n \in V$ is a vector of the form $a_1 v_1 + a_2 v_2 + \cdots + a_n v_n$ for $a_1, \dots, a_n \in F$. A linear combination of a bunch of vectors is just scaling each vector, and then adding them together.
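To make this concrete, here is a small numerical sketch using numpy; the vectors and scalars are arbitrary choices for illustration:

```python
import numpy as np

# Two vectors in R^3 and two scalars (arbitrary choices for illustration).
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 3.0, 1.0])
a1, a2 = 2.0, -1.0

# The linear combination a1*v1 + a2*v2: scale each vector, then add.
combo = a1 * v1 + a2 * v2
print(combo)  # [ 2. -3.  3.]
```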

Next, the span of a set of vectors is the set of all linear combinations of those vectors. Specifically, the span of a set $S = \{v_1, \dots, v_n\}$ where $v_i \in V$ is denoted as

$$\operatorname{span}(S) = \{a_1 v_1 + a_2 v_2 + \cdots + a_n v_n : a_1, \dots, a_n \in F\}.$$
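Numerically, asking whether a vector $u$ lies in the span amounts to asking whether the linear system $a_1 v_1 + \cdots + a_n v_n = u$ has a solution. A sketch with numpy's least-squares solver (the vectors and the `in_span` helper are made up for illustration):

```python
import numpy as np

# Columns of A are the spanning vectors v1, v2 in R^3.
A = np.column_stack([[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]])

def in_span(A, u, tol=1e-10):
    """Return True if u is (numerically) a linear combination of A's columns."""
    coeffs, *_ = np.linalg.lstsq(A, u, rcond=None)
    return np.allclose(A @ coeffs, u, atol=tol)

print(in_span(A, np.array([2.0, -3.0, 3.0])))  # True: equals 2*v1 - 1*v2
print(in_span(A, np.array([0.0, 0.0, 1.0])))   # False: not in span{v1, v2}
```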

This is clearly a subset of $V$, since $V$ is closed under linear combination, but it is also a subspace of $V$. This means that the span of a set of vectors that live in a vector space is a subspace of that vector space!

To show this, let’s let $W = \operatorname{span}(S)$. Then, $W \subseteq V$. To show that this is closed under linear combination, consider two vectors $u, w \in W$. Then, we can write $u = a_1 v_1 + \cdots + a_n v_n$ and $w = b_1 v_1 + \cdots + b_n v_n$. We need to show that for $c, d \in F$, $cu + dw \in W$. We have

$$cu + dw = (c a_1 + d b_1) v_1 + \cdots + (c a_n + d b_n) v_n \in \operatorname{span}(S) = W,$$

so we have shown that $W$ is indeed closed under linear combination.

Since we know that $W$ is closed under linear combination and it is a subset of $V$, $W$ is a subspace of $V$.
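The closure argument can be sanity-checked numerically: take two random elements of $\operatorname{span}(S)$, combine them, and verify the result is still a linear combination of $S$ (the spanning vectors below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.column_stack([[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]])  # spanning set S

# u, w are random linear combinations of S, hence elements of span(S).
u = A @ rng.standard_normal(2)
w = A @ rng.standard_normal(2)
c, d = rng.standard_normal(2)

# c*u + d*w should again be a linear combination of S's columns.
coeffs, *_ = np.linalg.lstsq(A, c * u + d * w, rcond=None)
print(np.allclose(A @ coeffs, c * u + d * w))  # True: span(S) is closed
```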

Definition: let $S \subseteq V$ be such that $\operatorname{span}(S) = V$. This means that every vector $v \in V$ can be generated by a linear combination of some elements in $S$. We say that $S$ generates $V$. Eventually, we will be interested in sets that generate vector spaces in a minimal way, leading us to bases.

Linear Independence

Definition: A set $S = \{v_1, \dots, v_n\}$ is linearly dependent if there exists a nontrivial linear combination of the vectors producing $\vec{0}$: that is, $a_1 v_1 + \cdots + a_n v_n = \vec{0}$ with at least one $a_i \neq 0$. Otherwise, if the only linear combination of the $v_i$ producing $\vec{0}$ is the trivial one (all $a_i = 0$), then $S$ is linearly independent.
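Computationally, a finite set $\{v_1, \dots, v_n\}$ in $F^m$ is linearly independent exactly when the matrix with the $v_i$ as columns has rank $n$. A sketch with made-up example vectors:

```python
import numpy as np

def is_independent(vectors):
    """Vectors are independent iff the matrix they form has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_independent([[1, 0], [0, 1]]))  # True
print(is_independent([[1, 2], [2, 4]]))  # False: (2, 4) = 2 * (1, 2)
```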

Proposition: $\{e_1, e_2, \dots, e_n\} \subseteq F^n$ is linearly independent, where $e_i$ is the one-hot vector with $1$ in the $i$th coordinate, and $0$ everywhere else. We have:

$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = (a_1, a_2, \dots, a_n) = \vec{0}$$

only if all $a_i = 0$, so the set is linearly independent.
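The one-hot vectors are exactly the columns of the identity matrix, so the rank test above confirms the proposition for any concrete $n$:

```python
import numpy as np

n = 4
E = np.eye(n)  # columns are the one-hot vectors e_1, ..., e_n
print(np.linalg.matrix_rank(E) == n)  # True: {e_1, ..., e_n} is independent
```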

Linear independence depends on the field that our vector space is defined over. For example, consider the vector space of complex numbers over the field of complex numbers. Then, $\{1, i\}$ is linearly dependent, since $i \cdot 1 + (-1) \cdot i = 0$. But in the complex vector space over the real numbers, $\{1, i\}$ is linearly independent, since we can only scale by real numbers, so there is no nontrivial linear combination of $1$ and $i$ producing $0$.
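This field-dependence can be observed numerically too: over $\mathbb{C}$, the vectors $1$ and $i$ live in the one-dimensional space $\mathbb{C}^1$, while over $\mathbb{R}$ we can identify $a + bi$ with $(a, b) \in \mathbb{R}^2$. A sketch:

```python
import numpy as np

# Over C: 1 and i are vectors in C^1; the complex matrix with them as
# columns has rank 1 < 2, so {1, i} is linearly dependent over C.
over_C = np.array([[1 + 0j, 0 + 1j]])
print(np.linalg.matrix_rank(over_C))  # 1

# Over R: identify a + bi with (a, b), so 1 -> (1, 0) and i -> (0, 1).
# Rank 2 == 2 vectors, so {1, i} is linearly independent over R.
over_R = np.column_stack([[1, 0], [0, 1]])
print(np.linalg.matrix_rank(over_R))  # 2
```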

Theorem: if $S = \{v_1, \dots, v_n\}$ is linearly independent, then $S \cup \{v\}$ is linearly dependent iff $v \in \operatorname{span}(S)$. The “iff” means that this is the only time when $S \cup \{v\}$ can be linearly dependent, and means that both of the following statements are true:

  1. Forward implication: if $S$ is linearly independent, and $S \cup \{v\}$ is linearly dependent, then $v \in \operatorname{span}(S)$.
  2. Backward implication: if $S$ is linearly independent and $v \in \operatorname{span}(S)$, then $S \cup \{v\}$ is linearly dependent.

Proof: Exercise.
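Though the proof is left as an exercise, the theorem is easy to observe numerically: appending $v$ to an independent set $S$ increases the rank exactly when $v \notin \operatorname{span}(S)$. The example vectors and the `stays_independent` helper below are arbitrary illustrations:

```python
import numpy as np

S = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # independent set

def stays_independent(S, v):
    """S ∪ {v} is independent iff appending v increases the rank."""
    bigger = np.column_stack([S, v])
    return np.linalg.matrix_rank(bigger) > np.linalg.matrix_rank(S)

print(stays_independent(S, np.array([1.0, 2.0, 0.0])))  # False: v in span(S)
print(stays_independent(S, np.array([0.0, 0.0, 1.0])))  # True: v not in span(S)
```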

Basis - The Building Blocks of Vector Spaces

Definition: A basis of a vector space is a linearly independent set that spans the vector space. It is therefore a “minimal” set of vectors that spans the space, and in a sense contains no “redundant” information - though we will formalize these notions later.

Ex: the list of vectors $\{(1, 0), (0, 1)\}$ is a (standard) basis for $F^2$. The list of vectors $\{e_1, e_2, \dots, e_n\}$ is a basis for $F^n$ (the standard basis). This means that bases can be finite; a vector space with a finite basis is called finite-dimensional.

Ex: $\{1\}$ is a basis for the complex vector space over the complex numbers, but over the reals, $\{1, i\}$ is a basis.

Theorem: Let $S = \{v_1, \dots, v_n\}$ be finite and $\operatorname{span}(S) = V$. Then, there exists a basis $B \subseteq S$ of $V$. Proof: let $B = \emptyset$. Idea: we will build up $B$ by looking at elements in $S$ and adding certain elements to the basis.

Algorithm for constructing $B$: Look at each $v_i$ in $S$. If $B \cup \{v_i\}$ is linearly independent, then let

$$B = B \cup \{v_i\}.$$

This construction results in $B$ being a basis. First, by construction, $B$ is linearly independent, so all we have to show is that $\operatorname{span}(B) = V$. Since each $v \in B$ is also in $V$, we have $\operatorname{span}(B) \subseteq V$, since vector spaces are closed under linear combination. If we can show that $V \subseteq \operatorname{span}(B)$, then we have $\operatorname{span}(B) = V$, since in general $X \subseteq Y$ and $Y \subseteq X$ together imply $X = Y$.

Since $V = \operatorname{span}(S)$, this is the same as showing that $\operatorname{span}(S) \subseteq \operatorname{span}(B)$. This is the same as showing that $S \subseteq \operatorname{span}(B)$ - this is because $\operatorname{span}(B)$ is a subspace, and subspaces are closed under linear combination, and $\operatorname{span}(S)$ is just taking linear combinations of vectors in $S$.

To show that $S \subseteq \operatorname{span}(B)$, we can take any $v_i \in S$. There are two cases: $v_i \in B$ or not. If $v_i \in B$, then clearly $v_i \in \operatorname{span}(B)$. Otherwise, $v_i \notin B$. By construction, this means that $B' \cup \{v_i\}$ was linearly dependent, where $B'$ is the part of $B$ that had been built up when $v_i$ was considered, and by the previous theorem, this is only possible if $v_i \in \operatorname{span}(B') \subseteq \operatorname{span}(B)$. So either way, $v_i \in \operatorname{span}(B)$.

Since $S \subseteq \operatorname{span}(B)$, we have $V = \operatorname{span}(S) \subseteq \operatorname{span}(B)$. Thus we have $\operatorname{span}(B) = V$, and $B$ is a basis.
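The greedy construction in this proof translates directly into code. Here is a sketch using the rank test for independence (the `extract_basis` name and the input vectors are arbitrary; note the third vector is the sum of the first two, so it gets skipped):

```python
import numpy as np

def extract_basis(S):
    """Greedily keep each vector of S that is independent of those kept so far."""
    B = []
    for v in S:
        candidate = np.column_stack(B + [v])
        # Keep v iff B ∪ {v} is still linearly independent (full column rank).
        if np.linalg.matrix_rank(candidate) == len(B) + 1:
            B.append(v)
    return B

S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),   # dependent: sum of the first two, skipped
     np.array([0.0, 0.0, 1.0])]
print(extract_basis(S))  # keeps vectors 1, 2, and 4: a basis of R^3
```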

Theorem: Let $B = \{v_1, \dots, v_n\} \subseteq V$. $B$ is a basis for $V$ iff each $v \in V$ can be written as a unique linear combination of elements in $B$.

Proof: The forward implication states that if $B$ is a basis, then any $v \in V$ can be written as $v = a_1 v_1 + \cdots + a_n v_n$ for unique scalars $a_1, \dots, a_n$. (Such a combination exists because $B$ spans $V$; uniqueness is the new claim.) This means that if $v = a_1 v_1 + \cdots + a_n v_n$ and $v = b_1 v_1 + \cdots + b_n v_n$, then $a_i = b_i$ for all $i$. To show this, we can subtract the two equalities, and get that $(a_1 - b_1) v_1 + \cdots + (a_n - b_n) v_n = \vec{0}$. But since $B$ is linearly independent (it is a basis), the only linear combination that gives $\vec{0}$ is the trivial one, so we must have $a_i = b_i$ for all $i$.

The backward implication states that if any $v \in V$ can be written as a unique linear combination $v = a_1 v_1 + \cdots + a_n v_n$, then $B$ is a basis. First, we note that $B$ clearly generates $V$. Next, since all $v \in V$ can be written as a unique linear combination, this is also true for $\vec{0}$. But one linear combination for $\vec{0}$ is the trivial one, $0 \cdot v_1 + \cdots + 0 \cdot v_n = \vec{0}$, so there is no other linear combination giving $\vec{0}$ besides the trivial one, implying that $B$ is also linearly independent, and therefore a basis.
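When $V = F^n$ and $B$ has $n$ vectors, putting the basis vectors in the columns of a matrix makes the unique scalars the solution of a square linear system; the matrix is invertible precisely because $B$ is a basis. A sketch with an arbitrary basis of $\mathbb{R}^2$:

```python
import numpy as np

# Columns (1, 0) and (1, 1) form a basis of R^2 (the matrix is invertible).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# The unique coordinates a with B @ a = v.
a = np.linalg.solve(B, v)
print(a)      # [1. 2.]: v = 1*(1, 0) + 2*(1, 1)
print(B @ a)  # recovers v
```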