
## Section 2.5 Linear Independence

Objectives

- Understand the concept of linear independence.
- Learn two criteria for linear independence.
- Understand the relationship between linear independence and pivot columns / free variables.


- *Recipe:* test whether a set of vectors is linearly independent / find an equation of linear dependence.
- *Picture:* whether a set of vectors in R2 or R3 is linearly independent or not.
- *Vocabulary words:* *linear dependence relation* / *equation of linear dependence*.
- *Essential vocabulary words:* *linearly independent*, *linearly dependent*.

Sometimes the span of a set of vectors is “smaller” than you expect from the number of vectors, as in the picture below. This means that (at least) one of the vectors is redundant: it can be removed without affecting the span. In the present section, we formalize this idea in the notion of *linear independence*.

[Figure: Span{v,w} with vectors v, w; Span{u,v,w} with vectors u, v, w]

Figure 1 Pictures of sets of vectors that are linearly dependent. Note that in each case, one vector is in the span of the others, so it doesn’t make the span bigger.

### Subsection 2.5.1 The Definition of Linear Independence

Definition

A set of vectors {v1,v2,…,vk} is *linearly independent* if the vector equation

x1v1+x2v2+···+xkvk=0

has only the trivial solution x1=x2=···=xk=0. The set {v1,v2,…,vk} is *linearly dependent* otherwise.

In other words, {v1,v2,…,vk} is linearly dependent if there exist numbers x1,x2,…,xk, not all equal to zero, such that

x1v1+x2v2+···+xkvk=0.

This is called a *linear dependence relation* or *equation of linear dependence*.

Note that linear dependence and linear independence are notions that apply to a *collection of vectors*. It does not make sense to say things like “this vector is linearly dependent on these other vectors,” or “this matrix is linearly independent.”
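As a quick sanity check of the definition, the following sketch builds three vectors satisfying a dependence relation by construction and verifies it numerically (the vectors are illustrative, not from the text above):

```python
# Illustrative vectors: v3 is built as 2*v1 + 3*v2, so the coefficients
# 2, 3, -1 (not all zero) give an equation of linear dependence.
v1, v2 = (1, 2, -1), (7, 4, -2)
v3 = tuple(2 * a + 3 * b for a, b in zip(v1, v2))

# Check the relation 2*v1 + 3*v2 - v3 = 0 coordinate by coordinate.
combo = [2 * a + 3 * b - c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0, 0, 0]: the zero vector, so {v1, v2, v3} is linearly dependent
```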

Example (Checking linear dependence)

Example (Checking linear independence)

Example (Vector parametric form)

The above examples lead to the following recipe.

Recipe: Checking linear independence

A set of vectors {v1,v2,…,vk} is linearly independent if and only if the vector equation

x1v1+x2v2+···+xkvk=0

has only the trivial solution, if and only if the matrix equation Ax=0 has only the trivial solution, where A is the matrix with columns v1,v2,…,vk:

A = [ v1 v2 ··· vk ].

This is true if and only if A has a pivot position in every column.

Solving the matrix equation Ax=0 will either verify that the columns v1,v2,…,vk are linearly independent, or will produce a linear dependence relation by substituting any nonzero values for the free variables.

(Recall that Ax=0 has a nontrivial solution if and only if A has a column without a pivot: see this observation in Section 2.4.)
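The recipe above can be sketched in code: row reduce over the rationals (using the standard library's Fraction for exact arithmetic) and check whether every column has a pivot. The function names are ours, and the example matrix is only illustrative:

```python
from fractions import Fraction

def pivot_columns(rows):
    """Indices of the pivot columns, found by straightforward
    Gauss-Jordan elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        # Look for a nonzero entry in column c, at or below row r.
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue  # no pivot in this column (free variable)
        m[r], m[pr] = m[pr], m[r]
        m[r] = [x / m[r][c] for x in m[r]]  # scale the pivot to 1
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

def columns_independent(rows):
    """The columns are linearly independent iff A has a pivot in every column."""
    return len(pivot_columns(rows)) == len(rows[0])

# The matrix whose columns are (1,2,-1), (7,4,-2), (3,0,4):
A = [[1, 7, 3], [2, 4, 0], [-1, -2, 4]]
print(columns_independent(A))  # True: a pivot in every column
```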

Suppose that A has more columns than rows. Then A cannot have a pivot in every column (it has at most one pivot per row), so its columns are automatically linearly dependent.

A wide matrix (a matrix with more columns than rows) has linearly dependent columns.

For example, four vectors in R3 are automatically linearly dependent. Note that a tall matrix may or may not have linearly independent columns.
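When a column has no pivot, the recipe also produces an explicit dependence relation: set that free variable to 1 and read the pivot variables off the reduced rows. A sketch under the same conventions (the function name is ours; the vectors come from the worked example later in this section):

```python
from fractions import Fraction

def dependence_relation(cols):
    """Return coefficients x (not all zero) with x1*v1 + ... + xk*vk = 0,
    or None if the vectors v1, ..., vk (the entries of cols) are independent."""
    k, n = len(cols), len(cols[0])
    # Rows of the matrix A whose columns are the given vectors.
    m = [[Fraction(cols[j][i]) for j in range(k)] for i in range(n)]
    pivots, r = [], 0
    for c in range(k):
        pr = next((i for i in range(r, n) if m[i][c] != 0), None)
        if pr is None:
            # Column c is free: set x_c = 1 and read off the pivot variables.
            x = [Fraction(0)] * k
            x[c] = Fraction(1)
            for row, pc in zip(range(r), pivots):
                x[pc] = -m[row][c]
            return x
        m[r], m[pr] = m[pr], m[r]
        m[r] = [v / m[r][c] for v in m[r]]
        for i in range(n):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return None  # a pivot in every column: independent

# Four vectors in R^3 -- a wide matrix, so a relation must exist:
vs = [(1, 2, -1), (7, 4, -2), (23, 16, -8), (3, 0, 4)]
x = dependence_relation(vs)
print(x)  # -2, -3, 1, 0 (as Fractions): -2*v1 - 3*v2 + v3 = 0
```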

Facts about linear independence

1. Two vectors are linearly dependent if and only if they are collinear, i.e., one is a scalar multiple of the other.
2. Any set containing the zero vector is linearly dependent.
3. If a subset of {v1,v2,…,vk} is linearly dependent, then {v1,v2,…,vk} is linearly dependent as well.

Proof

1. If v1 = cv2 then v1 − cv2 = 0, so {v1,v2} is linearly dependent. In the other direction, if x1v1 + x2v2 = 0 with x1 ≠ 0 (say), then v1 = −(x2/x1)v2.

2. It is easy to produce a linear dependence relation if one vector is the zero vector: for instance, if v1 = 0 then

1·v1 + 0·v2 + ··· + 0·vk = 0.

3. After reordering, we may suppose that {v1,v2,…,vr} is linearly dependent, with r < k. This means that there is an equation of linear dependence

x1v1+x2v2+···+xrvr=0,

with at least one of x1,x2,…,xr nonzero. This is also an equation of linear dependence among {v1,v2,…,vk}, since we can take the coefficients of vr+1,…,vk to all be zero.

With regard to the first fact, note that the zero vector is a multiple of any vector, so it is collinear with any other vector. Hence facts 1 and 2 are consistent with each other.

### Subsection 2.5.2 Criteria for Linear Independence

In this subsection we give two criteria for a set of vectors to be linearly independent. Keep in mind, however, that the actual definition is above.

Theorem

A set of vectors {v1,v2,…,vk} is linearly dependent if and only if one of the vectors is in the span of the other ones.

Any such vector may be removed without affecting the span.

Proof

Suppose, for instance, that v3 is in Span{v1,v2,v4}, so we have an equation like

v3 = 2v1 − (1/2)v2 + 6v4.

We can subtract v3 from both sides of the equation to get

0 = 2v1 − (1/2)v2 − v3 + 6v4.

This is a linear dependence relation.

In this case, any linear combination of v1,v2,v3,v4 is already a linear combination of v1,v2,v4:

x1v1 + x2v2 + x3v3 + x4v4 = x1v1 + x2v2 + x3(2v1 − (1/2)v2 + 6v4) + x4v4 = (x1 + 2x3)v1 + (x2 − (1/2)x3)v2 + (x4 + 6x3)v4.


Therefore, Span{v1,v2,v3,v4} is contained in Span{v1,v2,v4}. Any linear combination of v1,v2,v4 is also a linear combination of v1,v2,v3,v4 (with the v3-coefficient equal to zero), so Span{v1,v2,v4} is also contained in Span{v1,v2,v3,v4}, and thus they are equal.

In the other direction, if we have a linear dependence relation like

0 = 2v1 − (1/2)v2 + v3 − 6v4,

then we can move any nonzero term to the left side of the equation and divide by its coefficient:

v1 = (1/2)((1/2)v2 − v3 + 6v4).

This shows that v1 is in Span{v2,v3,v4}.

We leave it to the reader to generalize this proof for any set of vectors.

Warning

In a linearly dependent set {v1,v2,…,vk}, it is not generally true that *any* vector vj is in the span of the others, only that *at least one* of them is.

For example, the set {(1, 0), (2, 0), (0, 1)} is linearly dependent, but (0, 1) is not in the span of the other two vectors. Also see the figure below.

The previous theorem makes precise in what sense a set of linearly dependent vectors is redundant.

Theorem(Increasing Span Criterion)

A set of vectors {v1,v2,…,vk} is linearly independent if and only if, for every j, the vector vj is not in Span{v1,v2,…,vj−1}.

Proof

It is equivalent to show that {v1,v2,…,vk} is linearly dependent if and only if vj is in Span{v1,v2,…,vj−1} for some j. The “if” implication is an immediate consequence of the previous theorem. Suppose then that {v1,v2,…,vk} is linearly dependent. This means that some vj is in the span of the others. Choose the largest such j. We claim that this vj is in Span{v1,v2,…,vj−1}. If not, then

vj=x1v1+x2v2+···+xj−1vj−1+xj+1vj+1+···+xkvk

with not all of xj+1,…,xk equal to zero. Suppose for simplicity that xk ≠ 0. Then we can rearrange:

vk = −(1/xk)(x1v1 + x2v2 + ··· + xj−1vj−1 − vj + xj+1vj+1 + ··· + xk−1vk−1).

This says that vk is in the span of {v1,v2,…,vk−1}, which contradicts our assumption that vj is the last vector in the span of the others.

We can rephrase this as follows:

If you make a set of vectors by adding one vector at a time, and if the span gets bigger every time you add a vector, then your set is linearly independent.
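The increasing span criterion can be tested mechanically: the span gets bigger at step j exactly when the rank (the dimension of the span) of the first j vectors is j. A minimal sketch, with helper names of our own choosing and rank computed by plain Gaussian elimination over the rationals:

```python
from fractions import Fraction

def rank(vectors):
    """Dimension of the span of the given vectors (rank of the matrix
    whose rows are the vectors), via Gaussian elimination."""
    m = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for c in range(len(m[0])):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        for i in range(r + 1, len(m)):
            m[i] = [a - (m[i][c] / m[r][c]) * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent_by_increasing_span(vectors):
    """The set is independent iff the span grows at every step,
    i.e. the first j vectors always span a j-dimensional space."""
    return all(rank(vectors[:j]) == j for j in range(1, len(vectors) + 1))

print(independent_by_increasing_span([(1, 0, 0), (1, 1, 0), (1, 1, 1)]))  # True
print(independent_by_increasing_span([(1, 0, 0), (0, 1, 0), (1, 1, 0)]))  # False
```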

### Subsection 2.5.3 Pictures of Linear Independence

A set containing one vector {v} is linearly independent when v ≠ 0, since xv = 0 implies x = 0.

[Figure: a nonzero vector v and the line Span{v}]

A set of two noncollinear vectors {v,w} is linearly independent:

[Figure: noncollinear vectors v and w, with the lines Span{v} and Span{w}]

The set of three vectors {v,w,u} below is linearly dependent:

In the picture below, note that v is in Span{u,w}, and w is in Span{u,v}, so we can remove any of the three vectors without shrinking the span.

[Figure: vectors v, w, u with u in the plane Span{v,w}; the lines Span{v} and Span{w} are shown]

Two collinear vectors are always linearly dependent:

[Figure: collinear vectors v and w on the line Span{v}]

These three vectors {v,w,u} are linearly dependent: indeed, {v,w} is already linearly dependent, so we can use the third fact.

[Figure: collinear vectors v and w on the line Span{v}, together with a third vector u]

Interactive: Linear independence of two vectors in R2

Interactive: Linear dependence of three vectors in R2

The two vectors {v,w} below are linearly independent because they are not collinear.

[Figure: noncollinear vectors v and w in R3, with the lines Span{v} and Span{w}]

The three vectors {v,w,u} below are linearly independent: the span got bigger when we added w, then again when we added u, so we can apply the increasing span criterion.

[Figure: vectors v, w, u in R3, with the lines Span{v}, Span{w} and the plane Span{v,w}]

The three coplanar vectors {v,w,u} below are linearly dependent:

[Figure: vectors v, w, u in R3 lying in the plane Span{v,w}]

Note that three vectors are linearly dependent if and only if they are *coplanar*. Indeed, {v,w,u} is linearly dependent if and only if one vector is in the span of the other two, which is a plane (or a line, or {0}).

The four vectors {v,w,u,x} below are linearly dependent: they are the columns of a wide matrix. Note however that u is not contained in Span{v,w,x}. See the warning above.

[Figure: vectors v, w, u, x in R3, with the lines Span{v}, Span{w} and the plane Span{v,w}]

Figure 20 The vectors {v,w,u,x} are linearly dependent, but u is not contained in Span{v,w,x}.

Interactive: Linear independence of two vectors in R3

Interactive: Linear independence of three vectors in R3

### Subsection 2.5.4 Linear Dependence and Free Variables

In light of this important note and this criterion, it is natural to ask which columns of a matrix are redundant, i.e., which we can remove without affecting the column span.

Theorem

Let v1,v2,…,vk be vectors in Rn, and consider the matrix

A = [ v1 v2 ··· vk ].

Then we can delete the columns of A *without* pivots (the columns corresponding to the free variables), without changing Span{v1,v2,…,vk}.

The pivot columns are linearly independent, so we cannot delete any more columns without changing the span.

Proof

If the matrix is in reduced row echelon form:

A = [ 1 0 2 0 ]
    [ 0 1 3 0 ]
    [ 0 0 0 1 ]

then the column without a pivot is visibly in the span of the pivot columns:

(2, 3, 0) = 2·(1, 0, 0) + 3·(0, 1, 0) + 0·(0, 0, 1),

and the pivot columns are linearly independent:

(0, 0, 0) = x1(1, 0, 0) + x2(0, 1, 0) + x4(0, 0, 1) = (x1, x2, x4), which forces x1 = x2 = x4 = 0.

If the matrix is not in reduced row echelon form, then we row reduce:

A = [  1  7  23  3 ]             [ 1 0 2 0 ]
    [  2  4  16  0 ]  --RREF-->  [ 0 1 3 0 ]
    [ −1 −2  −8  4 ]             [ 0 0 0 1 ]

The following two vector equations have the same solution set, as they come from row-equivalent matrices:

x1(1, 2, −1) + x2(7, 4, −2) + x3(23, 16, −8) + x4(3, 0, 4) = 0
x1(1, 0, 0) + x2(0, 1, 0) + x3(2, 3, 0) + x4(0, 0, 1) = 0.

We conclude that

(23, 16, −8) = 2·(1, 2, −1) + 3·(7, 4, −2) + 0·(3, 0, 4)

and that

x1(1, 2, −1) + x2(7, 4, −2) + x4(3, 0, 4) = 0

has only the trivial solution.
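The computation above can be reproduced mechanically. This sketch (function names ours) row reduces the same matrix A over the rationals, reports the pivot columns, and checks that the non-pivot column of A is the combination read off from the RREF:

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form over the rationals, plus the
    (zero-based) indices of the pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        m[r] = [x / m[r][c] for x in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def col(M, j):
    return [row[j] for row in M]

A = [[1, 7, 23, 3], [2, 4, 16, 0], [-1, -2, -8, 4]]
R, pivots = rref(A)
print(pivots)  # [0, 1, 3]: the first, second, and fourth columns are pivot columns
print(R)       # rows [1,0,2,0], [0,1,3,0], [0,0,0,1], matching the RREF above

# Column 3 of A equals 2*(column 1) + 3*(column 2), read off from column 3 of R:
assert col(A, 2) == [2 * a + 3 * b for a, b in zip(col(A, 0), col(A, 1))]
```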

Note that it is necessary to row reduce A to find its pivot columns. However, the span of the columns of the row reduced matrix is generally *not* equal to the span of the columns of A: one must use the pivot columns of the *original* matrix. See the theorem in Section 2.7 for a restatement of the above theorem.

Example

Pivot Columns and Dimension

Let d be the number of pivot columns in the matrix

A=E|||v1v2···vk|||F.

- If d = 1, then Span{v1,v2,…,vk} is a line.
- If d = 2, then Span{v1,v2,…,vk} is a plane.
- If d = 3, then Span{v1,v2,…,vk} is a 3-space.
- Et cetera.


The number d is called the dimension. We discussed this notion in this important note in Section 2.4. We will define this concept rigorously in Section 2.7.