$$ %---- MACROS FOR SETS ----% \newcommand{\znz}[1]{\mathbb{Z} / #1 \mathbb{Z}} \newcommand{\twoheadrightarrowtail}{\mapsto\mathrel{\mspace{-15mu}}\rightarrow} % popular set names \newcommand{\N}{\mathbb{N}} \newcommand{\Z}{\mathbb{Z}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\R}{\mathbb{R}} \newcommand{\C}{\mathbb{C}} \newcommand{\I}{\mathbb{I}} % popular vector space notation \newcommand{\V}{\mathbb{V}} \newcommand{\W}{\mathbb{W}} \newcommand{\B}{\mathbb{B}} \newcommand{\D}{\mathbb{D}} %---- MACROS FOR FUNCTIONS ----% % linear algebra \newcommand{\T}{\mathrm{T}} \renewcommand{\ker}{\mathrm{ker}} \newcommand{\range}{\mathrm{range}} \renewcommand{\span}{\mathrm{span}} \newcommand{\rref}{\mathrm{rref}} \renewcommand{\dim}{\mathrm{dim}} \newcommand{\col}{\mathrm{col}} \newcommand{\nullspace}{\mathrm{null}} \newcommand{\row}{\mathrm{row}} \newcommand{\rank}{\mathrm{rank}} \newcommand{\nullity}{\mathrm{nullity}} \renewcommand{\det}{\mathrm{det}} \newcommand{\proj}{\mathrm{proj}} \renewcommand{\H}{\mathrm{H}} \newcommand{\trace}{\mathrm{trace}} \newcommand{\diag}{\mathrm{diag}} \newcommand{\card}{\mathrm{card}} \newcommand\norm[1]{\left\lVert#1\right\rVert} % differential equations \newcommand{\laplace}[1]{\mathcal{L}\{#1\}} \newcommand{\F}{\mathrm{F}} % misc \newcommand{\sign}{\mathrm{sign}} \newcommand{\softmax}{\mathrm{softmax}} \renewcommand{\th}{\mathrm{th}} \newcommand{\adj}{\mathrm{adj}} \newcommand{\hyp}{\mathrm{hyp}} \renewcommand{\max}{\mathrm{max}} \renewcommand{\min}{\mathrm{min}} \newcommand{\where}{\mathrm{\ where\ }} \newcommand{\abs}[1]{\vert #1 \vert} \newcommand{\bigabs}[1]{\big\vert #1 \big\vert} \newcommand{\biggerabs}[1]{\Bigg\vert #1 \Bigg\vert} \newcommand{\equivalent}{\equiv} \newcommand{\cross}{\times} % statistics \newcommand{\cov}{\mathrm{cov}} \newcommand{\var}{\mathrm{var}} \newcommand{\bias}{\mathrm{bias}} \newcommand{\E}{\mathrm{E}} \newcommand{\prob}{\mathrm{prob}} \newcommand{\unif}{\mathrm{unif}} \newcommand{\invNorm}{\mathrm{invNorm}} 
\newcommand{\invT}{\mathrm{invT}} % real analysis \renewcommand{\sup}{\mathrm{sup}} \renewcommand{\inf}{\mathrm{inf}} %---- MACROS FOR ALIASES AND REFORMATTING ----% % logic \newcommand{\forevery}{\ \forall\ } \newcommand{\OR}{\lor} \newcommand{\AND}{\land} \newcommand{\then}{\implies} % set theory \newcommand{\impropersubset}{\subseteq} \newcommand{\notimpropersubset}{\nsubseteq} \newcommand{\propersubset}{\subset} \newcommand{\notpropersubset}{\not\subset} \newcommand{\union}{\cup} \newcommand{\Union}[2]{\bigcup\limits_{#1}^{#2}} \newcommand{\intersect}{\cap} \newcommand{\Intersect}[2]{\bigcap\limits_{#1}^{#2}} \newcommand{\intersection}[2]{\bigcap\limits_{#1}^{#2}} \newcommand{\Intersection}[2]{\bigcap\limits_{#1}^{#2}} \newcommand{\closure}{\overline} \newcommand{\compose}{\circ} % linear algebra \newcommand{\subspace}{\le} \newcommand{\angles}[1]{\langle #1 \rangle} \newcommand{\identity}{\mathbb{1}} \newcommand{\orthogonal}{\perp} \renewcommand{\parallel}[1]{#1^{||}} % calculus \newcommand{\integral}[2]{\int\limits_{#1}^{#2}} \newcommand{\limit}[1]{\lim\limits_{#1}} \newcommand{\approaches}{\rightarrow} \renewcommand{\to}{\rightarrow} \newcommand{\convergesto}{\rightarrow} % algebra \newcommand{\summation}[2]{\sum\limits_{#1}^{#2}} \newcommand{\product}[2]{\prod\limits_{#1}^{#2}} \newcommand{\by}{\times} \renewcommand{\integral}[2]{\int_{#1}^{#2}} % exists commands \newcommand{\notexist}{\nexists\ } \newcommand{\existsatleastone}{\exists\ } \newcommand{\existsonlyone}{\exists!} \newcommand{\existsunique}{\exists!} \let\oldexists\exists \renewcommand{\exists}{\ \oldexists\ } % statistics \newcommand{\distributed}{\sim} \newcommand{\onetoonecorresp}{\sim} \newcommand{\independent}{\perp\!\!\!\perp} \newcommand{\conditionedon}{\ |\ } \newcommand{\given}{\ |\ } \newcommand{\notg}{\ngtr} \newcommand{\yhat}{\hat{y}} \newcommand{\betahat}{\hat{\beta}} \newcommand{\sigmahat}{\hat{\sigma}} \newcommand{\muhat}{\hat{\mu}} \newcommand{\transmatrix}{\mathrm{P}} 
\renewcommand{\choose}{\binom} % misc \newcommand{\infinity}{\infty} \renewcommand{\bold}{\textbf} \newcommand{\italics}{\textit} $$

Properties of a Basis

A set of vectors, B, is a basis for a vector space if B is linearly independent and B spans the vector space. Intuitively, a basis for a vector space is any set of vectors that can serve as a coordinate system for that vector space. This coordinate system is analogous to picking a color using RGB (Red, Green, Blue), where RGB is the coordinate system and the values for each channel are the coordinates. For example, the coordinates (255, 255, 255) correspond to the color white, since 255 for Red, 255 for Green, and 255 for Blue together make up the color white.

Other facts about a basis:

  • it’s the smallest possible spanning set for the vector space
  • it’s the largest possible linearly independent set in the vector space
  • a basis allows us to identify abstract vector spaces with coordinates
  • the zero vector can’t be in a basis, since any set containing it is linearly dependent
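As a quick sanity check of the two defining conditions (a minimal sketch in plain Python; the candidate vectors are invented for illustration): for n vectors in R^n, a nonzero determinant is equivalent to linear independence, which in turn is equivalent to spanning, so one determinant settles both conditions at once.

```python
def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# candidate basis vectors for R^3, written as the rows of a matrix
B = [[1, 1, 1],
     [0, 1, 1],
     [0, 0, 1]]

# for 3 vectors in R^3: nonzero determinant <=> independent <=> spanning
print(det3(B) != 0)  # True, so these three vectors form a basis for R^3
```

Note that this shortcut only applies when the number of vectors equals the dimension; a set of fewer vectors can be independent without spanning, and a larger set can span without being independent.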

Bases for the kernel and the range

To find a basis for the kernel of a matrix, put the matrix into row-reduced echelon form, augment it with a column of 0s, and solve the resulting homogeneous system by expressing the pivot variables in terms of the free variables; setting each free variable to 1 in turn (and the others to 0) produces the basis vectors.
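This procedure can be sketched in plain Python using exact rational arithmetic (the example matrix is invented): row-reduce, then for each free variable read the pivot variables off the rref.

```python
from fractions import Fraction

def rref(M):
    """Return (row-reduced echelon form of M, list of pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:                       # no pivot here -> free variable
            continue
        A[r], A[pr] = A[pr], A[r]
        A[r] = [x / A[r][c] for x in A[r]]   # scale the pivot to 1
        for i in range(rows):                # clear the rest of the column
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

def kernel_basis(M):
    """One basis vector per free variable: set it to 1, the other free
    variables to 0, and read the pivot variables from the rref.  The
    augmented column of 0s never changes, so it is left implicit."""
    R, pivots = rref(M)
    cols = len(M[0])
    basis = []
    for f in (c for c in range(cols) if c not in pivots):
        v = [Fraction(0)] * cols
        v[f] = Fraction(1)
        for row, p in enumerate(pivots):
            v[p] = -R[row][f]                # pivot variable in terms of the free one
        basis.append(v)
    return basis

# rows are multiples of each other, so there is one pivot and two free variables
ker = kernel_basis([[1, 2, 3],
                    [2, 4, 6]])              # basis: (-2, 1, 0) and (-3, 0, 1)
```

Each returned vector really is in the kernel: for example, 1·(-2) + 2·1 + 3·0 = 0.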

To find a basis for the range of a matrix, put the matrix into row-reduced echelon form and keep the columns of the original matrix that correspond to the pivot columns of the row-reduced form.
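The same row reduction gives the range as well (a sketch with an invented matrix): the pivot positions are found in the rref, but the basis vectors are taken from the original matrix, since row operations change the column space.

```python
from fractions import Fraction

def rref(M):
    """Return (row-reduced echelon form of M, list of pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:
            continue
        A[r], A[pr] = A[pr], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(rows):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

def range_basis(M):
    """Columns of the *original* matrix at the rref's pivot positions."""
    _, pivots = rref(M)
    return [[row[c] for row in M] for c in pivots]

M = [[1, 2, 1],
     [2, 4, 0],
     [3, 6, 0]]
# column 1 is twice column 0, so only columns 0 and 2 carry pivots:
# the basis for the range is {(1, 2, 3), (1, 0, 0)}
```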

A set bigger than the basis is not linearly independent

If a vector space has a basis containing n vectors, then any set containing more than n vectors can’t be linearly independent. To show this, consider a vector space with a basis that contains n vectors, so the vector space is n-dimensional. Now consider a set of vectors, S, that contains more than n vectors. Each vector in S can be identified with its coordinate vector in R^n with respect to the basis. A linear relation among the vectors of S is then a homogeneous system with n equations (one per coordinate) and more than n unknowns (one per vector in S), and such a system always has a nontrivial solution. Therefore S must be a linearly dependent set.
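One way to see this concretely (a sketch; the four vectors are invented): stack the vectors as the rows of a matrix and compute its rank. In R^3 the rank is at most 3, so four rows can never be independent.

```python
from fractions import Fraction

def rank(M):
    """Rank via Gaussian elimination over exact rationals."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        pr = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if pr is None:
            continue
        A[r], A[pr] = A[pr], A[r]
        for i in range(r + 1, len(A)):
            if A[i][c] != 0:
                m = A[i][c] / A[r][c]
                A[i] = [a - m * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# four vectors in R^3, written as rows
S = [[1, 0, 2],
     [0, 1, 1],
     [1, 1, 0],
     [2, 1, 3]]
# the rank can be at most 3, so the four vectors must be linearly dependent
print(rank(S))  # 3
```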

If a vector space is n-dimensional, then any linearly independent set with n vectors is a basis

If the dimension of a vector space is n, then any set of n linearly independent vectors in the vector space is a basis. To prove this, let V be an n-dimensional vector space and S be a linearly independent set of n vectors. By way of contradiction, assume span(S) is not equal to V; then there exists a vector x in V such that x is not in span(S). That is, x can’t be written as a linear combination of vectors in S, so if we add x to S, then S remains linearly independent. But now S contains n+1 vectors which, by the theorem above, means that S is linearly dependent; this contradicts the fact that adding x preserved linear independence. Thus span(S) = V, and since S is linearly independent, it’s also a basis.

Unique representation

Any vector in a vector space, V, can be written as a linear combination of the vectors in one of its bases, B, in one and only one way. To show this, assume by way of contradiction that there’s more than one way to represent a vector x. Since span(B) = V, x can be written as a linear combination of the vectors in B: x = Bc, where the columns of B are the basis vectors and c holds the coefficients. Since we’re assuming x can be written in another way, say it can also be written as x = Bd. Subtracting the two representations gives the linear relation x - x = B(c - d) = 0. Since the columns of B are linearly independent, every coefficient in this linear relation must be 0, so c = d. Therefore, there’s really only one way to represent the vector x.
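A minimal sketch of recovering those unique coefficients, using Cramer’s rule (the basis and the vector x are invented for illustration): with the basis vectors as the columns of B, det(B) ≠ 0, so the system Bc = x has exactly one solution.

```python
from fractions import Fraction

def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def coordinates(basis, x):
    """Solve B c = x by Cramer's rule: c_j = det(B with column j
    replaced by x) / det(B).  det(B) != 0 makes the solution unique."""
    B = [[basis[j][i] for j in range(3)] for i in range(3)]  # basis vectors as columns
    d = det3(B)
    assert d != 0, "the given vectors are not a basis"
    coords = []
    for j in range(3):
        Bj = [row[:] for row in B]
        for i in range(3):
            Bj[i][j] = x[i]                  # replace column j by x
        coords.append(Fraction(det3(Bj), d))
    return coords

basis = [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
c = coordinates(basis, [6, 3, 1])
# c = [3, 2, 1]: x = 3*b1 + 2*b2 + 1*b3, and no other combination works
```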

All bases of a finite dimensional vector space contain the same number of vectors

To prove this, consider a basis B with n vectors and another basis D with m vectors. Since B is a basis (so it spans) and D is linearly independent, the theorem above gives m ≤ n. By the same argument with the roles swapped, since D is a basis and B is linearly independent, n ≤ m. Therefore m = n, so any two bases of a finite-dimensional vector space must have the same number of vectors.