# Why Does a Vector Space Have a Basis (Module Theory)

## Module and vector space

First we recall some background. Suppose \(A\) is a ring with multiplicative identity \(1_A\). A **left module** over \(A\) is an additive abelian group \((M,+)\), together with a scalar multiplication \(A \times M \to M\), written \((a,x) \mapsto ax\), such that \[
\begin{aligned}
(a+b)x &= ax+bx \\
a(x+y) &= ax+ay \\
a(bx) &= (ab)x \\
1_Ax &= x
\end{aligned}
for \(x,y \in M\) and \(a,b \in A\). As a corollary, we see \((0_A+0_A)x=0_Ax=0_Ax+0_Ax\), which shows \(0_Ax=0_M\) for all \(x \in M\). On the other hand, \(a(x-x)=0_M\) implies \(a(-x)=-(ax)\). We can also define right \(A\)-modules, but we will not discuss them here.
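These axioms are easy to check mechanically for a concrete example. Below is a small sketch (the setup and names are mine, not from the text) that verifies them numerically for the \(\mathbb{Z}\)-module \(\mathbb{Z}/6\mathbb{Z}\), where the action of \(m \in \mathbb{Z}\) on a residue \(x\) is \(mx \bmod 6\):

```python
# Checking the four left-module axioms for Z acting on Z/6Z.
n = 6

def act(a, x):
    """Scalar multiplication Z x Z/nZ -> Z/nZ, i.e. (a*x) mod n."""
    return (a * x) % n

residues = range(n)
scalars = range(-10, 11)

for a in scalars:
    for b in scalars:
        for x in residues:
            for y in residues:
                # (a+b)x = ax + bx
                assert act(a + b, x) == (act(a, x) + act(b, x)) % n
                # a(x+y) = ax + ay
                assert act(a, (x + y) % n) == (act(a, x) + act(a, y)) % n
                # a(bx) = (ab)x
                assert act(a, act(b, x)) == act(a * b, x)
for x in residues:
    # 1_A x = x
    assert act(1, x) == x
print("all module axioms hold for Z/6Z")
```

Of course, a finite check like this is only an illustration; the axioms hold for any \(n\) by the ring laws of \(\mathbb{Z}\).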

Let \(S\) be a subset of \(M\). We say \(S\) is a **basis** of \(M\) if \(S\) generates \(M\) and \(S\) is linearly independent. That is, for every \(m \in M\), we can pick \(s_1,\cdots,s_n \in S\) and \(a_1,\cdots,a_n \in A\) such that \[
m = a_1s_1+a_2s_2+\cdots+a_ns_n,
\] and, for any distinct \(s_1,\cdots,s_n \in S\), we have \[
a_1s_1+a_2s_2+\cdots+a_ns_n=0_M \implies a_1=a_2=\cdots=a_n=0_A.
\] Note this also shows that \(0_M\notin S\) (what happens if \(0_M \in S\)?). We say \(M\) is **free** if it has a basis. The case where \(M\) or \(A\) is trivial is excluded.
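Over a field such as the rationals, linear independence of finitely many vectors can be tested concretely by Gaussian elimination with exact arithmetic. The following helper is a sketch of mine, not part of the text:

```python
# Test linear independence of row vectors over Q by computing the rank
# with exact rational arithmetic (no floating-point issues).
from fractions import Fraction

def is_independent(vectors):
    """Return True iff the given rows are linearly independent over Q."""
    rows = [[Fraction(c) for c in v] for v in vectors]
    rank = 0
    cols = len(rows[0]) if rows else 0
    for col in range(cols):
        # find a pivot in this column at or below row `rank`
        pivot = next((r for r in range(rank, len(rows))
                      if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # eliminate this column from every other row
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                factor = rows[r][col] / rows[rank][col]
                rows[r] = [a - factor * b
                           for a, b in zip(rows[r], rows[rank])]
        rank += 1
    # independent iff the rank equals the number of vectors
    return rank == len(rows)

print(is_independent([[1, 0], [0, 1]]))   # True: standard basis of Q^2
print(is_independent([[1, 2], [2, 4]]))   # False: second row is twice the first
```

Note that this relies on dividing by the pivot, i.e. on every nonzero scalar being invertible; the same procedure fails over a general ring, which foreshadows the examples below.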

If \(A\) is a field, then \(M\) is called a **vector space**, which is exactly the object we study in linear algebra and functional analysis. Analysts may be interested in the cardinality of a basis, for example whether a vector space is finite-dimensional or whether it has a countable basis. But a basis does not come from nowhere. In fact we can prove that every vector space has a basis, but modules are not so lucky. \(\def\mb{\mathbb}\)

## Examples of non-free modules

First of all let's consider the cyclic group \(\mb{Z}/n\mb{Z}\) for \(n \geq 2\). If we define \[
\begin{aligned}
\mb{Z} \times \mb{Z}/n\mb{Z} &\to \mb{Z}/n\mb{Z} \\
(m,k+n\mb{Z}) &\mapsto mk+n\mb{Z}
\end{aligned}
\] which is just adding an element to itself \(m\) times, then we get a module, which will be denoted by \(M\). For any \(x=k+n\mb{Z} \in M\), we see \(nx=nk+n\mb{Z}=0_M\). Therefore for **any** nonempty subset \(S \subset M\), if \(x_1,\cdots,x_k \in S\), we have \[
nx_1+nx_2+\cdots+nx_k = 0_M,
\] a nontrivial relation (every coefficient equals \(n \neq 0\)), so no subset of \(M\) is linearly independent and \(M\) has no basis. In fact this can be generalized further. If \(A\) is a ring but not a field and \(I\) is a nontrivial proper ideal of \(A\), then \(A/I\) is an \(A\)-module that has no basis: any nonzero \(a \in I\) satisfies \(a(x+I)=0_{A/I}\) for every \(x+I \in A/I\).

Following \(\mb{Z}/n\mb{Z}\), here is another example concerning elements of finite order. Indeed, **any nontrivial finite abelian group is not free as a module over \(\mb{Z}\).** More generally,

Let \(G\) be an abelian group, and let \(G_{tor}\) be its torsion subgroup. If \(G_{tor}\) is nontrivial, then \(G\) cannot be a free module over \(\mb{Z}\).

Next we shall take a look at infinite rings. Let \(F[X]\) be the polynomial ring over a field \(F\) and let \(F'[X]\) be the subring of polynomials whose coefficient of \(X\) is \(0\). Then \(F[X]\) is an \(F'[X]\)-module. However, it is not free.

Suppose we had a basis \(S\) of \(F[X]\); we claim that \(|S|>1\). If \(|S|=1\), say \(S=\{P\}\), then \(P\) cannot generate \(F[X]\): if \(P\) is constant, then we cannot generate any polynomial whose coefficient of \(X\) is nonzero; if \(P\) is not constant, then the nonzero constant polynomials cannot be generated. Hence \(S\) contains at least two distinct polynomials, say \(P_1 \neq 0\) and \(P_2 \neq 0\). However, note that \(-X^2P_1 \in F'[X]\) and \(X^2P_2 \in F'[X]\), which gives the nontrivial relation \[ (X^2P_2)P_1-(X^2P_1)P_2=0. \] Hence \(S\) cannot be a basis.
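The relation above can be verified by hand, or mechanically. Here is a sketch (polynomials represented as coefficient lists \([c_0, c_1, c_2, \cdots]\); the helpers and the sample polynomials are mine, not from the text):

```python
# Verify that for any two polynomials P1, P2, the F'[X]-coefficients
# X^2*P2 and -X^2*P1 give a nontrivial relation (X^2 P2)P1 - (X^2 P1)P2 = 0.

def pmul(p, q):
    """Multiply two polynomials given as coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def psub(p, q):
    """Subtract polynomials, padding the shorter coefficient list."""
    m = max(len(p), len(q))
    p = p + [0] * (m - len(p))
    q = q + [0] * (m - len(q))
    return [a - b for a, b in zip(p, q)]

X2 = [0, 0, 1]        # X^2, which lies in F'[X]
P1 = [1, 2]           # sample: P1 = 1 + 2X
P2 = [3, 0, 5]        # sample: P2 = 3 + 5X^2

# X^2*P1 and X^2*P2 have zero coefficient of X, hence lie in F'[X]
assert pmul(X2, P1)[1] == 0 and pmul(X2, P2)[1] == 0

# the relation (X^2 P2) P1 - (X^2 P1) P2 vanishes identically
relation = psub(pmul(pmul(X2, P2), P1), pmul(pmul(X2, P1), P2))
assert all(c == 0 for c in relation)
print("nontrivial F'[X]-relation between P1 and P2 found")
```

Both products equal \(X^2P_1P_2\), so the difference is zero no matter which \(P_1, P_2\) we choose; the point is that the coefficients themselves are nonzero elements of \(F'[X]\).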

## Why does a vector space have a basis

I hope those examples have convinced you that a basis is not something to be taken for granted. We are now going to prove that every vector space has a basis. More precisely,

Let \(V\) be a nontrivial vector space over a field \(K\). Let \(\Gamma\) be a set of generators of \(V\) over \(K\) and let \(S \subset \Gamma\) be a linearly independent subset. Then there exists a basis \(B\) of \(V\) such that \(S \subset B \subset \Gamma\).

Note we can always find such \(\Gamma\) and \(S\): in the extreme case, we can pick \(\Gamma=V\) and let \(S\) contain any single nonzero element of \(V\). Note this also shows that we can obtain a basis by expanding any linearly independent set. The proof relies on the fact that every nonzero element of a field is invertible, and also on Zorn's lemma. In fact, the axiom of choice is equivalent to the statement that every vector space has a basis; we will not prove the converse here. \(\def\mfk{\mathfrak}\)

*Proof.* Define \[
\mfk{T} =\{T \subset \Gamma:S \subset T, \text{ $T$ is linearly independent}\}.
\] Then \(\mfk{T}\) is not empty since it contains \(S\). If \(\{T_i\}_{i \in I}\) is a totally ordered chain in \(\mfk{T}\), then \(T=\bigcup_{i \in I}T_i\) is again linearly independent and contains \(S\). To show that \(T\) is linearly independent, note that if \(x_1,x_2,\cdots,x_n \in T\), we can find indices \(i_1,\cdots,i_n \in I\) such that \(x_j \in T_{i_j}\) for \(j=1,2,\cdots,n\). Since the chain is totally ordered, one of these sets, say \(T_{i_k}\), contains all the others, so \[
x_1,x_2,\cdots,x_n \in \bigcup_{j=1}^{n}T_{i_j}=T_{i_k}.
\] But we already know that \(T_{i_k}\) is linearly independent, so \(a_1x_1+\cdots+a_nx_n=0_V\) implies \(a_1=\cdots=a_n=0_K\).

By Zorn's lemma, \(\mfk{T}\) has a maximal element \(B\), which is linearly independent since it is an element of \(\mfk{T}\). Next we show that \(B\) generates \(V\). Suppose not; since \(\Gamma\) generates \(V\), there is some \(x \in \Gamma\) that is not generated by \(B\). Define \(B'=B \cup \{x\}\). We claim \(B'\) is linearly independent as well: if we pick distinct \(y_1,y_2,\cdots,y_n \in B\) and \[ \sum_{k=1}^{n}a_ky_k+bx=0_V, \] then \(b \neq 0\) would give \[ x = -\sum_{k=1}^{n}b^{-1}a_ky_k, \] so \(x\) would be generated by \(B\), contradicting our choice of \(x\). Hence \(b=0_K\), and the linear independence of \(B\) then forces \(a_1=\cdots=a_n=0_K\). But now \(B'\) is a linearly independent set containing \(S\) and contained in \(\Gamma\) that strictly contains \(B\), contradicting the maximality of \(B\) in \(\mfk{T}\). Hence \(B\) generates \(V\). \(\square\)