# Characters in Analysis and Algebra

Let $M$ be a monoid and $K$ a field. By a character of $M$ in $K$ we mean a monoid homomorphism
$$\chi: M \to K^\ast.$$

By the trivial character we mean the character such that $\chi(M)=\{1\}$. We are particularly interested in the linear independence of characters. Functions $f_i:M \to K$ are called linearly independent over $K$ if whenever
$$a_1f_1+\cdots+a_nf_n=0$$
with all $a_i \in K$, we have $a_i=0$ for all $i$. $\def\Tr{\operatorname{Tr}}$

## Characters in Fourier Analysis

In Fourier analysis we are always interested in functions like $f(x)=e^{-inx}$ or $g(x)= e^{-ixt}$, corresponding to Fourier series (integration on $\mathbb{R}/2\pi\mathbb{Z}$) and the Fourier transform respectively. Mathematicians later realised that everything can be set in a locally compact abelian (LCA) group. For this reason we need to generalise these functions, and the bounded ones coincide with our definition of characters.

Let $G$ be an LCA group. A function $\gamma:G \to \mathbb{C}$ is called a character if $|\gamma(x)|=1$ for all $x \in G$ and
$$\gamma(x+y)=\gamma(x)\gamma(y) \quad \text{for all } x,y \in G.$$

Note that since $G$ is in particular a monoid, this coincides with our earlier definition of character. The set of continuous characters forms a group $\Gamma$, which is called the dual group of $G$.

If $G=\mathbb{R}$, solving the functional equation $\gamma(x+y)=\gamma(x)\gamma(y)$ (in whatever way one likes) yields $\gamma(x)=e^{Ax}$ for some $A \in \mathbb{C}$. But $|e^{Ax}| \equiv 1$ (or merely boundedness) forces $A$ to be purely imaginary, say $A=it$, and then $\gamma(x)=e^{itx}$. Hence the dual group of $\mathbb{R}$ can be described by (the speed of) rotation on the unit circle.
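As a quick numerical sanity check (a minimal sketch; the value of $t$ and the sample points are arbitrary choices of mine), one can verify that $\gamma(x)=e^{itx}$ satisfies both defining properties of a character of $(\mathbb{R},+)$:

```python
import cmath

t = 1.7  # arbitrary real parameter; gamma(x) = exp(i*t*x)

def gamma(x):
    return cmath.exp(1j * t * x)

# |gamma(x)| = 1 and gamma(x + y) = gamma(x) * gamma(y) for all x, y.
for x, y in [(0.3, -1.2), (2.0, 5.5), (-4.1, 0.0)]:
    assert abs(abs(gamma(x)) - 1) < 1e-12
    assert abs(gamma(x + y) - gamma(x) * gamma(y)) < 1e-12
print("gamma(x) = exp(i*t*x) is a character of (R, +)")
```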

With this we have our generalised version of the Fourier transform. Let $G$ be an LCA group and $f \in L^1(G)$; then the Fourier transform of $f$ is given by
$$\hat{f}(\gamma)=\int_G f(x)\overline{\gamma(x)}\,dx, \qquad \gamma \in \Gamma.$$

One can verify that $\hat{f}$ is exactly the Gelfand transform of $f$, the steps of which will be sketched below. On one hand, $f \mapsto \hat{f}(\gamma)$ is indeed a Banach algebra homomorphism $L^1(G) \to \mathbb{C}$ for every $\gamma \in \Gamma$; this is a plain application of Fubini’s theorem. On the other hand, let $h:L^1(G) \to \mathbb{C}$ be any non-trivial Banach algebra homomorphism. One can show that $\|h\|=1$, hence $h$ is a bounded linear functional. By the Riesz representation theorem (the duality of $L^1$ and $L^\infty$), there is some $\phi \in L^\infty(G)$ with $\|\phi\|_\infty = 1$ such that
$$h(f)=\int_G f(x)\phi(x)\,dx.$$

We can indeed assume that $\phi$ is continuous. Since $h$ is an algebra homomorphism, $h(f \ast g)=h(f)h(g)$, which translates into
$$\phi(x+y)=\phi(x)\phi(y).$$

We know that $|\phi(x)| \le 1$ but $\phi(-x)=\phi(x)^{-1}$ forces $|\phi(x)|=1$. The proof is done after some routine verification of uniqueness.

Indeed, with this identification, we can also identify $\Gamma$ as the maximal ideal space of $L^1(G)$, which results in the following interesting characterisation.

Theorem If $G$ is discrete, then $\Gamma$ is compact; if $G$ is compact, then $\Gamma$ is discrete.

Proof. If $G$ is discrete, then $L^1(G)$ has a unit (the point mass at the identity). The maximal ideal space of a commutative unital Banach algebra, which here can be identified with $\Gamma$, is a compact Hausdorff space.

If $G$ is compact, then its Haar measure can be normalised so that $m(G)=1$. We prove that the singleton containing the trivial character alone is an open set. Let $\gamma \in \Gamma$ be a character $\ne 1$; then there exists some $x_0$ such that $\gamma(x_0) \ne 1$. As a result, by translation invariance,
$$\int_G \gamma(x)\,dx=\int_G \gamma(x+x_0)\,dx=\gamma(x_0)\int_G \gamma(x)\,dx,$$
and hence $\int_G\gamma(x)\,dx=0$. If $\gamma=1$ then $\int_G \gamma(x)\,dx=1$.

Besides, the compactness of $G$ implies that the constant function $f \equiv 1$ is in $L^1(G)$. As a result, $\hat{f}(1)=1$ but $\hat{f}(\gamma)=0$ whenever $\gamma \ne 1$. Since $\hat{f}$ is continuous, $\{\gamma:\hat{f}(\gamma) \ne 0\}=\{1\}$ is open, and by translation every singleton of $\Gamma$ is open, i.e. $\Gamma$ is discrete. $\square$
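The vanishing integral in the proof can be seen concretely on the finite group $\mathbb{Z}/n\mathbb{Z}$ (both compact and discrete), whose characters are $\gamma_k(x)=e^{2\pi i kx/n}$; the normalised Haar "integral" is just the average over the group. A minimal numerical sketch (the choice $n=12$ is arbitrary):

```python
import cmath

n = 12

def character(k):
    # gamma_k(x) = exp(2*pi*i*k*x/n), a character of Z/nZ
    return lambda x: cmath.exp(2j * cmath.pi * k * x / n)

def integral(gamma):
    # "Integral" over Z/nZ with normalised Haar measure: the average over the group.
    return sum(gamma(x) for x in range(n)) / n

for k in range(n):
    val = integral(character(k))
    expected = 1 if k == 0 else 0  # 1 for the trivial character, 0 otherwise
    assert abs(val - expected) < 1e-9
print("the average of gamma_k over Z/12Z is 1 iff k = 0, else 0")
```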

## Linear Independence of Characters

If characters of a monoid are linearly independent, then they are pairwise distinct; but what about the converse? Dedekind answered this question affirmatively, but his approach was rather complicated: it relied on determinants. Artin later found a neat way to do it:

Theorem (Dedekind-Artin) Let $M$ be a monoid and $K$ a field. Let $\chi_1,\dots,\chi_n$ be distinct characters of $M$ in $K$. Then they are linearly independent over $K$.

Proof. Suppose this is false. Let $N$ be the smallest integer such that
$$a_1\chi_1+\cdots+a_N\chi_N=0$$
with not all $a_i$ equal to $0$, for distinct characters $\chi_1,\dots,\chi_N$ (minimality in fact forces all $a_i \ne 0$). Since $\chi_1 \ne \chi_2$, there is some $z \in M$ such that $\chi_1(z) \ne \chi_2(z)$. Yet still we have
$$a_1\chi_1(zx)+\cdots+a_N\chi_N(zx)=0 \quad\text{for all } x \in M.$$

Since the $\chi_i$ are characters, for all $x \in M$ we have
$$a_1\chi_1(z)\chi_1(x)+\cdots+a_N\chi_N(z)\chi_N(x)=0.$$

We now have a linear system
$$\begin{cases} a_1\chi_1(x)+a_2\chi_2(x)+\cdots+a_N\chi_N(x)=0, \\ a_1\chi_1(z)\chi_1(x)+a_2\chi_2(z)\chi_2(x)+\cdots+a_N\chi_N(z)\chi_N(x)=0. \end{cases}$$

If we perform Gaussian elimination once (divide the second equation by $\chi_1(z)$ and subtract the first), we see
$$a_2\left(\frac{\chi_2(z)}{\chi_1(z)}-1\right)\chi_2(x)+\cdots+a_N\left(\frac{\chi_N(z)}{\chi_1(z)}-1\right)\chi_N(x)=0.$$

But this is to say
$$b_2\chi_2+\cdots+b_N\chi_N=0, \qquad b_i=a_i\left(\frac{\chi_i(z)}{\chi_1(z)}-1\right).$$

Note by assumption $\frac{\chi_2(z)}{\chi_1(z)}-1 \ne 0$ (and the minimality of $N$ forces $a_2 \ne 0$), so we have found a non-trivial linear relation among the $N-1$ distinct characters $\chi_2,\dots,\chi_N$, contradicting the minimality of $N$. $\square$

As an application, we consider an $n$-variable equation:

Let $\alpha_1,\cdots,\alpha_n$ be distinct non-zero elements of a field $K$. If $a_1,\cdots,a_n$ are elements of $K$ such that for all integers $v \ge 0$ we have
$$a_1\alpha_1^v+\cdots+a_n\alpha_n^v=0,$$
then $a_i=0$ for all $i$.

Proof. Consider the $n$ distinct characters $\chi_i(v)=\alpha_i^v$ of the monoid $\mathbb{Z}_{\ge 0}$ into $K^\ast$, and apply the theorem above. $\square$
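Concretely, the relations for $v=0,\dots,n-1$ alone already form a Vandermonde system in the $a_i$, whose determinant $\prod_{i<j}(\alpha_j-\alpha_i)$ is non-zero for distinct $\alpha_i$. A minimal sketch over $\mathbb{Q}$ (the particular $\alpha_i$ are arbitrary illustrative choices):

```python
from fractions import Fraction

# Distinct non-zero elements of Q (arbitrary illustrative choices).
alphas = [Fraction(2), Fraction(3), Fraction(-1), Fraction(1, 2)]
n = len(alphas)

# Vandermonde matrix M[v][i] = alpha_i^v for v = 0, ..., n-1.
M = [[a ** v for a in alphas] for v in range(n)]

# Its determinant equals prod_{i<j} (alpha_j - alpha_i), which is non-zero
# for distinct alphas, so M a = 0 forces a = 0.
det = Fraction(1)
for i in range(n):
    for j in range(i + 1, n):
        det *= alphas[j] - alphas[i]
print(det != 0)  # True: only the trivial relation exists
```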

## Hilbert’s Theorem 90

The linear independence of characters gives us a good chance of studying the relation of the field extension and the Galois group.

Hilbert’s Theorem 90 (Modern Version) Let $K/k$ be a Galois extension with Galois group $G$, then $H^1(G,K^\ast)=1$ and $H^1(G,K)=0$. This is to say, the first cohomology group is trivial for both addition and multiplication.

It may look confusing but the classic version is about cyclic extensions ($K/k$ is cyclic if it is Galois and the Galois group is cyclic).

Hilbert’s Theorem 90 (Classic Version, Multiplicative Form) Let $K/k$ be cyclic of degree $n$ with Galois group $G$ generated by $\sigma$. Then
$$\{\beta \in K^\ast : N(\beta)=1\} = \frac{1}{\sigma}K^\ast,$$

where $\frac{1}{\sigma}A$ consists of all elements of the form $\alpha/\sigma(\alpha)$ with $\alpha \in A$, and $N(\beta)=\prod_{\tau \in G}\tau(\beta)$ is the norm of $\beta \in K$ over $k$.

This corresponds to the statement that $H^1(G,K^\ast)=1$. On the other hand,

Hilbert’s Theorem 90 (Classic Version, Additive Form) Let $K/k$ be cyclic of degree $n$ with Galois group $G$ generated by $\sigma$. Then
$$\{\beta \in K : \Tr(\beta)=0\} = (1-\sigma)K,$$

where $(1-\sigma)A$ consists of all elements of the form $(1-\sigma)(\alpha)=\alpha-\sigma(\alpha)$ with $\alpha \in A$, and $\Tr(\beta)=\sum_{\tau \in G}\tau(\beta)$ is the trace of $\beta \in K$ over $k$.

This corresponds to, of course, the statement that $H^1(G,K)=0$. Note this indeed asserts an exact sequence (exact at the middle $K$)
$$K \xrightarrow{\;1-\sigma\;} K \xrightarrow{\;\Tr\;} k.$$

Before we prove it, we recall what group cohomology is. Let $G$ be a group. We consider the category $G$-mod of left $G$-modules. The set of morphisms between two objects $A$ and $B$, for which we write $\operatorname{Hom}_G(A,B)$, consists of all $G$-module homomorphisms from $A$ to $B$. The cohomology groups of $G$ with coefficients in $A$ are the right derived functors of $\operatorname{Hom}_G(\mathbb{Z},-)$ applied to $A$:
$$H^n(G,A) = R^n\operatorname{Hom}_G(\mathbb{Z},-)(A) = \operatorname{Ext}^n_{\mathbb{Z}G}(\mathbb{Z},A).$$

It follows that $H^0(G,A) \cong \operatorname{Hom}_G(\mathbb{Z},A)=A^G=\{a \in A: ga=a \text{ for all } g \in G\}$, the submodule of invariants. In particular, if $G$ is trivial, then $\operatorname{Hom}_G(\mathbb{Z},-)$ is exact and therefore $H^n(G,A)=0$ whenever $n \ne 0$. We will see what happens when $G$ is the Galois group of a Galois extension. If the modern version is beyond your reach, you can refer to the classic version. As a side note, the modern version can also be done using Shapiro’s lemma.

### Proof of the Modern Version

Proof. Note $\alpha:G \to K^\ast$, written $\sigma \mapsto \alpha_\sigma$, is a $1$-cocycle if and only if $\alpha_{\sigma\tau}=\alpha_\sigma\sigma(\alpha_\tau)$ for all $\sigma,\tau \in G$. By Artin’s lemma (the linear independence of characters proved above), for each $1$-cocycle $\alpha$ the following map is non-trivial:
$$\Lambda=\sum_{\sigma \in G}\alpha_\sigma\,\sigma \;:\; K \to K.$$

Suppose $\gamma=\Lambda(\theta) \ne 0$. Then for any $\tau \in G$,
$$\tau(\gamma)=\sum_{\sigma \in G}\tau(\alpha_\sigma)\,\tau\sigma(\theta)=\alpha_\tau^{-1}\sum_{\sigma \in G}\alpha_{\tau\sigma}\,\tau\sigma(\theta)=\alpha_\tau^{-1}\gamma,$$

which is to say $\alpha_\tau = \gamma/\tau(\gamma)$. Replacing $\gamma$ with $\gamma^{-1}$ gives what we want: the cocycle coincides with a coboundary. So much for the multiplicative form.

For the additive form, take $\theta \in K \setminus \ker \Tr$. Given a $1$-cocycle $\alpha$ in the additive group $K$, we put
$$\beta = \frac{1}{\Tr(\theta)}\sum_{\tau \in G}\alpha_\tau\,\tau(\theta).$$

Since the cocycle satisfies $\alpha_{\sigma\tau}=\alpha_\sigma+\sigma(\alpha_\tau)$, i.e. $\sigma(\alpha_\tau)=\alpha_{\sigma\tau}-\alpha_\sigma$, we get
$$\sigma(\beta)=\frac{1}{\Tr(\theta)}\sum_{\tau \in G}(\alpha_{\sigma\tau}-\alpha_\sigma)\,\sigma\tau(\theta)=\beta-\alpha_\sigma,$$

which gives $\alpha_\sigma = \beta-\sigma(\beta)$. Replacing $\beta$ with $-\beta$ gives what we want. $\square$

### Proof of the Classic Version

Additive form. Pick any $\beta-\sigma\beta$; we see $\Tr(\beta-\sigma\beta)=\sum_{\tau \in G}\tau\beta-\sum_{\tau \in G}\tau\sigma\beta=0$, since $\tau\sigma$ runs over $G$ as $\tau$ does.

Conversely, assume $\Tr(\alpha)=0$. By Artin’s lemma the trace function is not trivial, hence there exists some $\theta \in K$ such that $\Tr(\theta)\ne 0$. We then take
$$\beta=\frac{1}{\Tr(\theta)}\left[\alpha\theta^\sigma+(\alpha+\sigma\alpha)\theta^{\sigma^2}+\cdots+(\alpha+\sigma\alpha+\cdots+\sigma^{n-2}\alpha)\theta^{\sigma^{n-1}}\right],$$

where for convenience we write $\sigma\theta=\theta^\sigma$. Therefore
$$\beta-\sigma\beta=\frac{1}{\Tr(\theta)}\,\alpha\left(\theta+\theta^\sigma+\cdots+\theta^{\sigma^{n-1}}\right)=\alpha,$$

because the other terms cancel (note $\sigma\alpha+\cdots+\sigma^{n-1}\alpha=\Tr(\alpha)-\alpha=-\alpha$). $\square$

Multiplicative form. This can be done in a quite similar setting. For any $\alpha=\beta/\sigma\beta$, we have
$$N(\alpha)=\prod_{i=0}^{n-1}\sigma^i\!\left(\frac{\beta}{\sigma\beta}\right)=\frac{\beta\,\sigma\beta\cdots\sigma^{n-1}\beta}{\sigma\beta\,\sigma^2\beta\cdots\sigma^{n}\beta}=1,$$
since $\sigma^n\beta=\beta$.

Conversely, assume $N(\alpha)=1$. By Artin’s lemma, the following function is not trivial:
$$\Lambda=\mathrm{id}+\alpha\,\sigma+\alpha\,\sigma(\alpha)\,\sigma^2+\cdots+\alpha\,\sigma(\alpha)\cdots\sigma^{n-2}(\alpha)\,\sigma^{n-1}.$$

Suppose now $\beta=\Lambda(\theta) \ne 0$. It follows that
$$\alpha\,\sigma(\beta)=\alpha\,\sigma\theta+\alpha\,\sigma(\alpha)\,\sigma^2\theta+\cdots+\alpha\,\sigma(\alpha)\cdots\sigma^{n-1}(\alpha)\,\sigma^n\theta=\beta,$$

since $\sigma^n\theta=\theta$ and $\alpha\,\sigma(\alpha)\cdots\sigma^{n-1}(\alpha)=N(\alpha)=1$; that is, $\alpha=\beta/\sigma(\beta)$, and this is exactly what we want. $\square$

## Applications

Consider the extension $\mathbb{Q}(i)/\mathbb{Q}$. The Galois group $G=\{1,\tau\}$ is cyclic, generated by the complex conjugation $\tau$. Pick any $a+bi$ with $N(a+bi)=a^2+b^2=1$, where $a,b \in \mathbb{Q}$; by Hilbert’s Theorem 90 there is some $r=s+ti \in \mathbb{Q}(i)$ such that
$$a+bi=\frac{r}{\tau r}=\frac{s+ti}{s-ti}=\frac{s^2-t^2}{s^2+t^2}+\frac{2st}{s^2+t^2}\,i.$$

If we put $(x,y,z)=(s^2-t^2,2st,s^2+t^2)$, we actually get a Pythagorean triple (if $s,t$ are fractions, we can multiply them by a common denominator so that they are integers). Conversely, for every Pythagorean triple $(x,y,z)$, we assign to it $\frac{x}{z}+\frac{y}{z}i \in \mathbb{Q}(i)$, an element of norm $1$. Through this we have found all solutions to $x^2+y^2=z^2$, i.e.

Theorem Integers $x,y,z$ satisfy the Diophantine equation $x^2+y^2=z^2$ if and only if $(x,y,z)$ is proportional to $(m^2-n^2,2mn,m^2+n^2)$ for some integers $m,n$.
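The parametrisation in the theorem can be checked mechanically; a minimal sketch (the function name is mine):

```python
def triple(m, n):
    # Hilbert 90 parametrisation: (m^2 - n^2, 2mn, m^2 + n^2)
    return (m * m - n * n, 2 * m * n, m * m + n * n)

# Every output satisfies the Diophantine equation x^2 + y^2 = z^2.
for m in range(2, 6):
    for n in range(1, m):
        x, y, z = triple(m, n)
        assert x * x + y * y == z * z
print(triple(2, 1))  # -> (3, 4, 5)
```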

This can be generalised to all Diophantine equations of the form $x^2+Axy+By^2=Cz^2$ for some nonzero constant $C$ and constants $A,B$ such that the discriminant $A^2-4B$ is square-free. You can find some discussion here.

The additive form is a good friend of “characteristic $p$” things. Artin-Schreier’s theorem is a good example of $p$-to-the-$p$.

Theorem (Artin-Schreier) Let $k$ be a field of characteristic $p$ and $K/k$ a cyclic extension of degree $p$. Then $K=k(\alpha)$, where $\alpha$ is a zero of an equation $X^p-X-a=0$ for some $a \in k$.

Proof. Note the Galois group $G$ of $K/k$ is cyclic of prime order and $\Tr(-1)=p\cdot(-1)=0$, so we are able to use the additive form. Let $\sigma$ be a generator of $G$; there exists some $\alpha \in K$ such that
$$\alpha-\sigma(\alpha)=-1, \quad\text{i.e.}\quad \sigma(\alpha)=\alpha+1.$$

Hence $\sigma(\sigma(\alpha))=\sigma(\alpha+1)=\alpha+1+1$, and by induction we get
$$\sigma^k(\alpha)=\alpha+k, \qquad k=0,1,\dots,p-1,$$

and $\alpha$ has $p$ distinct conjugates. Therefore $[k(\alpha):k] \ge p$. But in the meantime
$$[K:k(\alpha)]\,[k(\alpha):k]=[K:k]=p,$$

so we can only have $[K:k(\alpha)]=1$, which is to say $K=k(\alpha)$. In the meantime, since $(\alpha+1)^p=\alpha^p+1$ in characteristic $p$,
$$\sigma(\alpha^p-\alpha)=\sigma(\alpha)^p-\sigma(\alpha)=(\alpha+1)^p-(\alpha+1)=\alpha^p-\alpha,$$

and hence $\alpha^p - \alpha$ lies in the fixed field of $\sigma$, which happens to be $k$. Putting $a=\alpha^p-\alpha$, our proof is done. $\square$
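As a small "characteristic $p$" sanity check: over the prime field $\mathbb{F}_p$ itself, Fermat's little theorem gives $x^p-x=0$ for every $x \in \mathbb{F}_p$, so $X^p-X-a$ has a root in $\mathbb{F}_p$ exactly when $a=0$. A minimal sketch (the choice $p=7$ and the helper name are mine):

```python
p = 7

def has_root_in_Fp(a, p):
    # Does X^p - X - a have a root in F_p?
    return any((pow(x, p, p) - x - a) % p == 0 for x in range(p))

# By Fermat's little theorem, x^p = x in F_p, so X^p - X - a is
# constantly -a on F_p: a root exists iff a = 0 (mod p).
assert has_root_in_Fp(0, p)
assert not any(has_root_in_Fp(a, p) for a in range(1, p))
print("X^p - X - a has a root in F_p iff a = 0")
```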

For the case when the characteristic is $0$, please see here. There is a converse, which deserves a standalone blog post. It says that the polynomial $f(X)=X^p-X-a$ either has one root in $k$, in which case all its roots are in $k$; or it is irreducible, in which case if $\alpha$ is a root then $k(\alpha)$ is cyclic of degree $p$ over $k$. But I don’t know if many people are fans of “characteristic $p$” things.

## References

1. Serge Lang, Algebra, Revised Third Edition.
2. Charles A. Weibel, An Introduction to Homological Algebra.
3. Noam D. Elkies, Pythagorean triples and Hilbert’s Theorem 90. (https://abel.math.harvard.edu/~elkies/Misc/hilbert.pdf)
4. Walter Rudin, Fourier Analysis on Groups.

Desvl

2021-10-10

2023-07-08