### Category: Group Theory

## Cayley’s Theorem: Powerful Permutations

We’ve discussed before how powerful isomorphisms can be, when we find them. Finding isomorphisms “from scratch” can be quite a challenge. Thankfully, Arthur Cayley proved one of the classic theorems of modern algebra, one that can make our lives a bit easier. We’ll explore this theorem and note the power of groups of permutations.

## Isomorphisms: Making Mathematics More Convenient

Much of pure mathematics exists to simplify our world, even if it means entering an abstract realm (or creating one) to do it. The isomorphism is one of the most powerful tools for discovering that two groups which look completely unrelated on the surface are in fact structurally identical. In this post, we’ll look at what an isomorphism is.

## All the Same Opposites

###### Editor’s note: see this appendix for supporting proofs.

Fields are among the most convenient algebraic structures, preserving much of the arithmetic we know and love from familiar fields like the rationals $\mathbb{Q}$ and the real numbers $\mathbb{R}$.

Now, a set need not possess infinitely many elements to constitute a field (under the right binary operations). Indeed, Dr. Rachel Traylor recently introduced readers to finite field theory by way of GF(4), the field with four elements. In this post, I propose to discuss the simplest of all possible fields, namely $\text{GF}(2)$.

## What is $\text{GF}(2)$?

As Dr. Traylor explains in her post on GF(4), a field is built from a nonempty set $\mathbb{F}$ and two binary operations, typically denoted $+$ and $\cdot$ when no further specification is needed. Speaking abstractly, $\text{GF}(2)$ is the case when $\mathbb{F}$ is taken to be any set of two elements, say $\{a,b\}$, satisfying the operation tables below.

| $+$ | $a$ | $b$ |
|---|---|---|
| $a$ | $a$ | $b$ |
| $b$ | $b$ | $a$ |

| $\cdot$ | $a$ | $b$ |
|---|---|---|
| $a$ | $a$ | $a$ |
| $b$ | $a$ | $b$ |
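As a quick sanity check, the two tables can be encoded directly and queried; this is a sketch in Python, and the dictionary encoding (and the names `add` and `mul`) are my own choices, not part of the text.

```python
# The GF(2) operation tables, encoded as dictionaries keyed by pairs of elements.
add = {('a', 'a'): 'a', ('a', 'b'): 'b',
       ('b', 'a'): 'b', ('b', 'b'): 'a'}
mul = {('a', 'a'): 'a', ('a', 'b'): 'a',
       ('b', 'a'): 'a', ('b', 'b'): 'b'}

# 'a' acts as the additive identity and 'b' as the multiplicative identity:
assert all(add[('a', x)] == x for x in 'ab')
assert all(mul[('b', x)] == x for x in 'ab')
```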

## So, what makes $\text{GF}(2)$ simplest?

Briefly put, no field has fewer elements than $\text{GF}(2)$. Why is this so? The operations $+$ and $\cdot$ each require an identity element, and the field axioms require these identities to be distinct. As a result, any field must contain at least two elements. And, as it happens, two are enough to define a fully-fledged field.

As it is the most basic of fields, it might be expected that $\text{GF}(2)$ is only trivially interesting, or only appears in coverage of the subject on the theoretical front. (No judgment here; sometimes the simplest examples of structures are illustrative of very little. See the trivial topology or trivial algebra of sets on a set $X$.) However, $\text{GF}(2)$ is the mathematical representation of many dualities we encounter on a daily basis. We will take a look at some of the prominent cases.

## Even & Odd

Let’s denote by ${\bf e}$ and ${\bf o}$ arbitrarily chosen even and odd integers, respectively. In effect, we are defining two equivalence classes on the set of integers, $\mathbb{Z}$: the congruence classes modulo 2.

As a reminder, we say an integer $m$ is even to mean there exists an integer $k$ such that $m=2k$, and $m$ is odd to mean there exists an integer $j$ such that $m=2j+1$. Collect these in a set, $\{{\bf e},{\bf o}\}$, and consider this set under ordinary addition $+$ and multiplication $\cdot$.

These results are summarized in tables.  For instance, in the $+$ table below, we can think of ${\bf e}+{\bf o}$ as

$$(2k)+(2j+1)=2(k+j)+1\equiv{\bf o},$$

since $k+j$ is also an integer.

| $+$ | ${\bf e}$ | ${\bf o}$ |
|---|---|---|
| ${\bf e}$ | ${\bf e}$ | ${\bf o}$ |
| ${\bf o}$ | ${\bf o}$ | ${\bf e}$ |

| $\cdot$ | ${\bf e}$ | ${\bf o}$ |
|---|---|---|
| ${\bf e}$ | ${\bf e}$ | ${\bf e}$ |
| ${\bf o}$ | ${\bf e}$ | ${\bf o}$ |

From these tables, we see that ${\bf e}$ serves as the additive identity for the field, and ${\bf o}$ as the multiplicative identity.
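The tables can also be checked on concrete integers, since parity is all that matters. A small sketch (the helper `parity` and the particular representatives 4 and 7 are mine):

```python
# Parity arithmetic on concrete even/odd integers, checked against the tables.
def parity(n):
    return 'e' if n % 2 == 0 else 'o'

e, o = 4, 7  # arbitrary even and odd representatives

assert parity(e + o) == 'o'   # e + o = o
assert parity(o + o) == 'e'   # o + o = e
assert parity(e * o) == 'e'   # e * o = e
assert parity(o * o) == 'o'   # o * o = o
```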

## Binary

Even more readily encountered than even and odd is binary arithmetic. We have a series of posts here on the theory of coding, and all of it rests on $\{0,1\}$ when taken with the operations of $+_2$ (addition modulo 2) and $\cdot$ .

| $+_2$ | $0$ | $1$ |
|---|---|---|
| $0$ | $0$ | $1$ |
| $1$ | $1$ | $0$ |

| $\cdot$ | $0$ | $1$ |
|---|---|---|
| $0$ | $0$ | $0$ |
| $1$ | $0$ | $1$ |

The similarities to the tables for $(\{{\bf e},{\bf o}\},+,\cdot)$ are evident.  With binary, $0$ is the additive (modulo 2) identity, and $1$ is the multiplicative identity.  This seems to follow naturally.  After all, $0$ is an even integer, and $1$ is an odd integer, so these elements are easily believed to work alike.  With that in mind, I move to an example with a less immediate connection.
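In fact, addition modulo 2 on single bits is exactly the bitwise XOR operator that programming languages expose; a one-loop sketch in Python confirms the tables:

```python
# Addition modulo 2 coincides with bitwise XOR (^), and
# multiplication of bits coincides with bitwise AND (&).
for a in (0, 1):
    for b in (0, 1):
        assert (a + b) % 2 == a ^ b
        assert a * b == a & b
```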

## Truth & Falsehood

Now I want to consider the set $\{{\bf T},{\bf F}\}$ of truth values, true and false, respectively.

It is worthwhile to stop and think about which operations make sense for this set. Two are needed to construct a field. Just as even and odd integers may be combined by adding and multiplying, mathematical statements may be combined via disjunction (“or,” $\vee$), conjunction (“and,” $\wedge$), and implication (“if, then,” $\Rightarrow$). For this case, I am interested in conjunction and the special “exclusive or,” also called XOR, denoted by $\oplus$.

| $\oplus$ | ${\bf F}$ | ${\bf T}$ |
|---|---|---|
| ${\bf F}$ | ${\bf F}$ | ${\bf T}$ |
| ${\bf T}$ | ${\bf T}$ | ${\bf F}$ |

| $\wedge$ | ${\bf F}$ | ${\bf T}$ |
|---|---|---|
| ${\bf F}$ | ${\bf F}$ | ${\bf F}$ |
| ${\bf T}$ | ${\bf F}$ | ${\bf T}$ |
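Python’s booleans happen to reproduce these tables directly, with `^` playing the role of $\oplus$ and `&` the role of $\wedge$; a quick sketch:

```python
# XOR (^) and conjunction (&) on Python booleans match the tables above.
assert (False ^ True) is True    # F xor T = T
assert (True ^ True) is False    # T xor T = F
assert (True & True) is True     # T and T = T
assert (False & True) is False   # F and T = F
```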

## Opposite… in Precisely the Same Way

The only thing setting these examples apart is an exchange of symbols. Truly,

$a$, ${\bf e}$, $0$, and ${\bf F}$,

are interchangeable, just as are

$b$, ${\bf o}$, $1$, and ${\bf T}$.

What matters is that these individual elements behave in exactly the same way with respect to their operations.  In the language of algebra, it is said

$(\{a,b\},+,\cdot)$

$(\{{\bf e},{\bf o}\},+,\cdot)$

$(\{0,1\},+_2,\cdot)$, and

$(\{{\bf F},{\bf T}\},\oplus,\wedge)$

are isomorphic, that is, structurally equivalent.

Definition (Isomorphism of fields). We say two fields $(\mathbb{F},+,\times)$ and $(\mathbb{H},\boxplus,\boxtimes)$ are isomorphic to mean there exists a function $\varphi\mathrel{:}\mathbb{F}\to\mathbb{H}$ such that $\varphi$ is one-to-one, onto, and, for all $x,y\in\mathbb{F}$,

$$\varphi(x+y)=\varphi(x)\boxplus\varphi(y);\quad\varphi(x\times y)=\varphi(x)\boxtimes\varphi(y).$$

In the case of $\text{GF}(2)$, proving isomorphism amounts to simply defining the function that swaps the symbols as needed. For example, to show $(\{{\bf e},{\bf o}\},+,\cdot)$ and $(\{0,1\},+_2,\cdot)$ are isomorphic, define $\varphi\mathrel{:}\{{\bf e},{\bf o}\}\to\{0,1\}$ by putting $\varphi({\bf e})=0$ and $\varphi({\bf o})=1$.
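Because both fields are finite, the homomorphism conditions can be verified exhaustively. A sketch of that check (the tables follow the text; representing elements as the strings `'e'` and `'o'` is my own encoding):

```python
# Verify phi: ({e,o}, +, .) -> ({0,1}, +_2, .) is an isomorphism
# by checking both homomorphism equations on all four pairs.
phi = {'e': 0, 'o': 1}
add_eo = {('e', 'e'): 'e', ('e', 'o'): 'o', ('o', 'e'): 'o', ('o', 'o'): 'e'}
mul_eo = {('e', 'e'): 'e', ('e', 'o'): 'e', ('o', 'e'): 'e', ('o', 'o'): 'o'}

for x in ('e', 'o'):
    for y in ('e', 'o'):
        # phi(x + y) = phi(x) +_2 phi(y)
        assert phi[add_eo[(x, y)]] == (phi[x] + phi[y]) % 2
        # phi(x . y) = phi(x) . phi(y)
        assert phi[mul_eo[(x, y)]] == phi[x] * phi[y]
```

Since `phi` is visibly one-to-one and onto, passing these checks is exactly the definition of isomorphism given above.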

## Concluding Remarks

Mathematics in many respects is the art of extension and generalization. The goal is frequently to take an existing object and assume an aerial viewpoint to learn its structure and what makes it work. (This is often carried out by seeing how few pieces of the whole are really needed to accomplish those key characteristics.)

With the right perspective, even these sets of opposites can bear likeness.  I take comfort in that.

## Taking Things for Granted: Elementary Properties of Groups

We take a lot of things for granted: electricity, gas at the pump, and mediocre coffee at the office. Many concepts in basic algebra are also taken for granted, such as cancellation of terms, and commutativity. This post will revisit some basic algebra (think solving for $x$), but with some of those things we took for granted removed.

## Theory of Coding, Episode 2: Maximum-Likelihood Decoding

The introduction to coding theory in this post will now allow us to explore some more interesting topics in coding theory, courtesy of Pinter’s A Book of Abstract Algebra. We’ll introduce the notion of a code, information digits, and parity-check equations. Most communication channels are noisy to some extent, which means that a transmitted codeword may arrive with one or more errors. If the recipient of a codeword attempts to decode a message, he needs to be able to determine first whether there was an error in the code, and then correct it. This will lead us to a method called maximum-likelihood decoding.

## Group Theory, XOR, and Binary Codes: Introducing Coding Theory

Binary codes and bit-wise operations are fundamental in computer science. Whatever device you are running today works because of binary codes and the bit-wise operations AND, OR, NOT, and XOR. (A fun exercise: prove to yourself that each of these rules meets the definition of an operation. If you need a refresher, check out this post.) We can also look at binary codes and the bit-wise operations in a more algebraic way.

Let’s continue exploring abstract algebra via Pinter’s delightful algebra book. The previous post discussed operations on sets, and explored concatenation as an example. Here, we will build on this and look at a set $G$ combined with an operation $\ast$ that satisfies three axioms. This operation/set combination is a special algebraic object called a group, and has been studied extensively. We’ll look at a specific group to illustrate the concept: binary codes with the XOR operation. (Note: we also looked at modulo arithmetic on finite groups of integers here as another example of a group.)

## Some quick terminology and setup

We currently (until the advent of quantum or DNA computers, or something else entirely) transmit information by coding it into binary strings – strings of 1s and 0s. We call these binary words. 100111 is a binary word, as are 001, 1, and 10110. A general binary word can have any length we want.

When we send these binary words via some communication channel, we always run the risk of errors in transmission. This means we need to devise some way to detect and correct transmission errors in binary codes. One way to do this is the XOR operation.

Let $\mathbf{a} = a_{1}a_{2}\ldots a_{n}$ and $\mathbf{b} = b_{1}b_{2}\ldots b_{n}$ be binary words of length $n$, where $a_{i},b_{i} \in \{0,1\}$ for $i=1,\ldots,n$. We define the XOR operation, denoted by the symbol $\oplus$, on each bit. For two bits $a_{i}, b_{i}$ in a word,

$$a_{i}\oplus b_{i} = (a_{i} + b_{i}) \bmod 2$$

(Now would be a good time to review your modulo arithmetic if you’ve forgotten.)

Then for two words $\mathbf{a}$ and $\mathbf{b}$, the XOR operation is done bit-wise (component by component). That is,

$$\mathbf{a} \oplus \mathbf{b} = (a_{1}\oplus b_{1})(a_{2}\oplus b_{2})\ldots (a_{n}\oplus b_{n})$$

As a quick example,

$$110010 \oplus 110001 = 000011$$

Notice that the result of the XOR operation shows us the positions in which $\mathbf{a}$ and $\mathbf{b}$ differ. Another way to look at it: if $\mathbf{a}$ was transmitted and $\mathbf{b}$ was received, there were errors in the last two positions. We can call the result of XORing two binary words the error pattern.
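The bit-wise definition translates directly to a few lines of code. Here is a minimal sketch, representing words as strings of `'0'` and `'1'`; the function name `xor_words` is my own.

```python
def xor_words(a, b):
    """Bit-wise XOR of two binary words of equal length."""
    assert len(a) == len(b), "XOR is defined for words of the same length"
    return ''.join(str(int(x) ^ int(y)) for x, y in zip(a, b))

# The example from the text: the error pattern marks where the words differ.
print(xor_words('110010', '110001'))  # -> 000011
```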

## Showing Binary Words with the XOR operation is a group

Now we’ll look at the set of all binary words of length $n$, called $\mathbb{B}^{n}$, coupled with the operation XOR. We want to show that this set along with the XOR operation forms an algebraic structure called a group. A group is one of the most basic algebraic structures we can study. We will be exploring all sorts of things we can do and build with groups in future posts, so first we need to define a group.

### What’s a group?

From Pinter (1982)

A group is a set $G$ coupled with an operation $\ast$, denoted $\langle G, \ast \rangle$ that satisfies the following three axioms:

(G1: Associativity of the operation) The operation $\ast$ is associative. (See the previous post for a review of associativity.)

(G2: Existence of an identity element) There is an identity element in the set $G$, which we will call $e$, such that for every element $g \in G$, $e\ast g = g\ast e = g$.

(G3: Existence of an inverse for every element) For every element $g \in G$, there is a corresponding element $g^{-1} \in G$ such that $g\ast g^{-1} = g^{-1}\ast g = e$.

All three properties were discussed in the previous post. It’s important to note that a group is a set and an operation. If we change one, then we either have a different group, or we lose the group classification. Real numbers under addition are a group, as are nonzero real numbers under multiplication, but those are two different groups. (This will be important when we get to rings and fields.) Integers under addition are also a group, but a different one than real numbers under addition.

### Let’s prove $\langle\mathbb{B}^{n}, \oplus \rangle$ is a group.

Showing that a set and operation form a group is pretty algorithmic, in a sense. We just have to show that all three axioms are satisfied.

(G1): Associativity

This one will be a bit tedious. We have to show associativity for words of any length $n$. But fear not. Since XOR of words is done bit-wise, we can exploit that and first show associativity for binary words of length 1, then “scale it up,” if you will.

In this case, for words of length 1, we just have to brute-force it. We have to show that for any $a,b,c \in \mathbb{B}^{1}$, that

$$(a\oplus b) \oplus c = a \oplus (b \oplus c)$$

So,

\begin{aligned} 1\oplus (1\oplus 1) = 1 \oplus 0 = 1 &\text{ and } (1 \oplus 1) \oplus 1 = 0 \oplus 1 = 1\\ 1\oplus (1 \oplus 0) = 1\oplus 1 = 0 &\text{ and } (1\oplus 1) \oplus 0 = 0 \oplus 0 = 0\\ &\vdots\end{aligned}

Continue in this fashion until you have tried all combinations. (It does work out.)

Now that we have that this is true for words of length 1, we just use the definition of XOR operation on words of length 1 to “scale up” to words of length n. Since the operation is done component-wise, and it is associative on each component, it is associative on the whole word. We’ll show this formally now:

Let $\mathbf{a},\mathbf{b},\mathbf{c} \in \mathbb{B}^{n}$. So $\mathbf{a} = a_{1}a_{2}\ldots a_{n}$, $\mathbf{b} = b_{1}b_{2}\ldots b_{n}$, and $\mathbf{c} = c_{1}c_{2}\ldots c_{n}$. Then

\begin{aligned}\mathbf{a}\oplus (\mathbf{b} \oplus \mathbf{c}) &= a_{1}a_{2}\ldots a_{n}\oplus [(b_{1}\oplus c_{1})(b_{2}\oplus c_{2})\ldots (b_{n}\oplus c_{n})]\\ &= (a_{1} \oplus (b_{1} \oplus c_{1}))(a_{2} \oplus (b_{2} \oplus c_{2}))\ldots (a_{n} \oplus (b_{n} \oplus c_{n}))\\&= ((a_{1} \oplus b_{1})\oplus c_{1})((a_{2}\oplus b_{2})\oplus c_{2})\ldots ((a_{n}\oplus b_{n})\oplus c_{n})\\&= (\mathbf{a} \oplus \mathbf{b})\oplus \mathbf{c}\end{aligned}

That third equality holds because we already showed that XOR was bit-wise associative. The last equality just recalls what it means to XOR two binary words. With that, we have shown associativity of the XOR operation.
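For small $n$, the brute-force check can also be delegated to a machine. A sketch, again representing words as `'0'`/`'1'` strings with a helper `xor_words` of my own naming:

```python
from itertools import product

def xor_words(a, b):
    # Bit-wise XOR of two equal-length binary words.
    return ''.join(str(int(x) ^ int(y)) for x, y in zip(a, b))

# Exhaustively verify associativity for all words of length 3
# (8^3 = 512 triples; any small n works the same way).
words = [''.join(bits) for bits in product('01', repeat=3)]
for a, b, c in product(words, repeat=3):
    assert xor_words(xor_words(a, b), c) == xor_words(a, xor_words(b, c))
```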

(G2): Existence of an identity element

When we want to show that a group has an identity element, we must actually find a candidate and show it meets the criterion. Here is where, frustratingly sometimes, intuition and experience tend to play the biggest role. But, as my mother always told me when I was frustrated at my middle school math homework: “Make it look like something you’ve seen before.” XOR is bitwise addition, just with a twist (add, then mod out by 2). So let’s start by considering the identity element for addition: 0. We’re looking at binary words of length $n$, so a good candidate for our identity element $e$ would be a string of $n$ 0s. But does it fit?

For any word $\mathbf{a} = a_{1}a_{2}\ldots a_{n}$,

\begin{aligned}a_{1}a_{2}\ldots a_{n} \oplus 000\ldots 0 &= (a_{1}\oplus 0)(a_{2} \oplus 0)\ldots (a_{n} \oplus 0)\\&= a_{1}a_{2}\ldots a_{n}.\end{aligned}

You can check also that $000\ldots 0 \oplus \mathbf{a} = \mathbf{a}$, and thus our candidate is a match!

(G3): Existence of an inverse element for every element in the set

This one is a little bit trickier. We need to be able to find any generic binary word, and show that there is another binary word such that when we XOR them together, we get the sequence of all 0s. (Computer science friends are ahead on this one.)

Think back to how we looked at the XOR operation as a form of error checking. If we XORed two words together, there was a 1 in every position in which they differ, and a 0 in every position in which they were identical.

Therefore, we come to the interesting conclusion that every element is its own inverse! If you XOR an element with itself, you will get a sequence of 0s. With our error checking interpretation, this makes perfect sense. We know the communication line transmits perfectly if we can XOR the sent and received word and get all 0s every time.

Conclusion: every element is its own inverse, and every element in the group is, well, in the group, so we have satisfied the third axiom.
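Both the identity and the self-inverse property are easy to confirm in code. A sketch, reusing the string representation and an `xor_words` helper of my own naming:

```python
def xor_words(a, b):
    # Bit-wise XOR of two equal-length binary words.
    return ''.join(str(int(x) ^ int(y)) for x, y in zip(a, b))

a = '10110'
e = '0' * len(a)          # candidate identity: the all-zeros word

assert xor_words(a, e) == a and xor_words(e, a) == a   # identity axiom (G2)
assert xor_words(a, a) == e                            # every word is its own inverse (G3)
```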

Therefore, we have a group, fellow math travelers!

## Bonus round: showing that $\langle \mathbb{B}^{n}, \oplus \rangle$ is an abelian group

Notice that we are missing one property that all of you likely take for granted in regular arithmetic: being able to add and multiply in any order you like. That is, you take for granted that 2+3 = 3+2. This property is called commutativity, and groups with this bonus property are called abelian groups. Not all groups are abelian. Invertible square matrices under matrix multiplication come to mind. However, we will show that $\langle \mathbb{B}^{n}, \oplus \rangle$ is an abelian group.

This actually can be done quite simply by exploiting two things we already know: modulo arithmetic and bit-wise operations. We know that XOR is basically bit wise addition modulo 2. Addition is absolutely commutative, and modding out by 2 doesn’t change that, since we add in the regular ol’ way, then divide by 2 and take the remainder.

That means that XORing binary words of length 1 is commutative. Now, since XORing is done bit-wise, which we noted while proving associativity, we can use that same reasoning again (in fact, it looks almost exactly the same) to conclude that XOR is indeed a commutative operation, and thus we have an abelian group.
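As with associativity, commutativity can be verified exhaustively for a small word length; a sketch with the same assumed string representation and `xor_words` helper:

```python
from itertools import product

def xor_words(a, b):
    # Bit-wise XOR of two equal-length binary words.
    return ''.join(str(int(x) ^ int(y)) for x, y in zip(a, b))

# Check commutativity for every pair of words of length 3.
words = [''.join(bits) for bits in product('01', repeat=3)]
assert all(xor_words(a, b) == xor_words(b, a)
           for a, b in product(words, repeat=2))
```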

## Conclusion

We explored the definition of a group using a common operation and set from computer science. We needed this foundation to be able to study more. The next post in the coding theory and algebra series will take an in depth look at maximum likelihood decoding. This decoding process is a way to attempt to decode and correct errors in transmission when the communication channel is noisy (and most are, somewhat).

Abstract algebra is a powerful branch of mathematics that touches many more things than most of us realize. Now that we know binary words under the XOR operation is a group, we can start extending this and studying it as a structure, or skeleton, rather than focusing on the individual elements. The elements and operation are really just the drywall or stucco on a house. The algebraic structure is the skeleton of beams inside the house. We learn a lot more by studying the structure than the covering, which is why it is important to enter that abstract realm. We will uncover similarities in strange places, and applications we didn’t know were possible.