# A Partition by any Other Name

I promise I’m actually a probability theorist, despite many of my posts being algebraic in nature. Algebra, as we’ve seen in several other posts, elegantly generalizes many things in basic arithmetic, leading to highly lucrative applications in coding theory and data protection. Some definitions in mathematics may not have obvious “practical use”, but turn out to yield theorems and results so powerful we can use them to send image data cleanly from space.

This is a continuation of our exploration of the algebra behind many aspects of coding theory and data protection, and we will continue to develop the fundamentals necessary to appreciate the elegance and simplicity of code transmission and data protection. Here we will discuss the notion of a **coset** of a subgroup of a finite group G. For a discussion of the definition of a group, please check out this post. Recall that a group is a set G combined with an operation we will denote \cdot that satisfies the following axioms:

(1) **Closure under \cdot:** If a and b are in G, then their product (or sum)^{1} a\cdot b is in G.

(2) **Associativity:** Let a, b, and c be elements of G. Then we can group the operation however we like. That is, for any a, b, c \in G,
(a\cdot b)\cdot c = a\cdot (b\cdot c)

(3) **Existence of an identity element:** There must be some element e in G that, when multiplied on either the left or right by any element a in G, returns that element. Mathematically, there exists e \in G such that for any a \in G
e\cdot a = a\cdot e = a

(4) **Existence of an inverse:** For each element a in G, there is another element a^{-1} in G such that multiplying a^{-1} on either the left or right by a yields the identity element e. Formally, for each a \in G there exists a^{-1} \in G such that
a\cdot a^{-1} = a^{-1}\cdot a = e

Some quick examples of groups:

- (\mathbb{R}, +), all real numbers under regular addition. The identity element is 0, and the inverse of any real number a is -a.
- \mathbb{Z}_{n}, integers modulo n under modulo addition.
- n \times n matrices under matrix addition.

The number of elements in a group G can be either finite or infinite and is denoted |G|. In \mathbb{Z}_{3} = \{0,1,2\}, |\mathbb{Z}_{3}| = 3. |\mathbb{R}| is infinite^{2}. We’re going to stick with finite groups; that is, groups with a finite number of elements.
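For a small finite group, the four axioms above can be verified by brute force. Here is a minimal Python sketch (the function `is_group` is my own illustrative helper, not from any library) that checks closure, associativity, identity, and inverses for \mathbb{Z}_{3} under addition modulo 3:

```python
# Brute-force check of the four group axioms on a small finite set.
# `is_group` is an illustrative name, not a library function.

def is_group(elements, op):
    """Check closure, associativity, identity, and inverses for (elements, op)."""
    elems = list(elements)
    # (1) Closure: a.b must land back in the set
    if any(op(a, b) not in elems for a in elems for b in elems):
        return False
    # (2) Associativity: (a.b).c == a.(b.c) for all triples
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a in elems for b in elems for c in elems):
        return False
    # (3) Identity: some e with e.a == a.e == a for every a
    identity = next((e for e in elems
                     if all(op(e, a) == a == op(a, e) for a in elems)), None)
    if identity is None:
        return False
    # (4) Inverses: every a has some b with a.b == b.a == e
    return all(any(op(a, b) == identity == op(b, a) for b in elems)
               for a in elems)

print(is_group(range(3), lambda a, b: (a + b) % 3))  # True: Z_3 is a group
```

The triple loop in the associativity check makes this O(|G|^{3}), so it is only a toy for small groups, but it mirrors the axioms exactly.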

## Subgroups

A **subset** T of a set S is simply a set made from some of the elements in S. If S = \{0,1,2,3,4,5\}, then T = \{0,2,4\} is a subset of S, and we denote it T \subseteq S. The even integers are a subset of all integers. The rational numbers are a subset of all real numbers. The set of square matrices is a subset of all matrices.

If our set has an operation and is actually a group G, then a subset of this group that also fits the definition of a group is called a **subgroup**, and we write H \leq G. Not all subsets of a group are subgroups. To determine if a subset of a group is actually a subgroup, we just have to verify properties (1) and (4) above for the subset.

Let’s focus on finite groups for now. Obviously, we can have subgroups of infinite groups, but it’s easier to get used to these ideas on finite groups first. One possible way to construct a subgroup of a group is to pick an element a \in G. We can take powers of this element and collect them together in a nice bag to create a subgroup called H, or, for fun, denote it \langle a \rangle:

H = \langle a\rangle = \{a, a^{2} = a\cdot a, a^{3} = a\cdot a \cdot a,\ldots \}

Now, are we ever done taking powers of a? Actually, yes. In a finite group, there are only a finite number of elements total, so we have to stop taking powers at some point. Moreover, we’ll get to the point where some power of a actually gives us the identity element e, and then any further powers just start us over. Let’s take an explicit example.

Suppose we take \mathbb{Z}_{6} = \{0,1,2,3,4,5\} under modulo addition. Pick 2. Now let’s see what happens when we create a subgroup from powers of 2 under modulo 6 addition.

First, we just have 2. Then 2^{2} = (2+2) \bmod 6 = 4.^{3} Next, 2^{3} = (2+2+2) \bmod 6 = 6 \bmod 6 \equiv 0. Hey look! Our identity element. That means that 2^{4} = 2^{3}\cdot 2 = 0 + 2 = 2, and we’ve started over. So there are only 3 distinct powers of 2 under modulo 6 addition: H = \langle 2\rangle = \{0,2,4\}. This is the **subgroup generated by 2**, which is why we use the notation \langle 2\rangle: 2 and its powers under this group operation construct the subgroup.
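We can watch this happen in code. The sketch below (the helper name `generated_subgroup` is illustrative) builds \langle a \rangle in \mathbb{Z}_{n} by repeatedly applying the operation, addition mod n, until the powers start repeating:

```python
# Build the subgroup <a> of Z_n under addition mod n by taking "powers" of a.
# `generated_subgroup` is an illustrative name, not a library function.

def generated_subgroup(a, n):
    """Return the subgroup <a> of Z_n (addition mod n), sorted."""
    powers = []
    x = a % n
    while x not in powers:      # stop once the powers start repeating
        powers.append(x)
        x = (x + a) % n         # the "next power": a^{k+1} = a^{k} . a
    return sorted(powers)

print(generated_subgroup(2, 6))  # [0, 2, 4]
```

Note that `len(generated_subgroup(2, 6))` is 3, which is exactly the order of 2 discussed next.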

We call |\langle a \rangle | the **order** of a. Put another way, the order of an element in a group is the smallest positive power required to yield the identity element. So, under modulo 6 addition, 2^{3} \equiv 0, so the order of 2 is 3, which is the size of the subgroup generated by 2.

## Coset Decomposition using Subgroups

We can take a subgroup H of a *finite group* G and partition G **relative to that subgroup** in a nice, methodical way^{4}. Let’s take the subgroup generated by some element h \in G. That is, let our subgroup H = \langle h \rangle = \{h_{1}, h_{2},\ldots,h_{c} = e\}, where c is the order of h. We know the order of the element must be finite, because the group itself is finite and closed under its operation. Now let’s construct a pretty array:

First, we’ll put the elements of our subgroup H on the first row:

\begin{array}{ccccc}h_{1}&h_{2}&h_{3}&\ldots&h_{c}\end{array}

Now, pick any g \in G you want that *is not* in the subgroup H. Let’s name it g_{2}. Put it at the start of the second row; each element of the second row is formed by multiplying the column header h_{i} on the left by this g_{2}, giving g_{2}\cdot h_{i}:

\begin{array}{ccccc}h_{1}&h_{2}&h_{3}&\ldots&h_{c}\\g_{2}&g_{2}\cdot h_{2}& g_{2} \cdot h_{3}&\ldots&g_{2} \cdot h_{c}\end{array}

That was easy enough. Now pick another g_{3}\in G that hasn’t shown up in the first two rows. Then we’ll create the third row in the same way as we created the second:

\begin{array}{ccccc}h_{1}&h_{2}&h_{3}&\ldots&h_{c}\\g_{2}&g_{2}\cdot h_{2}& g_{2} \cdot h_{3}&\ldots&g_{2} \cdot h_{c}\\g_{3}&g_{3}\cdot h_{2}& g_{3} \cdot h_{3}&\ldots&g_{3} \cdot h_{c}\end{array}

We may continue making the array in this way until all of our elements are used up. Then we end up with the elements of our group arranged nicely into an m \times c array:

\begin{array}{ccccc}h_{1}&h_{2}&h_{3}&\ldots&h_{c}\\g_{2}&g_{2}\cdot h_{2}& g_{2} \cdot h_{3}&\ldots&g_{2} \cdot h_{c}\\g_{3}&g_{3}\cdot h_{2}& g_{3} \cdot h_{3}&\ldots&g_{3} \cdot h_{c}\\\vdots&&&&\\g_{m}&g_{m}\cdot h_{2}& g_{m} \cdot h_{3}&\ldots&g_{m} \cdot h_{c}\end{array}

Each row in this array forms what we call a **left coset of H in G**: you take an element of the group and multiply each element of the subgroup on the left by that element. Notice that for this finite group, we have m left cosets. If we had multiplied those g_{j}’s on the right, we would call the rows **right cosets**. If the operation is commutative (a\cdot b = b\cdot a), then left and right cosets are the same thing.

We can prove that every element in a finite group shows up in this array, and that no element appears more than once in the array^{5}. What we’ve actually done is **partitioned** the finite group into these rows using our subgroup H. This partitioning is called a **coset decomposition**.
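The row-by-row construction above is easy to mimic in code. This sketch (helper name illustrative) partitions \mathbb{Z}_{n} into left cosets of a subgroup H, picking each new row leader g from the elements not yet used, just as we did in the array:

```python
# Build the coset-decomposition array for Z_n with subgroup H (addition mod n).
# Each row is a left coset g + H. `coset_decomposition` is an illustrative name.

def coset_decomposition(n, H):
    """Partition Z_n into left cosets of the subgroup H; returns rows."""
    seen = set()
    rows = []
    for g in range(n):
        if g not in seen:                       # g hasn't appeared in any row yet
            coset = [(g + h) % n for h in sorted(H)]
            rows.append(coset)
            seen.update(coset)
    return rows

for row in coset_decomposition(6, {0, 2, 4}):
    print(row)
# [0, 2, 4]
# [1, 3, 5]
```

With H = \langle 2\rangle = \{0,2,4\} in \mathbb{Z}_{6}, the two rows are exactly the partition: every element of the group shows up in exactly one row.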

(*Remark. In fact, the cosets of a subgroup of a finite group are the equivalence classes of an equivalence relation on the group. This very elegant fact will play a large role in coding theory syndromes in a future post.*)

### Lagrange’s Theorem

For finite groups, this coset decomposition yields Lagrange’s theorem, a particularly beautiful theorem that relates the size of a subgroup, the number of cosets of that subgroup, and the size of the group. Because mathematicians hate writing more than we have to, let’s write the number of left (or right) cosets of H in G as [G:H].

**Lagrange’s Theorem**

Suppose |G| is finite, and H \leq G^{6}. Then
|G| = [G:H]\cdot |H|

The number of cosets with respect to H times the size of H gives us the size of the group. Take a look back at the array we created: it’s a visual illustration of this theorem. The number of rows gives us the number of cosets, and the number of columns is the size of our subgroup H. Every group element appeared in the array exactly once, so the number of cosets times the size of the subgroup gives us the size of the original group G.
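We can also check Lagrange’s theorem numerically. The sketch below (helper names illustrative, as before) counts the cosets of each cyclic subgroup \langle a \rangle of \mathbb{Z}_{12} and verifies that [G:H] \cdot |H| = |G| in every case:

```python
# Numeric check of Lagrange's theorem [G:H] * |H| = |G| for G = Z_12.
# Helper names are illustrative, not from any library.

def generated_subgroup(a, n):
    """The subgroup <a> of Z_n under addition mod n."""
    powers, x = [], a % n
    while x not in powers:
        powers.append(x)
        x = (x + a) % n
    return set(powers)

def num_cosets(n, H):
    """Count the rows of the coset array: [G:H] for G = Z_n."""
    seen, count = set(), 0
    for g in range(n):
        if g not in seen:                       # g starts a new row (coset)
            count += 1
            seen.update((g + h) % n for h in H)
    return count

n = 12
for a in range(n):
    H = generated_subgroup(a, n)
    assert num_cosets(n, H) * len(H) == n       # [G:H] * |H| = |G|
print("Lagrange's theorem verified for every <a> in Z_12")
```

Notice that the assertion never fails: |H| always divides 12, exactly as the theorem promises.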

## Conclusion

Cosets exist for infinite groups as well, though they cannot be constructed quite the same way we did above, since we would never “finish” filling up rows and columns. For those, we just create them by multiplying some element g \in G that isn’t in a subgroup H (which may not be finite anymore) on the left or right by all elements in the subgroup. We write a general left coset as

gH = \{g\cdot h : h \in H\}

This is the product of that g with all elements of the subgroup H. If the group is infinite, we don’t get the tidy finite array we had before, and Lagrange’s Theorem no longer applies.

Sticking with finite groups can yield some extremely powerful uses, though. We’ll need all of this (and some linear algebra) when we discuss how to store and decode a binary code that may be transmitted over a noisy channel. The fact that the coset decomposition yields a set of equivalence classes will allow us to drastically cut the number of bits we need to store to fully represent a binary code and its parity-check equations. This will allow us to use very large block lengths for code transmission and data protection, as we will see in future posts.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

#### Footnotes

- Remember that operations aren’t just the typical addition or multiplication you use every day. We use addition and multiplication as general operations, and the product is just the name for the result of applying the operation to two elements.
- There are different kinds of infinite, and different sizes of infinity. Cardinality of infinite sets gets weird, and we’ll explore this later.
- Remember that the operation is addition, so 2 “operation” 2 is 2+2 modulo 6
- This only works for finite groups
- I’m not going to prove this here. This proof is done by contradiction, and is useful and necessary, but not particularly illuminating to this discussion
- Remember, that’s our notation for “H is a subgroup of G”