# Products

## How products are built

The idea of the product may be traced to the image of a stack, which is a simple arrangement of multiple copies of $X$:

More complex outcomes result from attaching to every point of $X$ a copy of $Y$:

As an example from linear algebra, what does the following identity mean?
$${\bf R} \times {\bf R} = {\bf R}^2 ?$$
We can think of it as if a copy of the $y$-axis is attached to every point on the $x$-axis. Or, we can think in terms of *products of sets*:
$$x\in{\bf R}, y\in{\bf R} \Rightarrow (x,y)\in{\bf R} \times {\bf R}.$$
Generally, for any two sets $X$ and $Y$, their product set is defined as the set of ordered pairs taken from $X$ and $Y$:
$$X \times Y := \{(x,y): x\in X, y\in Y\}.$$
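This set-level definition is easy to state in code. Below is a minimal Python sketch (the sets are made up for illustration): it builds the product set literally from the definition and compares it with the standard library version.

```python
from itertools import product

X = {0, 1, 2}
Y = {'a', 'b'}

# The definition, literally: all ordered pairs (x, y).
XxY = {(x, y) for x in X for y in Y}

print(XxY == set(product(X, Y)))    # True: agrees with itertools
print(len(XxY) == len(X) * len(Y))  # True: |X x Y| = |X| * |Y|
```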

Now, it is important to keep in mind that these three sets are just *bags of points*. How do these points form something tangible? Before we provide the definition of the topology, we consider a few examples of visualization of products.

**Example (square).** Let
$$[0,1] \times [0,1] := {\bf I} \times {\bf I}= {\bf I}^2$$

You can see how a copy of $Y={\bf I} $ is attached to every point of $X = {\bf I}$, and vice versa.

$\square$

The building blocks here are just subsets of ${\bf R}$ and the construction simply follows that of ${\bf R} \times {\bf R}$.

**Exercise.** Provide a similar sketch for the cube:
$${\bf I} \times {\bf I} \times {\bf I} = {\bf I}^3 .$$

**Example (cylinder).** Consider
$${\bf S}^1 \times {\bf I}. $$

To build the cylinder this way, we place the circle ${\bf S}^1$ on the plane ${\bf R}^2$ and then attach a copy of $[0,1]$, vertically, to each of its points.

$\square$

In all of these cases, the product fits into ${\bf R}^2$ or ${\bf R}^3$ and is easy to visualize. What if both sets are in ${\bf R}^2$?

**Example (torus).** A similar procedure for the torus:
$${\bf S}^1 \times {\bf S}^1 = {\bf T}^2$$
is impossible. As both copies of ${\bf S}^1$ lie in ${\bf R}^2$, the product would lie in ${\bf R}^4$. One, however, might think of small circles attached, vertically, to each point of the large circle on the plane.

$\square$

**Example (thickening).** First, let's observe that we can understand the product
$${\bf S}^1 \times {\bf I} $$
not as a cylinder but as a ring (annulus), if we attach the copies of $Y$ in a different manner:

The idea is that the product of a space with the segment ${\bf I}$ means “thickening” of the space. For example, the product $${\bf S}^2 \times {\bf I} $$ is a thickened sphere:

$\square$

We have followed the rule:

- attach a copy of $Y$ to every point of $X$ (or vice versa).

Now, the ways these copies are attached aren't identical! We do see how this gluing is changing as we move in $X$ from point to point. The change, however, is *continuous*. Is continuity enough to pinpoint what the topology of the product is? The example below shows that the answer is No.

**Example.** Just as with the cylinder, we are attaching segments to the circle but with a gradual turn. The result is, of course, the Mobius band.

$\square$

**Exercise.** Is the above rule violated in this construction?

**Example (surgery).** Recall that the connected sum takes two surfaces and attaches them to each other in two steps:

- punch a hole in either, and then
- glue the edges of the hole to each other.

There is also a $1$-dimensional analogue:

Alternatively, we can look at this construction as one applied to a single manifold, either made of two pieces or not. Now, relying on homotopy equivalence, one can see this modification of the manifold as a replacement of its part:

The construction is called the *surgery*. For surfaces, we replace two disks with a cylinder. Note that together these three surfaces form the boundary of a solid cylinder, ${\bf B}^2 \times {\bf I}$. Then we can see the surgery as a replacement of a certain product with another product
$$\partial {\bf I} \times {\bf B}^2 \leftrightarrow {\bf I} \times \partial {\bf B}^2 .$$
Such an operation is possible because the boundaries of these two sets are homeomorphic.

$\square$

In an $(n+m-1)$-manifold, the surgery is this replacement: $$\partial {\bf B}^m \times {\bf B}^n \leftrightarrow {\bf B}^m \times \partial {\bf B}^n .$$

**Exercise.** Provide illustrations for surgeries with $n+m \le 4$.

## Products of spaces

What is the meaning of this identity: $${\bf R} \times {\bf R} = {\bf R}^2 ?$$

In linear algebra, the question arises because vector operations make sense with or without coordinates. One then needs to demonstrate how we can define the algebraic operations on the product set ${\bf R} \times {\bf R}$ in terms of the operations on either copy of ${\bf R}$, so that we have an isomorphism: $${\bf R} \times {\bf R} \cong {\bf R}^2 .$$ (The answer is easy: $(a,b) + (a',b') = (a + a',b + b'),\ t(a,b) = (ta,tb).$)

Following this lead, we would like to demonstrate how we can define the topology on the product set ${\bf R} \times {\bf R}$ in terms of the topology on either copy of ${\bf R}$, so that we have a homeomorphism: $${\bf R} \times {\bf R} \approx {\bf R}^2 .$$

The question is: what is the relation between the topology of the $xy$-plane and the topologies of the $x$- and $y$-axes? The question appears indirectly in elementary calculus:

- if $f(\cdot,\cdot)$ is continuous, then $f(\cdot,b)$ and $f(a,\cdot)$ are continuous for each $a,b$ (the converse, however, fails).

However, this isn't the definition of continuity of a function of two variables (that would make it dependent on the choice of the Cartesian system) but rather a theorem. Its proof reduces to the question of convergence:

- $(x_n,y_n) \to (a,b)$ iff $x_n\to a$ and $y_n\to b$.

But, the convergence in ${\bf R}^2$ is based on the topology generated by disks, while that of ${\bf R}$ is based on intervals. The solution was demonstrated previously: the Euclidean topology of the plane coincides with the topology generated by *rectangles*:

And those rectangles are simply products of the intervals that come from the two copies of ${\bf R}$. Taking all pairwise products of intervals from the first copy of ${\bf R}$ and all from the second copy gives us the right topology on ${\bf R}^2$.

This analysis motivates the definition.

**Definition.** Given two topological spaces $X$ and $Y$, the *product* $X \times Y$ of $X$ and $Y$ is the topological space defined on the product set $X \times Y$ with the following basis:
$$\{U \times V : U \text{ open in } X,\ V \text{ open in } Y \}.$$
The basis generates a topology called the *product topology*.

Pairwise products of the standard bases of the spaces we saw in the last subsection are shown below:

Now, as always, is this well-defined? In other words, is the collection defined this way always a basis?

**Theorem.** Given spaces $X$ and $Y$ with bases $\tau$ and $\sigma$ respectively, the collection
$$\gamma := \{V \times W : V \in \tau, W \in \sigma \}$$
is a basis on $X \times Y$.

**Proof.** We need to verify the two axioms of basis.

For (B1), consider $$\begin{array}{llllll} \bigcup \gamma &= \bigcup \{V \times W : V \in \tau,\ W \in \sigma \} \\ &= \left( \bigcup \{V : V \in \tau \} \right) \times \left( \bigcup \{W : W \in \sigma \} \right) \\ &= X \times Y, \end{array}$$ since $\tau$ and $\sigma$ satisfy (B1).

For (B2), suppose we have two neighborhoods $A,A'$ in $X \times Y$ and a point $u$ that belongs to their intersection: $$\begin{array}{llll} A &= V \times W, & V \in \tau, W \in \sigma,\\ A' &= V' \times W', & V' \in \tau , W' \in \sigma,\\ u &= (x,y) \in A \cap A'. \end{array}$$ Then the construction follows this outline:

The last assumption implies that $$x \in V \cap V', \ y \in W \cap W'.$$ Then, since $\tau$ and $\sigma$ satisfy (B2), we conclude:

- there is $Q \in \tau$ such that $x \in Q \subset V \cap V'$, and
- there is $P \in \sigma$ such that $y \in P \subset W \cap W'$.

Therefore, $$u = (x,y) \in Q \times P \subset (V \times W) \cap (V' \times W') = A \cap A'. $$ $\blacksquare$
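The verification can also be mirrored computationally on finite spaces. Here is a Python sketch (the spaces and bases below are invented for illustration) that forms the pairwise products and checks the two axioms of basis directly:

```python
from itertools import product

def product_basis(basis_X, basis_Y):
    """Pairwise products of basis elements, each as a set of pairs."""
    return [set(product(U, V)) for U in basis_X for V in basis_Y]

def is_basis(basis, space):
    # (B1): the union of all basis elements is the whole space.
    b1 = set().union(*basis) == space
    # (B2): for any two elements and any point of their intersection,
    # some basis element contains the point and fits inside the intersection.
    b2 = all(any(p in Q and Q <= A & B for Q in basis)
             for A in basis for B in basis for p in A & B)
    return b1 and b2

X, Y = {1, 2}, {'a', 'b'}
bX = [{1}, {2}, {1, 2}]       # a basis on X
bY = [{'a'}, {'a', 'b'}]      # a basis on Y
gamma = product_basis(bX, bY)
print(is_basis(gamma, set(product(X, Y))))  # True
```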

It is crucial to demonstrate that the product topology matches the implied topology of the spaces we built in the last subsection.

**Exercises.** Prove the following homeomorphisms:

- $n$-point set $\times m$-point set $\approx$ $nm$-point set;
- ${\bf R} \times \{x\} \approx {\bf R}$;
- ${\bf R}^n \times {\bf R}^m \approx {\bf R}^{n+m}$;
- ${\bf I} \times {\bf I}\approx {\bf I}^2 =$ square;
- ${\bf I}^n \times {\bf I}^m \approx {\bf I}^{n+m}$;
- ${\bf S}^1 \times {\bf I} \approx$ the cylinder;
- ${\bf S}^1 \times {\bf S}^1 \approx {\bf T}^2 =$ the torus.

**Exercise.** Prove that the following is equivalent to the original definition:

- given two topological spaces $X$ and $Y$ with bases $\sigma$ and $\tau$ respectively, the product $X \times Y$ is a topological space with the following basis:

$$\{U \times V : U \in \sigma ,\ V \in \tau \}.$$

**Exercise.** Prove that the metric $d$ of a metric space $(X,d)$ is continuous as a function $d:X\times X\to {\bf R}$ on the product space.

**Exercise.** Prove:
$$X \times Y \approx Y \times X.$$

**Exercise.** Prove that the product of an $n$-manifold and an $m$-manifold is an $(n+m)$-manifold.

## Properties

Many topological properties are preserved under products.

**Theorem.** If both $X$ and $Y$ are Hausdorff then so is $X \times Y$.

**Proof.** Suppose we have two distinct points $(x,y)$ and $(u,v)$ and we need to separate them by two disjoint neighborhoods.

There are two possibilities.

- (1) If $x$ and $u$ are distinct, they can be separated by $U$ and $V$ in $X$.
- (2) If $y$ and $v$ are distinct, they can be separated by $U'$ and $V'$ in $Y$.

Then:

- if both (1) and (2) hold, $(x,y)$ and $(u,v)$ are separated by $U \times U'$ and $V \times V'$;
- if it's (1) but not (2), they are separated by $U \times Y$ and $V \times Y$;
- if it's (2) but not (1), they are separated by $X \times U'$ and $X \times V'$.

$\blacksquare$

**Theorem.** If both $X$ and $Y$ are path-connected then so is $X \times Y$.

**Proof.** Suppose we have two points $(u,v)$ and $(x,y)$ in $X \times Y$; we need to find a path from one to the other. From the path-connectedness of $X$ and $Y$, it follows that there are continuous functions

- $f : [0,1] \to X$ with $f(0) = u, f(1) = x$, and
- $g : [0,1] \to Y$ with $g(0) = v, g(1) = y.$

Then the function

- $q : [0,1] \to X \times Y$ defined by $q(t) := (f(t), g(t))$

gives us the path.

$\blacksquare$

**Exercise.** Prove that $q$ is continuous.

More general is the following.

**Definition.** Given two functions
$$f:Z\to X,\ g:Z\to Y,$$
the *product function* (or map)
$$f \times g: Z\to X\times Y$$
is given by
$$(f\times g)(z):=(f(z),g(z)).$$
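As a quick illustration, here is a Python sketch of this definition (the choice of $\cos$ and $\sin$ is just an example): their product is a path winding around the unit circle, exactly as in the path-connectedness proof.

```python
import math

def product_map(f, g):
    """The product of f: Z -> X and g: Z -> Y, sending z to (f(z), g(z))."""
    return lambda z: (f(z), g(z))

q = product_map(math.cos, math.sin)  # a path tracing the unit circle
x, y = q(0.0)
print((x, y))        # (1.0, 0.0)
print(x**2 + y**2)   # 1.0: the values land on the circle
```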

**Theorem.** If both $f,g$ are continuous then so is their product $f\times g$.

**Exercise.** Prove the theorem.

**Exercise.** Even more general is the product of maps $f:A\to X,\ g:B\to Y$. Define it, show that it includes the last definition, and prove its continuity.

Recall that the graph of a map $f:X\to Y$ is a subset of $X\times Y$: $$\operatorname{Graph}(f):=\{(x,y)\in X\times Y:y=f(x)\}.$$

**Exercise.** Prove that, for a continuous map $f$,
$$\operatorname{Graph}(f)\approx X.$$

In particular, for the identity map $\operatorname{Id}:X\to X$, the graph coincides with the *diagonal*:
$$\operatorname{Graph}(\operatorname{Id})=\operatorname{D}_X:=\{(x,y)\in X\times X:\ y=x\}.$$

Next, is compactness preserved under products?

**Exercise.** Prove that the answer is Yes for compact subsets $X,Y$ of Euclidean spaces. Hint: you'll have to prove first that the product of two closed subsets is closed.

At least we know that $X \times Y$ is “made of” compact spaces:

Indeed, $$X \times Y = \bigcup _{y\in Y} X_y,$$ where, for each $y\in Y$, $$X_y:=X\times \{y\} \subset X \times Y$$ is homeomorphic to $X$, which is compact.

**Exercise.** Show that this doesn't imply that $X\times Y$ is compact, unless $Y$ is finite.

We start over. For the general case, we will try to apply the same approach as above: split the problem into those for $X$ and $Y$, solve both separately, and combine the solutions to form a solution for the product. This time, the problem is that of finding a finite subcover of an open cover. Unfortunately, the process of creating a basis of $X\times Y$ from those of $X$ and $Y$, as shown above, cannot be reversed. For example, the intervals in ${\bf R}$ and the disks in ${\bf R}^2$ produce cylinders in ${\bf R}^3$, not balls. So, for the plan to work, we need to show that compactness holds even if we only deal with open covers of a particular kind.

**Theorem.** A topological space $X$ is compact if and only if there is a basis $\beta$ of its topology so that every cover of $X$ by members of $\beta$ has a finite subcover.

**Exercise.** Prove the theorem.

We start over. Suppose $\beta _X$ and $\beta _Y$ are bases of $X$ and $Y$, respectively. According to the last theorem, we only need to prove that every cover of $X\times Y$ by members of the basis $$\beta _{X\times Y}:= \beta _X\times \beta _Y= \{U\times V:\ U\in \beta_X,\ V\in \beta_Y \}$$ has a finite subcover. Suppose $\gamma$ is such a cover. Since every one of its elements is a product, we have $$\gamma = \{U\times V:\ U\in \gamma_X\subset \beta_X,\ V\in \gamma_Y\subset\beta_Y \},$$ for some covers $\gamma_X,\gamma_Y$. Now we find a finite subcover in either of them, say, $\gamma'_X,\gamma'_Y$, and let $$\gamma':=\gamma'_X \times \gamma'_Y,$$ which is a finite subcover of $\gamma$.

**Exercise.** Find a flaw in this argument.

We start over.

**Theorem.** If both $X$ and $Y$ are compact then so is $X \times Y$.

**Proof.** The idea is to use the compactness of every $X_b,\ b\in Y,$ to come up with an open neighborhood $V_b\subset Y$ of $b$, and then use the compactness of $Y$.

The first stage is illustrated below:

Suppose $\beta _X$ and $\beta _Y$ are bases of $X$ and $Y$, respectively. Suppose $\gamma$ is a cover of $X\times Y$ the elements of which are products of the elements of the two bases: $$\gamma = \{W=U\times V\},\ U\in \beta_X,V\in \beta_Y.$$ Choose any $b \in Y$. Then $\gamma$ is a cover of $X_b$. Since $X_b$ is compact, there is a finite subcover $\gamma_b$ of $\gamma$, and we may assume that every element of $\gamma_b$ intersects $X_b$. We then let $$V_b := \bigcap \{V:\ U\times V \in \gamma_b\}.$$ As a finite intersection of open sets containing $b$, $V_b$ is open.

In the second stage, we first consider an open cover of $Y$: $$\alpha := \{V_b:b\in Y\}.$$ Then, since $Y$ is compact, we can find its finite subcover: $$\alpha ' = \{V_b:b\in F\},$$ where $F$ is some finite subset of $Y$. Finally, we set: $$\gamma':=\{U \times V \in \gamma :\ U \times V \in \gamma_b, b\in F \}.$$ $\blacksquare$

**Exercise.** Prove that $\gamma'$ is a finite subcover of $\gamma$. Hint:

Further, one can easily define and study products of finitely many spaces but products of an infinite number of spaces are also possible.

## Projections

Just as the relative topology comes with the inclusion map and the quotient topology comes with the identification map, the product topology also comes with a new map - the projection.

Let's start with the simple projection of the $xy$-plane on the $x$-axis, $$p: {\bf R}^2 \to {\bf R}.$$ It is given by $p(x,y) = x$:

Observe this:

- for each $b$, the function $p(x,b) = x$ is continuous as the identity;
- for each $a$, the function $p(a,y) = a$ is continuous as a constant function.

Therefore, as we know from calculus, $p$ is continuous.

More generally, if we have the product of two topological spaces we can define the projection in the same manner.

**Definition.** Suppose $X,Y$ are two topological spaces. The *projections*
$$p_X : X \times Y \to X ,\ p_Y : X \times Y \to Y,$$
of $X\times Y$ on $X$ and $Y$ respectively are defined by:
$$p_X(x,y) := x,\ p_Y(x,y) := y.$$

**Theorem.** The projections are continuous.

**Proof.** Suppose $V$ is open in $X$. Then $p_X^{-1}(V) = V \times Y$, which is open in $X \times Y$ in the product topology, since $Y$ is open in $Y$. $\blacksquare$

**Exercise.** What is the linear algebra analog of this statement?

The projection of the cylinder on the circle “looks” continuous -- one can imagine an old chimney collapsing to the ground:

The projection of the torus, $p : {\bf T}^2 \to {\bf S}^1$, might not be as obvious: it might seem that, in order to map the torus onto the equator, you'd have to tear it:

**Exercise.** Sketch the projection of the torus onto one of its meridians.

The projection of the square ${\bf I}^2=[0,1] \times [0,1]$ to the $x$-axis is continuous as the restriction of the map $p$ above. In fact, any subset in the plane can be projected to the $x$-axis:

This is the reason why the restrictions of the projection are also called by the same name. The idea is suggested by that of a *shadow*, such as the one of this arrow:

**Exercise.** Describe projections with a commutative diagram. Hint: consider inclusions.

**Theorem.**
$$(X \times Y) / _Y \approx X.$$

**Exercise.** Prove the theorem.

A more general concept is that of a *self-projection*, which is any self-map $P:X \to X$ that satisfies $PP=P$.

**Exercise.** Assuming that a self-projection $p$ is a realization of a cell map $P$, what can you say about $\det P_*$?

**Exercise.** Suggest a non-constant self-projection of the Mobius band.

## Products of complexes

If we are able to decompose a topological space into the product of two others, $Z=X\times Y$, we expect $X$ and $Y$ to be simpler than $Z$ and help us understand $Z$ better. An example is the torus as the product of two copies of the circle, which reveals the two tunnels. To make this idea more precise, we need to apply it to spaces that are realizations of complexes. The hope is that our data structure will follow the topology: $$|K\times L|\approx |K|\times |L|,$$ and we may be able to express the homology groups of $K\times L$ in terms of those of $K$ and $L$. To start on this path, we need to define the product of two complexes.

The construction should be very simple: for two complexes $K,L$, let $$K\times L:=\{a\times b:\ a\in K, b\in L\}.$$ These are the pairwise products of every cell of $K$ with every cell of $L$, of all (possibly different) dimensions! What's left is to define the product of two cells.

Simplicial complexes have proven to be the easiest to deal with, until now. The problem we encounter is at the very beginning: the product of two simplices isn't a simplex! It's a prism:

Then, we can't study the product $K\times L$ of simplicial complexes without further triangulation.

**Exercise.** Triangulate the prism.

Fortunately, *cubical complexes* have no such problem because the product of two cubes is a cube! In fact, the product of an $n$-cube and an $m$-cube is an $(n+m)$-cube:
$${\bf I}^n \times {\bf I}^m = {\bf I}^{n+m}.$$
In other words, if $a$ and $b$ are $n$- and $m$-cells respectively, then $a \times b$ is an $(n+m)$-cell.

**Example (segment times segment).** Suppose we have two copies of the complex that represents the segment:

Then $K$ has $3$ cells:

- $0$-cells: $A$ and $B$,
- $1$-cell: $a$;

and $L$ has $3$ cells too:

- $0$-cells: $C$ and $D$,
- $1$-cell: $b$.

Now, to compute the product of $K$ and $L$, we take a cell $x$ from $K$ and a cell $y$ from $L$, and add their product $x \times y$ to the list. Then $K \times L$ has $3 \times 3=9$ cells:

- $0$-cells: $A \times C,\ A \times D,\ B \times C,\ B \times D$;
- $1$-cells: $A \times b,\ B \times b,\ a \times C,\ a \times D$;
- $2$-cells: $a \times b$.

What we have is the complex of the square.

$\square$
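The cell counting in this example is easy to automate. A minimal Python sketch, with the cells named as in the example and dimensions adding under products:

```python
from itertools import product
from collections import Counter

# The two segment complexes: cell name -> dimension.
K = {'A': 0, 'B': 0, 'a': 1}
L = {'C': 0, 'D': 0, 'b': 1}

# K x L: every cell of K times every cell of L; dimensions add.
KxL = {(x, y): K[x] + L[y] for x, y in product(K, L)}

print(len(KxL))               # 9 cells in total
print(Counter(KxL.values()))  # 4 vertices, 4 edges, 1 face: the square
```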

**Example (hollow square and segment).** More complicated is the product of the complexes of a hollow square and a segment:

We have for $K \times L$:

- $8$ $0$-cells,
- $8$ $1$-cells,
- $4$ $2$-cells.

This is the finite analogue of the topological product of the circle and the segment, ${\bf S}^1 \times {\bf I}$. The result is the cylinder, as expected.

$\square$

**Exercise.** Finish the example.

Back to simplicial complexes. It is so easy to build new simplices from old! For example, adding a new vertex gives us several:

It is simply a matter of adding the new vertex to the list, which defines a certain operation on simplices:

- $A \cdot C=AC$;
- $B \cdot C=BC$;
- $a \cdot C=AB \cdot C=ABC=aC$.

How do we build an $(n+1)$-simplex from an $n$-simplex? Suppose we have $n+1$ geometrically independent points $A_0 ,...,A_n \in {\bf R}^N$ that represent the $n$-simplex
$$\sigma := A_0 ...A_n.$$
Suppose another vertex $A_{n+1} = Q$ is also available. If the vertex is geometrically independent from the rest, we have a desired $(n+1)$-simplex
$$\tau := A_0 ...A_nA_{n+1}.$$
This is called the *cone construction*.

Meanwhile, on the data side, we are simply adding a new element to the list of vertices of the original simplex, as above.

The construction suggests a simple idea of how to define “products” of simplices -- just combine their lists of vertices!

**Definition.** The *join* of an $n$-simplex $\sigma = A_0 ... A_n$ and an $m$-simplex $\tau = B_0 ... B_m$ is the $(n+m+1)$-simplex
$$\sigma \cdot \tau := A_0 ... A_nB_0 ... B_m.$$

For example, the $3$-simplex is the join of two $1$-simplices:

The order of this operation matters since it affects the orientation of the new simplex.
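On the data side, the join amounts to list concatenation. A minimal Python sketch (the vertex labels are illustrative):

```python
def join(sigma, tau):
    """Join of two simplices given as ordered vertex lists."""
    return sigma + tau

def dim(simplex):
    return len(simplex) - 1

edge1, edge2 = ['A', 'B'], ['C', 'D']  # two 1-simplices
t = join(edge1, edge2)                 # the 3-simplex ABCD
print(t)                               # ['A', 'B', 'C', 'D']
print(dim(t) == dim(edge1) + dim(edge2) + 1)  # True
```

The tetrahedron appears here as the join of two disjoint edges, in agreement with the dimension count.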

**Exercise.** Prove:
$$\sigma \cdot \tau = (-1)^{(\dim \sigma +1)(\dim \tau +1)}\tau \cdot \sigma.$$

**Definition.** The *join of two simplicial complexes* is defined as the set of all pairwise joins of the simplices (including the empty simplex) of the two complexes:
$$K \cdot L = \{ a \cdot b:\ a\in K, b\in L\}.$$

**Exercise.** Prove that $K \cdot L$ is a simplicial complex.

**Exercise.** Represent these simplicial complexes as joins:

- the segment, the disk, the $n$-ball;
- the circle, the sphere, the $n$-sphere.

**Exercise.** Describe the geometric realization of the join by expressing $|K\cdot L|$ in terms of $|K|,|L|$.

## Chains in products

How does taking products of complexes affect the homology? Is the homology group of the product equal to the product of the homology groups? It is seemingly confirmed by this example: $$H_1({\bf T}^2) = H_1({\bf S}^1 \times {\bf S}^1) \cong H_1({\bf S}^1) \times H_1({\bf S}^1).$$

We need to understand what happens to the chain groups first.

According to the last subsection, the *cells* of $K$ are “cross-multiplied” with those of $L$:

- an $i$-cell in $K$ and a $j$-cell in $L$ are combined to create an $(i+j)$-cell in $K \times L$.

As linear combinations of cells, the *chains* of $K$ are also “cross-multiplied” with those of $L$:

- an $i$-chain in $K$ and a $j$-chain in $L$ are combined to create an $(i+j)$-chain in $K \times L$.

Consequently, we won't try to compute the $k$th chain groups $C_k(K\times L)$ of the product from the chain groups $C_k(K)$ and $C_k(L)$ of the *same* degree. Instead, we'll look at the *complementary* dimensions. In other words,

- $C_k(K\times L)$ is found from the pair-wise products of the elements of $C_i(K)$ and the elements of $C_{j}(L)$ for all $i+j=k$.

Taken together, these correspondences create a map called the *cross product*:
$${\Large \times} :C_i(K) \times C_{j}(L) \to C_{i+j}(K\times L),$$
given by
$$(x,y)\mapsto x\times y.$$

**Proposition.** The cross product is

- *bilinear*, i.e., linear in either of the two arguments; and
- *natural*, i.e., for any cubical maps $f:K\to K',\ g:L\to L'$, we have
$$(f\times g)_{\Delta}(a\times b)=f_{\Delta}(a) \times g_{\Delta}(b).$$

The same idea will later apply to homology: the $k$th homology $H_k(K\times L)$ is expressed via the pairwise combinations of $H_i(K)$ and $H_{k-i}(L)$ for all $i$ (the Kunneth formula).

**Example (torus).** Let's consider the torus.

The torus itself, a $2$-manifold, is the product of the two circles, $1$-manifolds, and their $1$-homology groups are generated by the $1$-cycles, the fundamental classes, $a$ and $b$. Now, we conjecture that the $2$-cycle $\tau$ of the void of the torus, the fundamental class of this $2$-manifold, is constructed as the product of these two $1$-classes $a,b$.

This reasoning applies to the other homology classes of the torus. The longitude (or the latitude) of the torus, a $1$-manifold, is the product of a circle, a $1$-manifold, and a point, a $0$-manifold. We conjecture that the $1$-cycle represented by the longitude (latitude) is constructed as the product of the $1$-cycle of the first circle and the $0$-cycle of the second circle. Therefore, the identity we started with is just a coincidence!

Let's work out this example algebraically...

We list the chain groups of these two complexes and their generators: $$\begin{array}{lll} C_0(K)=< A_1,A_2,A_3,A_4 >, &C_1(K)=< a_1,a_2,a_3,a_4 >;\\ C_0(L)=< B_1,B_2,B_3,B_4 >, &C_1(L)=< b_1,b_2,b_3,b_4 >. \end{array}$$ Then for the product $$M=K\times L,$$ we have: $$\begin{array}{lll} C_0(M) = < A_i\times B_j :i,j=1,2,3,4 >;\\ C_1(M) = < A_i\times b_j, a_i\times B_j :i,j=1,2,3,4 >;\\ C_2(M) = < a_i\times b_j :i,j=1,2,3,4 >. \end{array}$$ $\square$

As we know, the boundary operator of a cubical complex can be defined on cells that are products of lower-dimensional cubes according to this Leibniz-type formula: $$\partial ^M(a \times b) = \partial ^K a \times b + (-1)^{\dim a}a \times \partial ^L b.$$ This formula can be extended to the chains and can serve as the boundary operator $$\partial^M: C_k(M) \to C_{k-1}(M)$$ of the product complex $M:=K \times L$. We also know that $$\partial^M\partial^M =0.$$
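Here is a Python sketch of this formula at work (the cells are those of the segment-times-segment example above; encoding chains as dictionaries of coefficients is an implementation choice). It applies the Leibniz rule to the $2$-cell of the square and confirms that $\partial\partial=0$:

```python
from collections import defaultdict

# The two segment complexes: cell -> dimension, cell -> boundary chain.
# A chain is a dict mapping a cell to its integer coefficient.
dim = {'A': 0, 'B': 0, 'a': 1, 'C': 0, 'D': 0, 'b': 1}
bd  = {'A': {}, 'B': {}, 'a': {'B': 1, 'A': -1},
       'C': {}, 'D': {}, 'b': {'D': 1, 'C': -1}}

def d_cell(x, y):
    """Leibniz rule: d(x.y) = dx.y + (-1)^dim(x) x.dy, as a chain."""
    out = defaultdict(int)
    for u, c in bd[x].items():
        out[(u, y)] += c
    sign = (-1) ** dim[x]
    for v, c in bd[y].items():
        out[(x, v)] += sign * c
    return {k: c for k, c in out.items() if c}

def d_chain(chain):
    out = defaultdict(int)
    for (x, y), c in chain.items():
        for cell, e in d_cell(x, y).items():
            out[cell] += c * e
    return {k: c for k, c in out.items() if c}

dab = d_cell('a', 'b')  # boundary of the 2-cell a x b of the square
print(dab)              # the four edges of the square, with signs
print(d_chain(dab))     # {}: the boundary of a boundary is zero
```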

**Example (torus).** Let's compute the boundary operator for the torus $M$:
$$\begin{array}{lll}
\partial (A_i\times B_j) &= \partial A_i \times B_j + A_i \times \partial B_j =0;\\
\partial (A_i\times b_j) &= \partial A_i \times b_j + A_i \times \partial b_j =A_i \times \partial b_j;\\
\partial (a_i\times B_j) &= \partial a_i \times B_j - a_i \times \partial B_j =\partial a_i \times B_j;\\
\partial (a_i\times b_j) &= \partial a_i \times b_j - a_i \times \partial b_j .
\end{array}$$

What we see is this: $$\ker \partial ^M = \ker \partial ^K \times \ker \partial ^L ,$$ $$\operatorname{Im} \partial ^M = \operatorname{Im} \partial ^K \times \operatorname{Im} \partial ^L .$$ $\square$

**Exercise.** Confirm these identities for the torus example.

**Exercise.** Consider the projections:
$$p_K: K \times L \to K,\ p_L: K \times L \to L,$$
and find their chain maps:
$$C(K \times L) \to C(K) ,\ C(K \times L) \to C(L).$$

Under products, the chains are easy to handle but, as we shall see, the homology classes aren't...

There is an analog of the Leibniz-like formula above for the joins.

**Proposition.** For two simplicial complexes $\{K,\partial ^K\},\ \{L,\partial ^L\}$, the boundary operator of their join $K\cdot L$ is given by:
$$\partial ^M(a \cdot b) = \partial ^K a \cdot b + (-1)^{n+1}a \cdot \partial ^L b,$$
where $n=\dim a$.

**Exercise.** Prove the proposition.

## The Universe: 3-sphere, 3-torus, or something else?

Our everyday experience suggests that the universe is $3$-dimensional and, probably, a manifold. In addition, we will assume that this manifold is orientable, compact, with empty boundary.

But first let's try to understand the “lower-dimensional universes”...

If the universe is $1$-dimensional, it's a topological circle. To understand how it feels to live in such a world, we thicken this circle: $${\bf S}^1\times {\bf I} \times {\bf I}.$$ We still assume that the light propagates strictly parallel to this circle. Then the light makes round trips around this circle and this is what an observer would see:

If the universe is $2$-dimensional, there are many choices. Once again, we thicken this surface so that we can fit inside it: $${\bf M}^2\times {\bf I}.$$ We still assume that the light propagates along $M$.

The simplest choice for $M$ is the sphere ${\bf S}^2$. Here the light comes back no matter in what direction it leaves. This is what an observer would see:

If there is nothing else in the universe, you'd see your own, panoramic image stretched around the walls with the back of your head in front of you.

The second simplest choice for $M$ is the torus ${\bf T}^2$. Here the light goes through the wall on the left and reappears from the right. Or it goes through the wall in front and reappears from the back. This is what an observer would see:

If there is nothing else in the universe, you'd see your own image in all four directions repeated infinitely many times. Through these observations, we recognize the product of two $1$-dimensional universes seen as the two enfilades of rooms.

The triple torus will have the same effect, with three pairs of walls having this property, etc.

What about the real, $3$-dimensional universe?

There are even more choices. The $3$-sphere will be similar to the $2$-sphere. One will see the 3d image of himself stretched over the walls, the ceiling, and the floor.

The next option is the one we have seen before: the box with the opposite faces identified:

It is similar to the case of ${\bf T}^2$ with one's image repeated in six directions instead of four. And, one will see three enfilades of rooms instead of two. This suggests that the space is the product of three copies of the circle, which is the $3$-dimensional analogue of the torus: $${\bf T}^3={\bf S}^1 \times {\bf S}^1 \times{\bf S}^1 .$$

**Exercise.** Prove this statement.

These two spaces *look* different from the inside, but how do we know that they *are* different? Just consider the homology groups:
$$\begin{array}{llllllllllll}
H({\bf S}^3) &=\{{\bf Z}, &0, &0, &{\bf Z}, &0,...\},\\
H({\bf T}^3) &=\{{\bf Z}, &{\bf Z}\oplus{\bf Z}\oplus{\bf Z}, &{\bf Z}\oplus{\bf Z}\oplus{\bf Z}, &{\bf Z}, &0,...\}
\end{array}$$
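The ranks in this table can be produced mechanically from the Kunneth-style rule mentioned earlier, $b_k(X\times Y)=\sum_i b_i(X)\, b_{k-i}(Y)$, which applies here since all the homology involved is free. A Python sketch:

```python
def betti_of_product(bx, by):
    """Betti numbers of X x Y from those of X and Y:
    b_k = sum over i of b_i(X) * b_(k-i)(Y)  (no torsion here)."""
    out = [0] * (len(bx) + len(by) - 1)
    for i, p in enumerate(bx):
        for j, q in enumerate(by):
            out[i + j] += p * q
    return out

circle = [1, 1]                        # H(S^1) = {Z, Z, 0, ...}
t2 = betti_of_product(circle, circle)  # [1, 2, 1]: the torus
t3 = betti_of_product(t2, circle)
print(t3)                              # [1, 3, 3, 1]: matches H(T^3) above
```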

**Exercise.** What if the universe is built from a tetrahedron?

This is how ${\bf R}^3$ can be decomposed into infinitely many tori:

**Exercise.** Consider the universe made of two solid tori glued to each other along their boundaries. What is it homeomorphic to?

Another candidate for the model of the universe is the *Poincare homology sphere*. It is called this because, even though it is not homeomorphic to the $3$-sphere, their homology groups coincide. This space is built from a regular dodecahedron by gluing together its opposite faces, $6$ pairs of pentagons:

Since each face has to be turned $36$ degrees for gluing, this is what an observer would see:

Now, does the universe have to be orientable? Can one come back as a mirror image of himself?

**Exercise.** Devise a universe where such a trip is possible.

**Exercise.** From these examples of molecules, choose ones that would survive intact such an around the (non-orientable) world trip:

## Configuration spaces

Suppose we have a robotic arm with a single joint, i.e., just a *rotating rod*. What is the set of all possible positions of its end? It's the circle ${\bf S}^1$ parametrized in the standard way:
$$x=R\cos \phi,\ y=0, z=R\sin\phi,$$
where $\phi$ is the angle of the rod and $R$ is the (fixed) length of the rod.

Next, if this is a *ball joint* the set of possibilities is the sphere ${\bf S}^2$.

Further, the set of possible positions of a *two-joint* arm is more complicated:

The locations can be parametrized, but not uniquely, by $$x=R_1\cos \phi_1 + R_2 \cos\phi_2,\ y=0,\ z=R_1\sin \phi_1 + R_2 \sin\phi_2,$$ where $\phi_1,\phi_2$ are the angles of the two arms and $R_1,R_2$ are the lengths of the arms.
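The operational space can be probed numerically. A Python sketch (the rod lengths $R_1=2,\ R_2=1$ are chosen just for illustration) samples the two angles and records the distance of the end of the arm from the origin; the observed distances fill the interval $[|R_1-R_2|,\ R_1+R_2]$:

```python
import math
import random

random.seed(0)
R1, R2 = 2.0, 1.0
radii = []
for _ in range(100_000):
    p1 = random.uniform(0, 2 * math.pi)
    p2 = random.uniform(0, 2 * math.pi)
    x = R1 * math.cos(p1) + R2 * math.cos(p2)
    z = R1 * math.sin(p1) + R2 * math.sin(p2)
    radii.append(math.hypot(x, z))

print(round(min(radii), 2), round(max(radii), 2))  # close to 1.0 and 3.0
```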

**Exercise.** Prove that it's either the disk or the annulus.

Meanwhile, the space of *states* of the arm is simply the set of all possible values of the angles $\phi_1,\phi_2$. That's the torus:
$${\bf T}^2={\bf S}^1 \times {\bf S}^1 .$$

This is the reason why we separate the two:

- the *operational space*, the set of all positions reachable by the robot's end in space, and
- the *configuration space*, the set of all possible combinations of the positions of the joints.

For the two-joint arm, we could make these two homeomorphic to each other if we make the axes of the joints (first red, second green) perpendicular to each other with the latter shorter than the former:

This can't happen with $3$ or more joints as there will be $n\ge 3$ parameters. The configuration space of an $n$-joint arm (also called a "linkage") is the $n$-torus: $${\bf T}^n={\bf S}^1 \times ... \times {\bf S}^1 ,$$ which won't fit (can't be embedded) into our $3$-dimensional Euclidean space.

In general, such a setup might be much more complicated:

**Exercise.** Find the configuration space of this robotic arm. Hint: Don't forget the fingers!

So, the positions of the joints are used as the parameters of the state of the robot.

If the motion of the joints is independent (and we ignore the possible self-intersections), *the configuration space is the product of the configuration spaces of the joints*.

For example, a telescoping arm with a joint has the cylinder, $C=[0,1] \times {\bf S}^1$, as the configuration space.

Furthermore, even if the arm has delivered its end to the intended *location*, the task at hand, such as spray-painting, may require a particular *direction*:

This would add an extra factor of ${\bf S}^2$ to the product.

These examples suggest that configuration spaces should be manifolds.

**Exercise.** What is the configuration space of a pendulum attached to a spring?

**Exercise.** What is the configuration space of the three-joint arm the end of which is fixed? Hint: The answer will depend on the relative lengths of the rods.

The robot's *forward and inverse kinematics equations* define functions from its configuration space to the operational space. These functions are then used for motion planning.

In physics, an example of a configuration space is *the state space of $n$ particles*. In the simplest setting, it's
$$C=({\bf R}^3)^n.$$
Every configuration corresponds to a single point in this space.

The answer changes, however, if we drop the implicit assumption that the particles are *distinguishable*. In that case, the configuration space is
$$C=({\bf R}^3)^n/_{\sim},$$
where $\sim$ is an equivalence relation derived from the identification of each pair of particles.

**Exercise.** What is the configuration space of two and three particle systems with identical particles?

Another issue is, can two particles occupy the same location? If the answer is No, we have to exclude the diagonals from the configuration space: $$C=({\bf R}^3)^n \setminus D,$$ where $$D=\{(u_1,...,u_n)\in ({\bf R}^3)^n:\ \exists i \ne j,u_i = u_j\}.$$
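Membership in this "safe" configuration space $C=({\bf R}^3)^n \setminus D$ amounts to checking that no two positions coincide; a minimal sketch (the helper name `is_safe` is illustrative):

```python
from itertools import combinations

def is_safe(config):
    # A configuration (a tuple of particle positions) lies in
    # C = (R^3)^n \ D exactly when no two positions coincide,
    # i.e., no point of a diagonal u_i = u_j is hit.
    return all(u != v for u, v in combinations(config, 2))

print(is_safe(((0, 0, 0), (1, 0, 0), (0, 2, 0))))  # → True
print(is_safe(((0, 0, 0), (1, 0, 0), (1, 0, 0))))  # → False: two particles collide
```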

This idea is used for “collision avoidance” in robotics by constructing such a “safe” configuration space. On a line segment, this configuration space of three robots will be a cube with three planes cut out:

**Exercise.** How many path-connected components does this space have?

If two can pass each other (but not three), the configuration space is a cube with the diagonal drilled out:

It is connected!

**Exercise.** Find the safe configuration space for two robots on a line, the circle, the tripod. What if they are connected by a rod? a rope?

In general, if the motion of each of the $n$ particles takes place within a manifold $M$ of dimension $m$, then the configuration space is also a manifold, of dimension $mn$.

If molecular *bonds* are present, things are more complicated. Suppose, for example, we have two atoms in space. Then their configuration space is $({\bf R}^3)^2$. Now suppose this is a two-atom molecule with two (different) atoms. Then its state is described, independently, by

- the location of its center of mass (or either of the two atoms as a point of reference) and
- the orientation of one of the atoms with respect to that point (the distance is fixed).

Therefore, the configuration space is ${\bf R}^3 \times {\bf S}^2$.

**Exercise.** What if this is a two-atom molecule with two *identical* atoms?

One can also take into account the velocity (or the momentum) of each particle moving within manifold $M$. These velocities combined give us what we call the tangent bundle $TM$ of $M$. The totality of all possible positions and velocities is called the *phase space* of the system.

**Exercise.** Show that the phase space of a pendulum is the cylinder.

## Homology of products: the Künneth formula

What happens to the homology when we take the product of two complexes (cubical, in the simplest case)? In other words, given two complexes $K$ and $L$, we want to express $H(K \times L)$ entirely in terms of $H(K)$ and $H(L)$.

It would be naive to assume that the answer is “it's the product of the homology groups”, because we have already seen that taking the direct sums of the homology groups of the same dimension doesn't produce the desired results: $$H_2({\bf T}^2)=H_2({\bf S}^1 \times {\bf S}^1) = {\bf R} \ne H_2({\bf S}^1) \oplus H_2({\bf S}^1) = 0 \oplus 0 = 0.$$

As discussed above, we need instead to look at the *complementary* dimensions. After all, the product of an $n$-cube in $K$ and an $m$-cube in $L$ is an $(n+m)$-cube in $K\times L$. For example, to find all $2$-cubes in $K\times L$, one has to look at all the products of $1$-cubes in $K$ with $1$-cubes in $L$, as well as $0$-cubes with $2$-cubes. More generally, we look at the algebraic decompositions of the dimension $k$ of the cubes we are considering:

- $k=1=1+0=0+1$;
- $k=2=2+0=1+1=0+2$;
- $k=3=3+0=2+1=1+2=0+3$;
- ...
- $k=k+0=(k-1)+1=(k-2)+2=...=0+k$.

There are exactly $k+1$ such decompositions of $k$.
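The count is easy to confirm by listing the decompositions; `decompositions` is an illustrative helper, not from the text:

```python
def decompositions(k):
    # All ordered decompositions k = i + j with i, j >= 0.
    return [(i, k - i) for i in range(k + 1)]

print(decompositions(2))       # → [(0, 2), (1, 1), (2, 0)]
print(len(decompositions(5)))  # → 6, i.e., exactly k + 1 of them
```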

This idea of decomposing the dimension $k$ applies to the chains next. Each decomposition of $k$ corresponds to a component of $C_k(K\times L)$: $$C_k(K) \ \& \ C_{0}(L),\quad C_{k-1}(K) \ \& \ C_{1}(L),\quad ... \quad, C_0(K) \ \& \ C_{k}(L).$$ It follows then that the $k$-homology is the sum of the combinations: $$H_k(K) \ \& \ H_{0}(L),\quad H_{k-1}(K) \ \& \ H_{1}(L),\quad ... \quad, H_0(K) \ \& \ H_{k}(L).$$ However, how exactly do we combine each of these pairs, $$V=H_i(K),\ W=H_j(L),\ i+j=k?$$ After all, we know that the product won't work.

We provide the definition of this operation for two arbitrary vector spaces $V$ and $W$ over field $R$.

First, we consider the *product set* $V \times W$ of the vector spaces as sets (rather than vector spaces), so that it consists of all pairs $(v, w)$ with $v\in V$ and $w\in W$.

Second, we define the *free vector space* $< V \times W >$ of this set as the vector space of all formal linear combinations of the elements of $V \times W$. In other words, $V \times W$ serves as its basis; the relations between the elements of $V$ and $W$ are lost in the new vector space.

Third, we consider a certain *quotient vector space* of $< V \times W >$ as follows.

We consider the subspace $Z$ of $< V \times W >$ generated by the following elements: $$\begin{array}{llllll} (v_1, w) + (v_2, w) - (v_1 + v_2, w),\\ (v, w_1) + (v, w_2) - (v, w_1 + w_2),\\ c \cdot (v, w) - (cv, w), \\ c \cdot (v, w) - (v, cw), \end{array}$$ where $$v, v_1,v_2\in V,\ w, w_1,w_2\in W,\ c\in R.$$ Note: To simplify notation, we drop the coefficient if it is equal to $1$, i.e., $(v, w)$ stands for $1 \cdot (v, w) \in < V \times W >$.

Then the *tensor product* of $V$ and $W$ is defined to be
$$V \otimes W := < V \times W > / Z.$$

Also, the tensor product of two vectors $v \in V$ and $w \in W$ is the equivalence class (coset) of $(v,w)$:
$$v \otimes w:=(v,w) + Z \in V\otimes W.$$
The elements of this form are called *elementary tensors*.

**Exercise.** Prove that, if $B_V,B_W$ are bases of $V,W$, then
$$\{v \otimes w:\ v\in B_V,\ w\in B_W\}$$
is a basis of $V \otimes W$. Hint: Not all of the elements of the tensor product are elementary tensors.

It follows that these equations hold in $V\otimes W$: $$\begin{array}{llllll} (v_1 + v_2) \otimes w &= v_1 \otimes w + v_2 \otimes w;\\ v \otimes (w_1 + w_2) &= v \otimes w_1 + v \otimes w_2;\\ cv \otimes w &= v \otimes cw = c(v \otimes w). \end{array}$$
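In coordinates, once bases are chosen, an elementary tensor $v \otimes w$ is represented by all pairwise products of the coordinates of $v$ and $w$, and the identities above can be checked numerically. A minimal sketch (the helpers `tensor` and `add` are illustrative):

```python
def tensor(v, w):
    # Coordinates of the elementary tensor v ⊗ w in the basis
    # {e_i ⊗ f_j}: all pairwise products of coordinates.
    return [vi * wj for vi in v for wj in w]

def add(a, b):
    # Coordinatewise sum of two vectors.
    return [x + y for x, y in zip(a, b)]

v1, v2, w = [1, 2], [0, 3], [4, 5, 6]

# Bilinearity: (v1 + v2) ⊗ w = v1 ⊗ w + v2 ⊗ w
print(tensor(add(v1, v2), w) == add(tensor(v1, w), tensor(v2, w)))  # → True
print(len(tensor(v1, w)))  # → 6 = dim V · dim W
```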

**Exercise.** Prove that $R \otimes V=V$ over field $R$.

Our main result is stated below without proof (see Bredon, *Topology and Geometry*, p. 320).

**Theorem (Künneth Formula).** For two cell complexes $K,L$, the homology over ${\bf R}$ is given by:
$$H_k(K\times L) \cong \bigoplus _i H_i(K) \otimes H_{k-i}(L).$$

For the computations we will need, we'll rely on only these two properties of the tensor product:

**Proposition.**

- ${\bf R} \otimes {\bf R} = {\bf R},$
- ${\bf R} \otimes 0 = 0 \otimes {\bf R} = 0.$

**Example.** Let's consider the torus and compute the sum of all tensor products of the homology groups of dimensions that add up to $k$, for each $k$.

First, the ones that add up to $0$: $$\begin{array}{llllll} H_0({\bf T}^2) &=H_0({\bf S}^1 \times {\bf S}^1)\\ &= H_0({\bf S}^1) \otimes H_0({\bf S}^1)\\ &= {\bf R} \otimes {\bf R} = {\bf R} . \end{array}$$ Those that add up to $1$: $$\begin{array}{llllll} H_1({\bf T}^2) &=H_1({\bf S}^1 \times {\bf S}^1) \\ &= H_1({\bf S}^1) \otimes H_0({\bf S}^1) \ \oplus \ H_0({\bf S}^1) \otimes H_1({\bf S}^1) \\ &= {\bf R} \otimes {\bf R} \quad\oplus\quad {\bf R} \otimes{\bf R} \\ &= {\bf R} \quad\oplus\quad {\bf R} = {\bf R}^2 . \end{array}$$ Those that add up to $2$: $$\begin{array}{llllll} H_2({\bf T}^2) &=H_2({\bf S}^1 \times {\bf S}^1) \\ &= H_2({\bf S}^1) \otimes H_0({\bf S}^1) \ \oplus \ H_1({\bf S}^1) \otimes H_1({\bf S}^1) \ \oplus \ H_0({\bf S}^1) \otimes H_2({\bf S}^1)\\ &= 0 \otimes{\bf R} \quad\oplus\quad {\bf R} \otimes {\bf R} \quad\oplus\quad {\bf R} \otimes 0 \\ &= 0 \quad\oplus\quad {\bf R} \quad\oplus\quad 0 = {\bf R} . \end{array}$$

The results match our previous computations.

$\square$
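Over a field, the formula says that the Betti numbers of the product are obtained by "multiplying" the Betti numbers of the factors, dimension by dimension: $b_k(K\times L)=\sum_i b_i(K)\, b_{k-i}(L)$. This can be sketched as follows (the helper name `kunneth_betti` is illustrative):

```python
def kunneth_betti(bK, bL):
    # Betti numbers of K × L from those of K and L over a field:
    #     b_k(K × L) = sum over i of b_i(K) * b_{k-i}(L),
    # i.e., the product of the Poincaré polynomials.
    out = [0] * (len(bK) + len(bL) - 1)
    for i, bi in enumerate(bK):
        for j, bj in enumerate(bL):
            out[i + j] += bi * bj
    return out

circle = [1, 1]  # Betti numbers of the circle S^1
print(kunneth_betti(circle, circle))  # → [1, 2, 1]: the torus T^2, matching the example
```

Applying it once more reproduces the Betti numbers of the $3$-torus ${\bf T}^2\times{\bf S}^1$.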

**Exercise.** Use the formula to compute the homology of the $3$-torus.

**Exercise.** Sometimes the “naive product formula” does hold. Derive it from the theorem for $1$-homology:
$$H_1(K\times L) \cong H_1(K) \oplus H_1(L).$$

The tensor product is also defined for two modules over any ring $R$. Then things are made (even) more complicated by the presence of torsion. Fortunately, $H_p(K\times L)$ is only affected by the torsion of $H_k(K)$ and $H_k(L)$ for $k<p$. The result is the following generalization of the last formula.

**Theorem (Naive Product Formula).** If complexes $K,L$ are path-connected and, for integral homology, we have
$$H_k(K)=H_k(L)=0,\ k=1,2,...,p-1,$$
then
$$H_p(K \times L) \cong H_p(K) \oplus H_p(L).$$

The isomorphism $x\times y \mapsto (x,y)$ is natural.

**Exercise.** Prove that
$$H_1({\bf T}^n) = {\bf Z}^n.$$

**Exercise.** Prove that
$$H_p \left( \left( {\bf S}^p \right)^n \right) = {\bf Z}^n.$$