What does chaos do to information? When organization dissolves into static, is the information lost? Can it be recovered? We'll explore these questions through Arnold's cat map, a chaotic mapping rooted in the study of dynamical systems.

Arnold's cat map is the transformation defined by the matrix \(A_{cat} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ \end{bmatrix}\), with the result taken modulo \(1\). The mapping was first explored in the 1960s by the Russian mathematician Vladimir Arnold, who studied the transformation's effect on a picture of a cat, hence the mapping's whimsical name. Arnold's cat map is a mapping from the two-dimensional torus to itself, but Arnold demonstrated that this toroidal automorphism can be applied to a two-dimensional picture by considering the image as being wrapped around a torus.

Given a square image, each pixel can be treated as a position vector, and the modulo is taken with respect to the side length of the image. This is equivalent to considering the image as a unit square whose pixels sit at positions \((0,0) \leq (x,y) < (1,1)\). Arnold's cat map takes each pixel and moves it to a different location, or different position vector, under the transformation:

\(\begin{bmatrix} x_{k+1} \\ y_{k+1} \\ \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ \end{bmatrix} \begin{bmatrix} x_{k} \\ y_{k} \\ \end{bmatrix} \space mod \space n\), where \(n\) is the side length of the image and \(k\) counts iterations
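As a concrete sketch of this pixel-by-pixel rule (assuming the image is stored as a square NumPy array with the first index playing the role of \(x\)), one iteration of the map might look like:

```python
import numpy as np

def cat_map(image):
    """One iteration of Arnold's cat map on a square n x n image.

    The pixel at (x, y) moves to ((x + y) mod n, (x + 2y) mod n), which
    is the matrix [[1, 1], [1, 2]] applied to the position vector,
    taken modulo the side length n.
    """
    n = image.shape[0]
    assert image.shape[1] == n, "the map is defined for square images"
    result = np.empty_like(image)
    for x in range(n):
        for y in range(n):
            result[(x + y) % n, (x + 2 * y) % n] = image[x, y]
    return result
```

Because the matrix is invertible mod \(n\), no two pixels land on the same target cell, so `result` is a pure rearrangement of `image`.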

The determinant of \(A_{cat}\) is \(\begin{vmatrix} 1 & 1 \\ 1 & 2 \\ \end{vmatrix} = (1)(2) - (1)(1) = 2 - 1 = 1\). A determinant of \(1\) means that the transformation induced by \(A_{cat}\) is area-preserving. So given an \(n \space x \space n\) image, the transformation will produce another \(n \space x \space n\) image which is a rearrangement of the individual pixels of the original image.
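A quick numerical check of both claims (the size \(n = 7\) below is an arbitrary small example, not tied to any particular image):

```python
import numpy as np

A = np.array([[1, 1], [1, 2]])
assert round(np.linalg.det(A)) == 1  # unit determinant: area-preserving

# Because det(A) = 1, A is invertible mod n for every n, so the pixel
# map is a bijection: every cell of the n x n grid is hit exactly once.
n = 7
targets = {tuple((A @ np.array([x, y])) % n)
           for x in range(n) for y in range(n)}
assert len(targets) == n * n  # all n^2 cells hit: a pure rearrangement
```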

Arnold's cat map has the interesting property of displaying order and chaos simultaneously. Under iterated transformations, an image is distorted into apparently random static, but through successive applications of the mapping the original image is eventually reproduced exactly. Along the way, "ghost" images of the original sometimes appear, often inverted or as multiple tiled copies of the original picture. In the example below, a square \(100\space x\space 100\) pixel image is processed through Arnold's cat map until the original image is recovered. The number below each image gives the number of iterations of the mapping needed to generate it.
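The iterate-until-recovered loop can be sketched as follows, here with a vectorized version of the map. This is a minimal sketch assuming a single-channel square NumPy array; the recurrence time observed for a real picture also depends on the exact indexing conventions used.

```python
import numpy as np

def cat_map(image):
    """One iteration of Arnold's cat map (vectorized over all pixels)."""
    n = image.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    result = np.empty_like(image)
    # The map is a bijection, so these index arrays never collide.
    result[(x + y) % n, (x + 2 * y) % n] = image
    return result

def recurrence_time(image):
    """Number of iterations of the map before the image reappears."""
    current, k = cat_map(image), 1
    while not np.array_equal(current, image):
        current, k = cat_map(current), k + 1
    return k
```

For example, a \(2 \times 2\) image with four distinct pixels returns after \(3\) iterations, and a \(3 \times 3\) image with nine distinct pixels returns after \(4\).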

The following are a few select iterations of a larger version (\(162\space x\space 162\) pixels) of the same cat image in order to show the detail of the potential patterns and ghost images. While the same image had an iterative period of \(60\) when sized at \(100\space x\space 100\) pixels, the \(162\space x\space 162\) pixel version has an iterative period of \(216\).

The following are a few select iterations of a still larger version (\(220\space x\space 220\) pixels) of the cat image in order to show a few patterns and ghost images. This \(220\space x\space 220\) pixel version has an iterative period of \(24\).

One might expect that the larger the image, the longer the iterative period. However, although the period increased from \(60\) for the \(100\space x\space 100\) image to \(216\) for the \(162\space x\space 162\) image, it dropped dramatically to \(24\) for the \(220\space x\space 220\) image. In fact, although the topic has been explored, there is no known simple formula for the Arnold's cat map period of an image in terms of its size or number of pixels.
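This irregularity is easy to observe directly by computing, for each size \(n\), the smallest \(k\) with \(A_{cat}^{k} \equiv I \space mod \space n\). That is the period for an image whose pixels are all distinct; images with repeated colors or symmetries can recur sooner. A sketch:

```python
import numpy as np

A = np.array([[1, 1], [1, 2]], dtype=np.int64)
I = np.eye(2, dtype=np.int64)

def matrix_period(n):
    """Smallest k with A^k = I (mod n): the cat-map period for size n."""
    M, k = A % n, 1
    while not np.array_equal(M, I):
        M, k = (M @ A) % n, k + 1
    return k

# The periods jump around rather than growing steadily with n.
periods = {n: matrix_period(n) for n in range(2, 16)}
print(periods)
```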

There are other ways in which we can explore Arnold's cat map. We can use the eigenvalues and the eigenvectors of the matrix \(A_{cat}\) to understand precisely how the mapping transforms the image during each iteration.

To find the eigenvalues of \(A_{cat}\):

$$ \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ \end{bmatrix} \rightarrow \begin{bmatrix} 1-\lambda & 1 \\ 1 & 2-\lambda \\ \end{bmatrix} $$

$$ (1 - \lambda )(2 - \lambda ) - 1 = 0 $$

$$ \lambda^{2} - 3\lambda + 1 = 0 $$

$$ \lambda = \frac{-(-3) \pm \sqrt{(-3)^{2} - 4(1)(1)}}{2(1)} = \frac{3 \pm \sqrt{5}}{2} $$

So, there are two eigenvalues for \(A_{cat}\):

$$ \lambda_1 = \frac{3 + \sqrt{5}}{2} \space \space \space \space \space \space \space \lambda_2 = \frac{3 - \sqrt{5}}{2} $$

To find the eigenvectors of \(A_{cat}\):

$$
(A_{cat}-\lambda_{1}I)\vec{x_{1}} = \vec{0}
$$

$$ \begin{bmatrix} 1-\lambda_1 & 1 \\ 1 & 2-\lambda_1 \\ \end{bmatrix} \begin{bmatrix} x_{1} \\ y_{1} \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix} $$

$$ \begin{bmatrix} 1-\frac{3 + \sqrt{5}}{2} & 1 \\ 1 & 2-\frac{3 + \sqrt{5}}{2} \\ \end{bmatrix} \begin{bmatrix} x_{1} \\ y_{1} \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix} $$

$$ (1-\frac{3 + \sqrt{5}}{2})x_{1} + (1)y_{1} = 0 \rightarrow (\frac{-1 - \sqrt{5}}{2})x_{1} + y_{1} = 0 $$

$$ y_{1} = (\frac{1 + \sqrt{5}}{2})x_{1} $$

So, for \(\lambda_{1}\), the associated eigenvector is:

$$ \begin{bmatrix} 1 \\ \frac{1 + \sqrt{5}}{2} \\ \end{bmatrix} $$

$$
(A_{cat}-\lambda_{2}I)\vec{x_{2}} = \vec{0}
$$

$$ \begin{bmatrix} 1-\lambda_2 & 1 \\ 1 & 2-\lambda_2 \\ \end{bmatrix} \begin{bmatrix} x_{2} \\ y_{2} \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix} $$

$$ \begin{bmatrix} 1-\frac{3 - \sqrt{5}}{2} & 1 \\ 1 & 2-\frac{3 - \sqrt{5}}{2} \\ \end{bmatrix} \begin{bmatrix} x_{2} \\ y_{2} \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix} $$

$$ (1-\frac{3 - \sqrt{5}}{2})x_{2} + (1)y_{2} = 0 \rightarrow (\frac{-1 + \sqrt{5}}{2})x_{2} + y_{2} = 0 $$

$$ y_{2} = -(\frac{1 + \sqrt{5}}{2})^{-1}x_{2} $$

So, for \(\lambda_{2}\), the associated eigenvector is:

$$ \begin{bmatrix} 1 \\ -(\frac{1 + \sqrt{5}}{2})^{-1} \\ \end{bmatrix} $$

The fraction \(\frac{1 + \sqrt{5}}{2}\) is the golden ratio, \(\varphi\).

So, the two eigenvalues are \(1 + \varphi\) and \(2 - \varphi\), with eigenvectors \(\begin{bmatrix} 1 \\ \varphi \\ \end{bmatrix}\) and \(\begin{bmatrix} 1 \\ -\varphi^{-1} \\ \end{bmatrix}\) respectively.

The eigenvalues give the factor by which the image is stretched in the direction of each associated eigenvector. So the image is stretched in the direction of \(\begin{bmatrix} 1 \\ \varphi \\ \end{bmatrix}\) by a factor of \(1 + \varphi\), and compressed in the direction of \(\begin{bmatrix} 1 \\ -\varphi^{-1} \\ \end{bmatrix}\) by a factor of \(2 - \varphi\). The streaks that appear in the iterated images run in the directions of these two eigenvectors.
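These facts are easy to verify numerically; a standalone check, not tied to any particular image:

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 2.0]])
phi = (1 + np.sqrt(5)) / 2  # the golden ratio

lam1, lam2 = 1 + phi, 2 - phi          # the two eigenvalues
v1 = np.array([1.0, phi])              # stretched direction
v2 = np.array([1.0, -1 / phi])         # compressed direction

assert np.allclose(A @ v1, lam1 * v1)  # stretch by 1 + phi (about 2.618)
assert np.allclose(A @ v2, lam2 * v2)  # shrink by 2 - phi (about 0.382)
assert np.isclose(lam1 * lam2, 1.0)    # stretch times shrink = det(A) = 1
```

The last line is the area-preservation property again: expansion along one eigenvector is exactly cancelled by compression along the other.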

Finally, the modulo \(n\) part of the transformation brings the entire area of the new, deformed image back within the square bounds of the original. Let the side length of each square below be \(n\) units. Under the \(A_{cat}\) transformation, the image is stretched so that the yellow, blue, and green sections fall outside the original square. After the modulo is applied, each coordinate of a pixel's transformed position is replaced by its remainder upon division by \(n\), which pulls the yellow, blue, and green pieces back into the original square.
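For instance, with a hypothetical side length of \(n = 100\), a pixel at \((70, 80)\) is carried outside the square and then wrapped back:

```python
import numpy as np

A = np.array([[1, 1], [1, 2]])
n = 100                   # hypothetical image side length

p = np.array([70, 80])    # a pixel near the far corner
stretched = A @ p         # [150, 230]: lands outside the n x n square
wrapped = stretched % n   # [50, 30]: each coordinate's remainder mod n

assert (wrapped >= 0).all() and (wrapped < n).all()
```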

- - - - - - - - - -

What is information, and what does it look like? Does disorder always contain deep-rooted information, and is it just a matter of looking in a different way to find it?

Some of the images produced by Arnold's cat map feature multiple "ghost images" of the original cat. During the transformations, no pixels are added or removed from the image; they are simply rearranged. The two sets of pink ears in image \(30\) of the \(100\space x\space 100\) pixel set are made entirely from the pink pixels in the single set of ears in the original image. The same is true for the \(9\) sets of ears in image \(72\) of the \(162\space x\space 162\) pixel set and the \(441\) sets of ears in image \(12\) of the \(220\space x\space 220\) pixel set. In this way the transformation has, at times, a replicating effect, although the replications necessarily feature reduced resolution because of the fixed number of available pixels of each color.
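That conservation is easy to confirm on a toy image: a random grid standing in for the cat picture, with `cat_map` a minimal reimplementation of the mapping described above.

```python
import numpy as np

def cat_map(image):
    """One iteration of Arnold's cat map on a square image."""
    n = image.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    result = np.empty_like(image)
    result[(x + y) % n, (x + 2 * y) % n] = image
    return result

rng = np.random.default_rng(0)
image = rng.integers(0, 8, size=(50, 50))  # a toy "image" with 8 colors
iterated = image
for _ in range(12):                        # any number of iterations
    iterated = cat_map(iterated)

# Pixels are only rearranged, never created or destroyed, so the count
# of each color is identical before and after.
counts_before = np.bincount(image.ravel(), minlength=8)
counts_after = np.bincount(iterated.ravel(), minlength=8)
assert np.array_equal(counts_before, counts_after)
```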

The perfect restoration of the original image at the end of the iterative period speaks to a preservation of information throughout the series of transformations. These ghostly multiplied reconstructions are glimpses of that preservation amid the chaotic static of the mapping's period.

Notice that for each of the different image sizes, the first application of the transformation (in each case: image \(01\)) looks the same. Arnold's cat map begins by transforming the images identically, but the sets of images produced over an entire period vary greatly between the three sizes. Patterns appear in each image set which do not appear in the others. Consider the algorithm used to generate the images, and try to determine why the image sets look identical toward the beginning and end of their periods but deviate into unique patterns through the middle.

During periods of chaos in any given situation, how can we look for preserved information? When chaos takes over and you feel lost within the static of a situation, when your vision of the present or future becomes hazy, when you lose the context of the moment and your trajectory blurs, look for ghost images of the big picture. Perhaps they are fuzzy, distorted, or simply quite small, but they hint at a larger long-term vision which will eventually reemerge, sharp and clear.