Friday 29 December 2017

Escape Condition for Julia and Mandelbrot Fractals

This post is reproduced from the blog for my next book, Make Your Own Algorithmic Art.



This post isn't a tutorial on creating the Julia or Mandelbrot fractals - you can learn about that here. Here we'll focus on a specific bit of mathematics, the "escape condition", which many guides state but fail to explain.



Basic Idea

The basic idea behind both Julia and Mandelbrot fractals is to take a starting complex number $z_0$, apply a simple function $z^2 + c$, and feed the output $z_1$ back into this function as input. Doing this repeatedly we get a sequence of output complex numbers, $z_1$, $z_2$, $z_3$ ...

$$ z_{n+1} = z_{n}^2 + c$$
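
As a concrete illustration, here is a minimal Python sketch of this feedback loop, using Python's built-in complex number type. The starting value and the constant are arbitrary example values, not special ones.

# minimal sketch of the z -> z^2 + c feedback loop
# z0 and c are arbitrary example values

z = complex(0.3, 0.4)      # starting value z_0
c = complex(0.35, 0.35)    # the constant c

for n in range(10):
    z = z*z + c            # z_(n+1) = z_n^2 + c
    print(n + 1, z, abs(z))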

These output values can behave in one of three ways:
  • They can get larger and larger, towards infinity. This is called escaping.
  • They can get smaller and smaller, towards zero.
  • They can orbit around, but without ever getting larger. In some cases they can approach a finite constant.

The fractal patterns are coloured to show which points escape, and which don't. Those that escape are outside the fractal, and those that don't are inside.


The difference between Julia and Mandelbrot fractals is only how we choose $z_0$ and $c$.


Computational Problem

The behaviour of complex numbers under $z^2 + c$ is chaotic. That is:
  • The output sequence is very often irregular, and predicting future values is very difficult without actually working out the sequence. They seem random, even if they're not.
  • The sequence is very sensitive to starting conditions. That is, even a tiny change in the starting conditions can drastically change how the sequence behaves.

We can't derive a mathematical formula that tells us which points are inside the fractal and which are outside (those that escape). And this chaotic behaviour suggests we can't be certain even if we run the feedback iterations many, many times, because a sequence can suddenly escape after a long period of orbiting around.
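
To get a feel for this sensitivity, here is a small Python sketch that runs the same feedback loop from two starting points differing only in the seventh decimal place. The values of $c$ and $z_0$ are arbitrary examples; whether and how quickly the two sequences drift apart depends on the values chosen.

# compare the feedback loop for two very close starting points
# c and the starting values are arbitrary examples

c = complex(-0.8, 0.156)

for z0 in (complex(0.1, 0.1), complex(0.1000001, 0.1)):
    z = z0
    for n in range(40):
        z = z*z + c
        if abs(z) > 1e10:      # stop early if the values blow up
            break
    print(z0, "->", z, "after", n + 1, "iterations")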


Practical Compromise

So we have to compromise - we agree to generate a fixed number of output values, so we can render an approximate pattern. Perhaps a sequence of 50 output values is sufficient? Maybe 100? Maybe even 1000?

By experimenting, it becomes clear that 50 or 100 iterations is usually fine, except when we zoom into a very small area of a fractal, where we need more iterations to separate the inside and outside regions. If we don't use more iterations, the fine details of the fractal don't emerge.
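
A sketch of that compromise in Python, assuming a hypothetical helper inside_after() that simply runs the loop for a fixed budget of iterations. The large magnitude guard is a crude safety net; the next section replaces it with something better.

def inside_after(z0, c, max_iterations):
    # run the feedback loop at most max_iterations times;
    # if the magnitude stays modest, treat the point as "inside"
    z = z0
    for _ in range(max_iterations):
        z = z*z + c
        if abs(z) > 1e10:      # crude guard against runaway values
            return False       # clearly escaping
    return True                # didn't escape within the budget

Points near the edge of the fractal are exactly the ones whose verdict can flip as max_iterations is increased, which is why zoomed-in renders need a bigger budget.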


Computational Shortcut

For many starting points in a fractal pattern, the output values will get larger and larger very quickly. We want to stop calculating the sequence for two reasons:
  • If the numbers get too large, this can cause our code to crash with an error, or worse, continue with incorrect calculations. The root cause of this is that the largest size of number we can store and calculate with is fixed.
  • We don't want to waste time and computational effort continuing to calculate a sequence which is only ever going to get larger and larger. This shortcut can be important because the calculations for detailed fractals can take a long time.

So, many guides state that we can stop calculating the sequence when the magnitude $|z_n|$ gets larger than 2.


That works well, and makes intuitive sense: if $z_n$ has got so large that it is more than 2 away from the origin, the sequence will forever get larger and larger. But the statement is rarely explained.
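
Put together with the iteration budget from the previous section, the usual escape-time loop looks something like the following Python sketch. The name escape_count() is mine, not from any particular guide; the returned count is what's typically used to colour the outside of the fractal.

def escape_count(z0, c, max_iterations=100):
    # iterate z -> z^2 + c, stopping as soon as |z| > 2;
    # returning max_iterations means the point didn't escape
    # within the budget and is treated as "inside"
    z = z0
    for n in range(max_iterations):
        if abs(z) > 2:
            return n
        z = z*z + c
    return max_iterations

For a Mandelbrot image we'd fix $z_0 = 0$ and vary $c$ per pixel; for a Julia image we'd fix $c$ and vary $z_0$ per pixel, in line with the earlier point that the two fractals differ only in how $z_0$ and $c$ are chosen.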

Next is a simple proof - and also a demonstration that the popular escape condition $|z_n| > 2$ is incomplete.


Simple Proof

First let's remind ourselves of the triangle inequality, which basically says that a direct path between two points is never longer than an indirect path.

$$ | a + b | \leq |a| + |b| $$
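
For example, with $a = 3 + 4i$ and $b = 1 - 2i$, we have $|a + b| = |4 + 2i| \approx 4.47$, which is indeed no bigger than $|a| + |b| = 5 + \sqrt{5} \approx 7.24$.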

Let's contrive to artificially expand out $|z^2|$,

$$ |z^2| = |z^2 +c -c| $$

If we use that triangle inequality, with $a = z^2 +c$ and $b = -c$, we have

$$ |z^2 +c -c| \leq |z^2 +c| + |-c|  $$

but because $|-c|$ is just $|c|$ we have

$$ |z^2 +c -c| \leq |z^2 +c| + |c|  $$

Now, remember the next value in the sequence is $z^2 + c$, because that's the function we keep applying. The left-hand side $|z^2 + c - c|$ is just $|z^2|$, so subtracting $|c|$ from both sides and rearranging to put $|z^2 + c|$ on the left gives

$$ |z^2 +c| \geq |z^2| - |c| $$

Also, $|z^2|$ is the same as $|z|^2$ so we have

$$ |z^2 +c| \geq |z|^2 - |c| $$
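
As a quick numerical sanity check of this inequality (not part of the original argument), here is a small Python sketch that tests it for a batch of random complex numbers:

import random

# numerically check |z^2 + c| >= |z|^2 - |c| for random z and c
for _ in range(100000):
    z = complex(random.uniform(-3, 3), random.uniform(-3, 3))
    c = complex(random.uniform(-3, 3), random.uniform(-3, 3))
    assert abs(z*z + c) >= abs(z)**2 - abs(c) - 1e-9   # small tolerance for rounding
print("no counterexamples found")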

Now comes the interesting part. If we say that $|z|$ is bigger than $|c|$, then $-|c| \gt -|z|$, which would mean

$$ |z|^2 - |c| \gt |z|^2 - |z| $$

That means we can also say,

$$ |z^2 +c| \gt |z|^2 - |z| $$

Which can be factorised as

$$ |z^2 +c| \gt |z|(|z| -1) $$

Let's rewrite that previous expression as a ratio of the sizes of the current $z_n$ and the next $z_{n+1} = z_n^2 + c$. Dividing both sides by $|z_n|$, which is positive, gives

$$ \frac {|z_{n+1}|} {|z_n|} \gt |z_n| - 1$$

Now another interesting part. If we say $|z_n|$ is greater than 2, that means $|z_n| - 1 \gt 1$. So we finally have

$$ \frac {|z_{n+1}|} {|z_n|} \gt 1$$

Which says that $|z_{n+1}|$ is always greater than $|z_n|$, as long as:

  • $|z_n| \gt |c|$, and 
  • $|z_n| \gt 2$

So we've shown that two conditions need to be true to prove that the sequence escapes, not just the traditional one. However, in practice, the traditional $|z_n| \gt 2$ on its own seems to work well enough.
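
If we wanted the escape test to reflect both conditions, a sketch in Python might be (the name has_escaped() is mine):

def has_escaped(z, c):
    # escape test using both conditions from the proof above
    return abs(z) > 2 and abs(z) > abs(c)

Note that whenever $|c| \leq 2$, the condition $|z| \gt 2$ already implies $|z| \gt |c|$, which is one reason the traditional test works so well in practice.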


Reference
One text that does try to cover this is Chaos and Fractals: New Frontiers of Science.