It’s no surprise that people who have never taken physics have heard of quantum mechanics (indeed, have heard phrases like “the [or Heisenberg’s] uncertainty principle” and “wave-particle duality”), that people who’ve never written a line of computer code have heard of or even read books about artificial intelligence, and that in general the scientific fields many non-scientists are aware of are not at all representative of those fields, let alone of the sciences. More interesting, at least to me, is the number of people who don’t know what calculus is about, let alone what differential equations are, yet have heard of things like fractals, strange attractors, and other topics from “chaos theory.”

It is much less surprising that many of the popular conceptions about such topics are wrong, right down to the basic nomenclature. Had the popular science author James Gleick written his extremely popular *Chaos: Making a New Science* at a different time and/or chosen different terminology, I think people would be referring to “catastrophe theory”, “dynamical systems”, “nonlinear science”, or some other term scientists have used to describe complex systems/complexity. True, “chaos theory” is much less misleading than “catastrophe theory” would probably be, but it is misleading nonetheless.

Things like chaotic systems and fractals are interesting because they **are not** chaotic, in the sense that they aren’t characterized by randomness and often not even by unpredictability. In fact, fractals are about as far from randomness, chaos, and unpredictability as we can get. Although no universally accepted definition of fractals exists, there are some properties generally agreed to characterize them, one of the most fundamental being self-similarity. However, the relationships between fractals and “chaotic systems” are not straightforward, so I will take “chaos theory” here to refer to chaotic systems like weather.
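Self-similarity is easy to make concrete. Here is a minimal Python sketch using the middle-thirds Cantor set as an assumed example (it is a standard fractal, not one discussed above): each step replaces every interval with two copies, each one third as long, so the parts look just like the whole.

```python
# Build successive approximations of the middle-thirds Cantor set.
# Each step replaces every interval with its two outer thirds,
# producing a self-similar collection of smaller copies of the whole.

def cantor_step(intervals):
    """Replace each interval (a, b) with its two outer thirds."""
    out = []
    for a, b in intervals:
        third = (b - a) / 3
        out.append((a, a + third))       # left third
        out.append((b - third, b))       # right third
    return out

intervals = [(0.0, 1.0)]
for _ in range(4):
    intervals = cantor_step(intervals)

print(len(intervals))                    # 2**4 = 16 pieces
print(sum(b - a for a, b in intervals))  # total length (2/3)**4 of the original
```

After *n* steps there are 2^n intervals of total length (2/3)^n, which shrinks toward zero even though the pieces at every scale look alike.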

If fractals can be at least largely characterized by self-similarity, is there any comparable property that characterizes chaotic systems? Sure: order. That’s right, the single most important property of chaotic systems is non-randomness, regularity, order, structure, and other terms that are almost opposite in meaning to “chaos”. When you think about it a bit, this makes perfect sense. We generally think of chaos as describing things that have no order, that are random, that lack organization. Systems that are random are really, really boring. You can simulate the dynamics of many complex systems very easily by flipping a coin (WARNING! Do NOT try this at home- go to a friend’s house or do it while driving for safety reasons). A (reasonably) fair coin behaves chaotically in the more usual sense: the outcome of each flip is entirely independent of the previous ones, and each outcome is entirely random.
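For the cautious (or those without a friend’s house handy), the coin can of course be flipped in software instead. A minimal Python sketch of such Bernoulli trials, with a fairness probability of 0.5 assumed:

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Simulate 10,000 independent fair-coin flips (Bernoulli trials).
# Each outcome is independent of every other: the history tells you
# nothing, so the long-run frequency is all there is to know.
flips = [random.random() < 0.5 for _ in range(10_000)]

print(sum(flips) / len(flips))  # close to 0.5 by the law of large numbers
```

The entire “dynamics” of this system is captured by one number, the probability of heads, which is exactly what makes it so boring.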

Let’s unpack that a bit. Another way of saying both of the above together is that the only “rule” that determines the system’s outcomes is probabilistic (more specifically, every flip is a Bernoulli trial). However, I broke this statement into two for a reason. There are all kinds of ways in which things can be called “random” (and the word means different things even in probability theory, let alone the sciences). One kind of random process is something called a Markov process with something called a Markov property. Simply put, this describes a system in which the outcome states are basically like coin flips:

1) The system is discrete (the “states”, like coin flips, are like “units” or clearly individual outcomes, as opposed to the state of continuous systems like an arrow shot into the air or a coin dropped from a building, which are constantly changing over time),

2) The system is random (I’m not even going to attempt a satisfactory definition of “random” here)

3) Given the system’s state at a particular time (or step, trial, outcome, whatever), its next state doesn’t depend upon the system’s past state**s**.

The difference between a system that “acts” like coin flips (Bernoulli trials) and those stochastic systems with the Markov property all depends upon adding an “s”: a system with dynamics like those of coin tosses has future states that are independent of any other state, while those with the Markov property have future states that depend on the present state. This distinction is so important and has such extraordinary consequences that it is hard to overstate. For example, the only thing you need to understand any system which acts like our flipped coin is a probability distribution (the Bernoulli distribution, or in more general cases the binomial distribution). For this seemingly innocent system with the Markov property, we need to introduce things like probability vectors, stochastic matrices, difference equations, and possibly more (e.g., finite-state machines). All due to the fact that the next state of the system is conditional ONLY upon the present, rather than independent of present and past states (like the flipped coin) or dependent upon past states and initial conditions (most systems, including “chaotic” ones).

Notice that this surprisingly qualitative change in complexity comes from making a random process not “purely” random. A single condition is added, and we get an entirely different picture. Of course, it is possible to have boring systems that are characterized by order, structure, predictability, etc., too. Computers, for example, are really boring as systems. We built them to do exactly what we want them to, so we compartmentalized their hardware by function, made the electrodynamics of the physics of bits purely an engineering problem, and relied on a formal (mathematical) system/language that was around long before computers to build them (and we had models for computers before computers). Interestingly, even though a computer is about as perfect an exemplar of a deterministic, reductionist system as there is, we can create programs that use deterministic rules to simulate/model systems that behave “chaotically” and unpredictably.
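As an illustration of that last point, here is a minimal Python sketch of a purely deterministic rule whose behavior is nonetheless wildly sensitive to initial conditions. It uses the logistic map, a standard textbook example not discussed above:

```python
# The logistic map x -> r * x * (1 - x) at r = 4 is fully deterministic:
# the same starting value always produces the same trajectory. Yet tiny
# differences in the start are roughly doubled at every step, so nearby
# trajectories quickly become completely unrelated.

def logistic(x, r=4.0, n=50):
    """Iterate the logistic map n times from initial value x."""
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

a = logistic(0.200000)
b = logistic(0.200001)  # a change of one part in a million

# After ~50 iterations the two trajectories have typically decorrelated
# entirely, even though each one was computed by an exact rule.
print(a, b, abs(a - b))
```

Every value stays trapped in the interval [0, 1] (order!), but predicting where in that interval the system will be after many steps requires knowing the initial condition to absurd precision.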

And back we come to “chaotic systems.” Simply put, “chaotic systems” are systems which behave in almost orderly ways. For example, a simple “chaotic system” might behave in a simple, regular way, like cinderblocks piled on a metal support beam, until some critical point at which the system radically shifts (in our example, the beam buckles and everything crashes to the ground). More complicated systems may forever oscillate in some interval or circle around some values, continuously tracing out similar, but not identical, paths. Others can do this and in addition may suddenly switch to a different kind of almost-identically repeating cycle. So what makes these systems interesting, not “chaotic”, and different from boring non-“chaotic” systems? A lot, but simplistically the reasons all relate to how these systems seem to exhibit order, yet the kind of order they exhibit is both extremely sensitive to (initial) conditions and only “almost” ordered/regular. By “almost”, I mean that these systems can 1) actually behave just like “boring” orderly systems for a while until some extremely tiny change results in a completely different state or behavior, and/or 2) exhibit patterns of behavior that, when graphed, seem very structured and ordered because they tend to fluctuate around certain values or in certain intervals, yet actually predicting what the precise value will be at a given time is difficult or impossible.

To wrap up, let me give an example of a “chaotic system” that is interesting only because it should be boring: a swinging pendulum. This is what is called a 1-body problem, because in classical physics its position and momentum at any time *t* are determined using only Newton’s law and knowledge of the system’s mass and acceleration (in 2-body, 3-body, …, n-body problems one is dealing with interacting “parts” or bodies; in the classical example involving the motion of the Moon around the Earth as the Earth moves around the Sun, the interaction is the gravitational attraction between these “bodies”). It’s boring. The thing just starts at some position and swings. It’s hard to even imagine what COULD be “chaotic” about it. Yet if we take Newton’s law F = ma and adapt it to describe a pendulum (namely, accounting for arc length and motion, which gives θ″ + (g/L) sin θ = 0), we wind up with a model that we can’t solve in elementary terms. With additional information or numerical methods we can certainly find answers to the state of particular systems with particular initial conditions at particular times, but even though this simple system reduces to a single equation that (were it solved) could tell us everything we could wish to know about it, we can’t solve that equation.
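Even though the exact pendulum equation resists an elementary closed-form solution, numerical methods handle it easily, as described above. A minimal Python sketch, with assumed values g = 9.81 m/s² and L = 1 m and a simple semi-implicit Euler step, compared against the familiar small-angle approximation:

```python
import math

# Integrate the pendulum equation theta'' = -(g/L) * sin(theta)
# numerically with a semi-implicit Euler step. Parameter values are
# illustrative assumptions, not taken from the text.
g, L = 9.81, 1.0
theta, omega = 0.1, 0.0          # small initial angle (radians), released from rest
dt, steps = 0.001, 2000          # integrate for 2 seconds

for _ in range(steps):
    omega -= (g / L) * math.sin(theta) * dt  # update angular velocity first
    theta += omega * dt                      # then the angle (semi-implicit)

# For small angles the approximate analytic solution is
# theta(t) ~ 0.1 * cos(sqrt(g/L) * t), so the two numbers should agree closely.
t = dt * steps
print(theta, 0.1 * math.cos(math.sqrt(g / L) * t))
```

For this small amplitude the numerical answer tracks the small-angle formula almost exactly; crank up the amplitude (or add damping and driving) and the linearized shortcut fails while the numerical integration keeps working.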

The swinging pendulum can be solved in terms of elliptic functions. It is a completely integrable system and not chaotic. While I don’t like the name “chaos theory,” which smacks of pop-science, your claim that there is neither chaos nor theory is nonsense, because there are both. Claiming that any system of ODEs is boring only shows that you don’t know much about the subject. A very cartoonish, low-quality post.

Not exactly:

Greenspan, D. (2004). *N-body Problems and Models*. World Scientific.

Obviously, some pendulum problems can be solved analytically, all numerically, and some partially analytically (via, e.g., the easy method involving restricting the range of movement). More importantly, my point was not to explain ODEs or PDEs or multivariate mathematics, but some popular conceptions regarding nonlinear/complex systems as they are described in popular science on “chaos theory”, and in particular that the term is something of a misnomer (and why). I find abstract algebras, functional analysis, the calculus of variations, and even difference equations fascinating, not boring. But I like math, and more importantly am familiar with it, while most are not, and will therefore find ODEs as boring as reading Hittite (unless they can read Hittite, in which case as boring as reading R or MATLAB code).

Try to read V.I. Arnold’s works on ODEs and classical mechanics; you may change your mind.

On what? Whether or not there exists an analytical method for constructing an exact solution to strongly damped pendulum motion? Or that ODEs are interesting (which I already believe)? I have hundreds of textbooks in calculus, differential equations, and other elementary mathematics, initially because of a research project in mathematics education but now out of a general interest. I work with differential equations almost on a daily basis. I really don’t think yet another textbook, no matter how classic (and whether you mean his text *Ordinary Differential Equations*, which I own, or his *Lectures on Partial Differential Equations*, which I don’t, or some other text), is going to change my position, and I am quite certain that it isn’t going to suddenly change the state of analysis (or classical mechanics).

Feel free to object to the way in which I have tried to make interesting that which many find boring and to strip it of technicalities (I object to many of my own simplifications). But it seems a bit ridiculous to object to what is absolutely true of our ability to analytically solve exactly the example physical system I gave, and/or to my beliefs about what is or isn’t boring in mathematics, when the truth of the first part is obvious and when I don’t actually believe that ODEs are boring at all but find much more “mundane” topics in mathematics fascinating.

So I looked around to find a good public source rather than just one I could quote but that you might not own, and one that isn’t too technical (it may be that more technical would be fine for you, but as I don’t know I decided to play it safe). I decided to go with “The Chaotic Pendulum” (http://farside.ph.utexas.edu/teaching/329/lectures/node46.html). See, e.g., in the intro to this lecture: “Up to now, we have mostly dealt with problems which are capable of analytic solution (so that we can easily validate our numerical solutions). Let us now investigate a problem which is quite intractable analytically, and in which meaningful progress can only be made via numerical means”, and the pendulum examples which follow.

On ODEs being boring.

Try his “Geometrical methods in ODE.”

Most of the calculus texts are pure trash that should be burned. But at least you are thinking about making the subject more accessible. I thought about it too. Have you read my paper at http://www.mathfoolery.com/Article/simpcalc-v1.pdf yet? You may like it.

I do like it, and thanks for the link!