Stephen Wolfram: A New Kind of Science
Overview
Wolfram’s book, A New Kind of Science, is chiefly concerned with the implications of his deep study of cellular automata, originally triggered by his MacArthur grant. The main finding of this research seems to be that very simple rules can give rise to computational complexity and remarkably rich behaviour.
The notion that simple rules can lead to powerful results is not itself new; science and mathematics have long striven to find simple and elegant ways of describing the laws and theories governing nature. What is distinctive about Wolfram’s pursuit is its focus on computation, and on relating the simplicity of cellular automata to emergent natural phenomena. Wolfram intends for CAs to ground a new type of science and mathematics, one that uses the simulative power of CAs to make useful claims about nature.
The visual appeal of the automata is certainly compelling, but there is an equally disconcerting lack of mathematical reasoning backing up his arguments. Unfortunately, he also does not indicate what mathematical justification might look like, choosing instead to demonstrate problems through visual analogy.
Wolfram’s use of CAs is also an exemplary demonstration of Baudrillard’s simulation: viewed through the lens of cellular automata, everything seems to become one and the same thing. CAs become the universal tool that may be used to represent and recreate everything.
Notes
The Notion of Computation
The idea of computational universality takes on great significance here. A system is universal if it can compute anything that is computable; the universal Turing machine is the fundamental example. A consequence of universality is that any universal system can simulate any other. One outcome of Wolfram’s research has been to show that certain classes of CAs are universal and may therefore be used as general computing machines.
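To make the simplicity of these systems concrete, the following is a minimal sketch (the function names and layout are my own, not Wolfram’s) of a one-dimensional elementary CA of the kind he studies; rule 110 is one of the rules the book discusses as universal:

    # Minimal sketch of a one-dimensional elementary cellular automaton.
    # The rule number's binary digits give the new cell value for each of
    # the eight possible (left, center, right) neighbourhoods.
    def step(cells, rule=110):
        """Apply one synchronous update to a row of 0/1 cells (fixed 0 boundaries)."""
        padded = [0] + cells + [0]
        return [(rule >> ((padded[i - 1] << 2) | (padded[i] << 1) | padded[i + 1])) & 1
                for i in range(1, len(padded) - 1)]

    def run(width=64, steps=32, rule=110):
        """Start from a single live cell and print each generation."""
        cells = [0] * width
        cells[width // 2] = 1
        for _ in range(steps):
            print("".join("#" if c else "." for c in cells))
            cells = step(cells, rule)

    if __name__ == "__main__":
        run()

Even from a single live cell, these few lines already produce the intricate, non-repeating textures that motivate much of the book.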
Wolfram has additionally discovered a number of CAs which are reversible; that is, their inputs may be determined uniquely from their outputs. Computationally this is an interesting class of functions, but it also touches on issues of information and disorder that are important in signal systems and in thermodynamics.
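One standard construction for reversibility (not necessarily the one Wolfram uses) is the “second-order” trick: the next row is the ordinary rule applied to the current row, XORed with the row before it, so that any two consecutive rows determine both the next row and the previous one. A sketch, reusing step() from the fragment above:

    # Second-order reversible CA: next = rule(current) XOR previous.
    def step_reversible(prev, curr, rule=110):
        """Return the pair (curr, next) given the pair (prev, curr)."""
        nxt = [a ^ b for a, b in zip(step(curr, rule), prev)]
        return curr, nxt

    def step_backward(curr, nxt, rule=110):
        """Recover (prev, curr) from (curr, next), undoing one step exactly."""
        prev = [a ^ b for a, b in zip(step(curr, rule), nxt)]
        return prev, curr

    prev = [0, 1, 0, 0, 1, 1, 0, 1]
    curr = [1, 0, 0, 1, 0, 1, 1, 0]
    assert step_backward(*step_reversible(prev, curr)) == (prev, curr)

Because the XOR can always be undone, no information about earlier rows is lost, which is exactly the property that connects such rules to questions of disorder and thermodynamics.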
The Principle of Computational Equivalence
Wolfram’s thesis is essentially this: “all processes, whether they are produced by human effort, or occur spontaneously in nature, can be viewed as computations.”
Wolfram extends this idea to the point of declaring it a new law of nature: “The Principle of Computational Equivalence introduces a new law of nature to the effect that no system can ever carry out explicit computations that are more sophisticated than those carried out by systems like cellular automata and Turing machines.” Thus, when nature is viewed as a system of computations, this principle becomes highly relevant.
An issue with representing things as computations is that it disregards the fact that not everything requires brute computation; instead, certain things may be proven rather than computed. This distinction is tricky and important. Some facts may be proven only with difficulty, and others may be much more easily computed than proven. In general, though, there is a difference between that which is computed and that which is proven, and the advantage of the latter is eliminating the need for the former. Wolfram’s argument hinges on the notion of raw computation, which may pale in the face of abstract proof. One may set out to compute that there are infinitely many primes congruent to 3 mod 4, an exercise that never terminates, or one may instead prove this in a short, finite number of steps.
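For contrast, here is the classical finite argument, sketched in a few lines of LaTeX; it settles in a fixed number of steps what enumeration never can:

    % Claim: there are infinitely many primes $p \equiv 3 \pmod{4}$.
    Suppose $p_1, \dots, p_k$ were all such primes, and set
    \[
      N = 4\,p_1 p_2 \cdots p_k - 1 .
    \]
    Then $N \equiv 3 \pmod{4}$. A product of primes all congruent to $1 \pmod{4}$
    is itself $\equiv 1 \pmod{4}$, so $N$ must have a prime factor
    $q \equiv 3 \pmod{4}$. But no $p_i$ divides $N$, so $q$ is a prime
    $\equiv 3 \pmod{4}$ missing from the list, a contradiction.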
This point is important, but flawed. Later on, Wolfram examines the rules and theorems of mathematics and uses their countable, enumerable nature to represent them as computable. In this view, theorem proving is a process of computation rather than some mysterious or magical intellectual exercise. This fact has been exploited before, notably by Kurt Gödel in proving the incompleteness of mathematics. Proofs are therefore indeed finite and computable objects, but brute computation is still not a good way of approaching them.
Computation is still computation and must obey the law that computations may not be “outdone”: it is not possible to out-compute something in the same number of logical steps. Proof and idealized computation, on the other hand, differ from raw computation in that they might be more efficient and save time (or computational steps). The essence of these, the way proofs are actually found and checked in practice, is not brute computation but “intuition” and experience. The two may seem magical in the abstract, but they echo more strongly the ideas of pattern matching. Instead of applying rules by brute force, pattern matching relies on analogy, recognition, and the application of known rules to new inputs. This approach is still computable, but not easily by CAs.
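As a toy illustration of the difference (the expression encoding and rules here are entirely my own), the following sketch applies known rewrite patterns to new expressions by structural recognition rather than by blind search over all possible rule applications:

    # Toy rewriting by pattern matching: expressions are nested tuples,
    # single-letter lowercase strings act as pattern variables, and each
    # known rule is applied wherever its pattern is recognised.
    RULES = [
        (("add", "x", 0), "x"),   # x + 0 -> x
        (("mul", "x", 1), "x"),   # x * 1 -> x
        (("mul", "x", 0), 0),     # x * 0 -> 0
    ]

    def match(pattern, expr, bindings):
        """Bind pattern variables to parts of expr; literals must agree exactly."""
        if isinstance(pattern, str) and len(pattern) == 1:
            bindings[pattern] = expr
            return True
        if isinstance(pattern, tuple) and isinstance(expr, tuple):
            return (len(pattern) == len(expr) and pattern[0] == expr[0]
                    and all(match(p, e, bindings)
                            for p, e in zip(pattern[1:], expr[1:])))
        return pattern == expr

    def simplify(expr):
        """Simplify subterms bottom-up, then apply the first matching rule."""
        if isinstance(expr, tuple):
            expr = (expr[0],) + tuple(simplify(e) for e in expr[1:])
        for pattern, result in RULES:
            bindings = {}
            if match(pattern, expr, bindings):
                return bindings.get(result, result)
        return expr

    print(simplify(("add", ("mul", "y", 1), 0)))   # prints y

The point is only that the rules are recognised and applied where their structure fits, rather than discovered through exhaustive application of every rule at every step.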
Author/Editor | Wolfram, Stephen |
Title | A New Kind of Science |
Type | book |
Context | |
Tags | dms, simulation, emergence |
Lookup | Google Scholar, Google Books, Amazon |