Mathematics and the Game of Thrones

Let us pick up where we left off in the last episode. The queen is dead, long live the queen! You will recall that in medieval European universities, theology was thought of as “the queen of the sciences,” but today theology can make no such claim on this academic throne. To do so, I argued, she would need to fully embrace what we call “Big History.” Any kind of literalism about sacred scriptures, for instance, goes out the door. Instead, Theology and her sometimes twin sister Philosophy would understand themselves to be within the Great Matrix as defined by chronologies, scales of size, electromagnetic forces, energy density flows, and thresholds of emergent complexity. Emergent complexity, we noted, comes in different kinds: evolutionary, developmental, functional, and intelligence emergence. To reclaim her throne, Theology would need to embrace all of the sciences and the humanities as the primary “revelation.”

In this game of thrones, mathematics certainly makes a stronger claim to royalty today than does theology or philosophy. Some regard mathematics as the rightful queen who unites all the scientific disciplines under her rules. STEM education — science, technology, engineering, and math — might better be spelled METS, putting mathematics first. Let’s examine these arguments. What is the status of mathematics in Big History? Is mathematics of the Great Matrix, or somehow above it? How does the reign of mathematics in the sciences unite, divide, and sometimes misguide us? These questions have engaged philosophers for thousands of years, ever since mathematical patterns were first discovered in the nature of things. Today, however, the depth, breadth, complexities, and practical utilities of symbolic logics have grown dramatically.

As a curriculum, Big History does not teach much math as such. Students learn about logarithmic scales and measurements, powers of 10, exponential functions, and some statistics. Big History is an invaluable supplement to, but certainly not a substitute for, traditional math and science courses. If the course is properly taught, however, students should gain a deeper appreciation of mathematics across disciplines, an appreciation that need not depend on the ability to understand and actually do the underlying math.
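
To give a flavor of that curriculum, here is a minimal sketch in Python of a Big History staple: placing major thresholds on a logarithmic, powers-of-10 timeline. The dates are rounded approximations used purely for illustration.

```python
# A minimal sketch of a Big History staple: compressing 13.8 billion
# years onto a logarithmic (powers-of-10) timeline. Dates are rounded.
import math

events = {                        # years before the present, approximate
    "Big Bang":              13.8e9,
    "Earth forms":            4.5e9,
    "First life":             3.8e9,
    "Dinosaur extinction":     66e6,
    "Homo sapiens emerges":   300e3,
    "Agriculture begins":      12e3,
}

for name, years_ago in events.items():
    # log10 turns a span of ten billion years into a readable scale
    print(f"{name:<22} ~10^{math.log10(years_ago):.1f} years ago")
```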

Mathematics is ubiquitous in natural phenomena and in our technological creations, and it is essential to global economics and industry. From binary code to complex programming, all information and processes handled by a computer must be represented as mathematically formulated rules and algorithms. Mathematics, in the form of the computer, has driven the most important scientific revolution of recent decades, one that cuts across all academic disciplines. Information technology now makes it possible to collect and analyze enormous datasets, and to create complex models and simulations of natural phenomena. And of course, the Internet makes it possible to share information instantaneously and to collaborate effectively with colleagues around the world. Watching a video, talking on the phone, updating your Facebook page — the content may have nothing to do with mathematics as such, but all of it had to pass through a digital, mathematical universe before it could appear in your universe.
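
To make that claim concrete, here is a hedged sketch of a natural phenomenon reduced to a mathematically formulated rule that a computer can run. The logistic map is a standard toy model of population growth; the parameter values below are assumptions chosen purely for illustration.

```python
# A toy "simulation of a natural phenomenon": population growth
# modeled by the logistic map. The entire model is one arithmetic rule.
r = 3.2   # growth rate (illustrative assumption)
x = 0.5   # initial population, as a fraction of carrying capacity

for year in range(1, 11):
    x = r * x * (1 - x)   # next year's population from this year's
    print(f"year {year:2d}: population fraction = {x:.4f}")
```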

Mathematics is intimately involved in all of our digital devices, but sadly not widely understood or appreciated. E=mc² is as much a cultural icon as 2+2=4, yet we are mostly unaware of the ubiquity of advanced mathematics in our daily lives. Big History is an opportunity to change that.

For Plato, mathematics provided an insight into the true nature of reality. He understood a mathematical abstraction to be more real than any material instantiation. In other words, the idea of 2+2=4 as a formula is more real than the counting of 2 apples plus 2 apples. Platonism profoundly affected the conception of God in the theology and evolution of Judaism, Christianity, and Islam. And while philosophers widely reject Neo-Platonism today, many physicists and mathematicians remain sympathetic. Query a hard-nosed physicist about the status of mathematics and you’re likely to find a softhearted Neo-Platonist.

“How can it be that mathematics,” Albert Einstein wrote, “being after all a product of human thought which is independent of experience, is so admirably appropriate to the objects of reality?” While Einstein rejected the idea of a personal God with a human-like personality, he continued to believe in a Platonic God of mathematics that was somehow immaterial, transcendent, and necessary. “The most incomprehensible thing about the universe,” Einstein wrote, “is that it is comprehensible.”

“The unreasonable effectiveness of mathematics in the natural sciences,” as Eugene Wigner titled his famous paper in 1960, remains something of an epistemological mystery, which then leads us to speculate about the ontological status of mathematics. Is mathematics discovered or invented? If it is discovered, then mathematics is somehow “real, but immaterial.” If it is invented, then mathematics is merely a practical tool that may or may not be useful in a particular context. The history of mathematics is clearly a process of invention and development — but from the inside, it never feels that way. There is a kind of internal necessity in these symbolic logics that compels certain solutions. And these symbolic logics often elegantly and miraculously describe real phenomena and patterns in nature. This is particularly the case in physics, but increasingly so in other disciplines as well.

The 20th-century mathematician Claude Shannon formalized information theory by reducing the smallest and most fundamental unit of information to a “bit” — either a 0 or a 1. Computer programs treat groups of binary digits as units known as “bytes,” typically consisting of 8 bits. With a string of eight 0s and 1s, it is possible to encode 256 discrete entities (the values 0 through 255), such as letters of the alphabet or logical functions. Computers interpret electrical voltages and magnetic polarities as 0s or 1s, but bits can also be represented as holes in punched paper or the positions of mechanical gears. The 19th-century mathematician Charles Babbage designed a mechanical computer that could hypothetically do everything a digital computer can now do (although it was much larger and slower than an electronic device). Next-generation computers may someday be “mechanical” devices built with nanotechnology rather than with semiconductors.
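
A minimal sketch in Python makes Shannon’s arithmetic plain: one 8-bit byte distinguishes 2⁸ = 256 values, and the same bit pattern can live in any physical substrate.

```python
# Encoding text as bits: each character becomes one 8-bit byte,
# which can represent 2**8 = 256 distinct values (0 through 255).
text = "QUEEN"

bits = [format(ord(ch), "08b") for ch in text]
print(bits)   # ['01010001', '01010101', '01000101', '01000101', '01001110']

# The substrate is irrelevant: these 0s and 1s could equally be
# voltages, magnetic polarities, holes in punched tape, or gear teeth.
decoded = "".join(chr(int(b, 2)) for b in bits)
assert decoded == text
```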

Information is understood to be independent of the substrate. Indeed, some physicists have speculated that information is the substrate of the universe itself, that the material world is actually derived from a mathematical world, much as Plato thought. “It from Bit” is how the 20th-century physicist John Wheeler famously summarized this conception. In 1990, he wrote:

‘It from bit’ symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.

While it is true that human language and logic can be represented in symbol systems — symbols and functions that can ultimately be reduced to binary codes — it does not follow that all of nature is therefore digital and mathematical, as Wheeler and others propose. The information-theoretic view is a useful analogy, but digital literalism is potentially a huge category mistake.

In both Europe and the United States, projects to simulate the human brain on computers are under way. No doubt much will be learned in the attempt, but the wetware of the human brain is not the hardware of a computer. And the synaptic connections between neurons are not simply “open” and “closed” binary gates. The analogy breaks down.

There are other limits to mathematics and to the digital representation of nature, limits to computation that are both theoretical and practical. There are, for instance, intractable problems that require hopelessly large amounts of time to solve, even for relatively small inputs. Computer encryption depends on this fact. It may be that the genome, in dynamic relationship with proteins and its environment, is also in some sense “encrypted” by its complexity. And it may be that the mind-brain is similarly “encrypted,” in which case we will never fully understand, let alone reliably control, life and mind, no matter how exponentially our scientific knowledge grows or how fast technological know-how accelerates.
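
A back-of-the-envelope sketch shows why encryption can rely on intractability. The trial rate below is an assumption for illustration, not a benchmark of any real machine.

```python
# Brute-forcing an n-bit key takes up to 2**n trials: linear growth
# in key size means exponential growth in the work required.
TRIALS_PER_SECOND = 10**9          # assume a billion key trials per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for n in (32, 64, 128, 256):
    years = 2**n / TRIALS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{n:3d}-bit key: up to {years:.2e} years")
```

At that assumed rate, a 32-bit key falls in seconds, but a 128-bit key would take on the order of 10²² years to exhaust, which is why brute force is hopeless.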

The computer-age adage “garbage in, garbage out” also reminds us that complex algorithms and calculations can be poorly designed and that data inputs can be useless. Once discovered and invented, mathematics becomes a language that need not refer to anything but itself. Sometimes mathematical reasoning leads us to profound discoveries about the universe, as appears to be the case with the Higgs boson, whose existence was predicted in 1964 and confirmed at CERN in 2012. Multiverse theory and string theory, on the other hand, may be the 21st-century equivalent of counting how many angels fit on the head of a pin. It is not clear that these mathematically deduced theories can ever be empirically tested, though the mathematics is elegant and compelling.

Writing in the pages of The Wall Street Journal, the esteemed biologist E.O. Wilson recently confessed to being mathematically semiliterate. His confession was meant as an encouragement to young people considering careers in science who might lack mathematical aptitude. The message: mathematics is a useful tool in biology, but not the only one, or even the most important. As Wilson observes:

The annals of theoretical biology are clogged with mathematical models that either can be safely ignored or, when tested, fail. Possibly no more than 10% have any lasting value. Only those linked solidly to knowledge of real living systems have much chance of being used.

Learning mathematics is like learning a foreign language. It is best started early in life. And if you don’t keep using it, you will lose it. Once upon a time in high school, I excelled in advanced calculus and dabbled in non-Euclidean geometry. I retain some memories thereof, but no longer the ability to “converse” in these languages, lost somewhere perhaps in the recesses of my brain along with my now rusty Russian and Hebrew. But having learned a modicum of advanced mathematics, I retain a tremendous respect and appreciation for these languages of symbolic logic.

Mathematics may not be “the queen of the sciences,” but she is certainly of royal birth and deportment. Mathematics is both of and above the Great Matrix, as Plato, Einstein, Wheeler, and others intuited. Much of what we have learned about Big History is due to applied mathematics. The Great Acceleration of the 20th century can itself be attributed to METS — math, engineering, technology, and science — to reverse the STEM acronym yet again. We need not know advanced mathematics ourselves to pay homage to mathematics, but if you are young, bright, curious, and ambitious, you will be well rewarded by studying its nature, power, and rules.