
# Who Can Name the Bigger Number? (1999)


The contest’s results are never quite what I’d hope. Once, a seventh-grade boy filled his card with a string of successive 9’s. Like many other big-number tyros, he sought to maximize his number by stuffing a 9 into every place value. Had he chosen easy-to-write 1’s rather than curvaceous 9’s, his number could have been millions of times bigger. He still would have been decimated, though, by the girl he was up against, who wrote a string of 9’s followed by the superscript 999. Aha! An exponential: a number multiplied by itself 999 times. Noticing this innovation, I declared the girl’s victory without bothering to count the 9’s on the cards.

And yet the girl’s number could have been much bigger still, had she stacked the mighty exponential more than once. Take $9^{9^9}$, for example. This behemoth, equal to $9^{387{,}420{,}489}$, has 369,693,100 digits. By comparison, the number of elementary particles in the observable universe has a meager 85 digits, give or take. Three 9’s, when stacked exponentially, already lift us incomprehensibly beyond all the matter we can observe — by a factor of about $10^{369{,}693{,}015}$. And we’ve said nothing of $9^{9^{9^9}}$ or $9^{9^{9^{9^9}}}$.
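For readers who want to check digit counts like these, here is a quick Python sketch (my own illustration, not part of the essay): it sizes $9^{9^9}$ by logarithms instead of computing the number outright, since the full integer would have hundreds of millions of digits.

```python
import math

# Number of decimal digits of n is floor(log10(n)) + 1, and
# log10(9**(9**9)) = 9**9 * log10(9), so we never build the giant number.
exponent = 9**9                                   # 387,420,489
digits = math.floor(exponent * math.log10(9)) + 1
print(exponent)   # 387420489
print(digits)     # 369693100
```

The same trick sizes any stacked exponential whose top exponent still fits in a machine word; one level higher, even the logarithm becomes astronomical.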

Place value, exponentials, stacked exponentials: each can express boundlessly big numbers, and in this sense they’re all equivalent. But the notational systems differ dramatically in the numbers they can express concisely. It takes about the same amount of ink to write 999, $9^9$, and $9^{9^9}$ – yet the first number is quotidian, the second astronomical, and the third hyper-mega astronomical. The key to the biggest number contest is not swift penmanship, but rather a potent paradigm for concisely capturing the gargantuan.

And herein lies a parallel with another mathematical story. In his remarkable and underappreciated book A History of π, Petr Beckmann argues that the ratio of circumference to diameter is “a quaint little mirror of the history of man.” In the rare societies where science and reason found refuge — the early Athens of Anaxagoras and Hippias, the Alexandria of Eratosthenes and Euclid, the seventeenth-century England of Newton and Wallis — mathematicians made tremendous strides in calculating π. In Rome and medieval Europe, by contrast, knowledge of π stagnated. Crude approximations such as the Babylonians’ 25/8 held sway.

This same pattern holds, I think, for big numbers. Curiosity and openness lead to fascination with big numbers, and to the buoyant view that no quantity, whether of the number of stars in the galaxy or the number of possible bridge hands, is too immense for the mind to enumerate. Conversely, ignorance and irrationality lead to fatalism concerning big numbers. Historian Ilan Vardi cites the ancient Greek term for sand-hundred, colloquially meaning zillion; as well as a passage from Pindar’s Olympic Ode II asserting that “sand escapes counting.”

But sand doesn’t escape counting, as Archimedes recognized in the third century B.C. Here’s how he began The Sand-Reckoner, a sort of pop-science article addressed to the King of Syracuse: There are some … who think that the number of the sand is infinite in multitude … again there are some who, without regarding it as infinite, yet think that no number has been named which is great enough to exceed its multitude … But I will try to show you [numbers that] exceed not only the number of the mass of sand equal in magnitude to the earth … but also that of a mass equal in magnitude to the universe.

This Archimedes proceeded to do, essentially by using the ancient Greek word for myriad, meaning ten thousand, as a base for exponentials. Adopting a prescient cosmological model of Aristarchus, in which the “sphere of the fixed stars” is vastly greater than the sphere in which the Earth revolves around the sun, Archimedes obtained an upper bound of $10^{63}$ on the number of sand grains needed to fill the universe. (Supposedly $10^{63}$ is the biggest number with a lexicographically standard American name: vigintillion. But the staid vigintillion had better keep vigil lest it be encroached upon by the more whimsically-named googol, or $10^{100}$, and googolplex, or $10^{10^{100}}$.) Vast though it was, of course, $10^{63}$ wasn’t to be enshrined as the all-time biggest number. Six centuries later, Diophantus developed a simpler notation for exponentials, allowing him to surpass it. Then, in the Middle Ages, the rise of Arabic numerals and place value made it easy to stack exponentials higher still. But Archimedes’ paradigm for expressing big numbers wasn’t fundamentally surpassed until the twentieth century. And even today, exponentials dominate popular discussion of the immense.

Consider, for example, the oft-repeated legend of the Grand Vizier in Persia who invented chess. The King, so the legend goes, was delighted with the new game, and invited the Vizier to name his own reward. The Vizier replied that, being a modest man, he desired only one grain of wheat on the first square of a chessboard, two grains on the second, four on the third, and so on, with twice as many grains on each square as on the last. The innumerate King agreed, not realizing that the total number of grains on all 64 squares would be $2^{64}-1$, or about 18.4 quintillion — equivalent to the world’s present wheat production for 185 years.
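The Vizier’s total is easy to verify directly (a sketch of my own, not in the original essay): sum the grains square by square and compare against the closed form $2^{64}-1$.

```python
# Square k of the chessboard (counting from 1) holds 2**(k-1) grains,
# so the grand total over 64 squares is a geometric sum equal to 2**64 - 1.
total = sum(2**(k - 1) for k in range(1, 65))
print(total)                # 18446744073709551615
print(total == 2**64 - 1)   # True
```

Python’s arbitrary-precision integers make the exact 20-digit answer trivial; the King, alas, had no such luxury.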

Fittingly, this same exponential growth is what makes chess itself so difficult. There are only about 35 legal choices for each chess move, but the choices multiply exponentially to yield something like $10^{50}$ possible board positions — too many for even a computer to search exhaustively. That’s why it took until 1997 for a computer, Deep Blue, to defeat the human world chess champion. And in Go, which has a 19-by-19 board and over $10^{160}$ possible positions, even an amateur human can still rout the world’s top-ranked computer programs.

Exponential growth plagues computers in other guises as well. The traveling salesman problem asks for the shortest route connecting a set of cities, given the distances between each pair of cities. The rub is that the number of possible routes grows exponentially with the number of cities. When there are, say, a hundred cities, there are about $10^{155}$ possible routes, and, although various shortcuts are possible, no known computer algorithm is fundamentally better than checking each route one by one. The traveling salesman problem belongs to a class called NP-complete, which includes hundreds of other problems of practical interest. (NP stands for the technical term ‘Nondeterministic Polynomial-Time.’) It’s known that if there’s an efficient algorithm for any NP-complete problem, then there are efficient algorithms for all of them. Here ‘efficient’ means using an amount of time proportional to at most the problem size raised to some fixed power — for example, the number of cities cubed. It’s conjectured, however, that no efficient algorithm for NP-complete problems exists. Proving this conjecture, called P ≠ NP, has been a great unsolved problem of computer science for thirty years.
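The brute-force approach is worth seeing in code, if only to feel the factorial wall. Here is a sketch of my own (the four-city distance matrix is made up): it tries every tour for a handful of cities, then counts how many distinct tours a hundred cities would demand.

```python
import math
from itertools import permutations

def shortest_tour(dist):
    """Brute-force TSP: fix city 0 as the start and try every ordering
    of the remaining cities, returning the length of the best round trip."""
    n = len(dist)
    best = math.inf
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        best = min(best, length)
    return best

# Toy symmetric distances for 4 cities (illustrative numbers only).
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(shortest_tour(dist))                  # 18

# For 100 cities there are 99!/2 distinct tours; print its digit count.
print(len(str(math.factorial(99) // 2)))    # 156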

Although computers will probably never solve NP-complete problems efficiently, there’s more hope for another grail of computer science: replicating human intelligence. The human brain has roughly a hundred billion neurons linked by a hundred trillion synapses. And though the function of an individual neuron is only partially understood, it’s thought that each neuron fires electrical impulses according to relatively simple rules up to a thousand times each second. So what we have is a highly interconnected computer capable of maybe $10^{17}$ operations per second; by comparison, the world’s fastest parallel supercomputer, the 9,200-Pentium Pro teraflops machine at Sandia National Labs, can perform $10^{12}$ operations per second. Contrary to popular belief, gray mush is not outclassed by silicon: it surpasses it even in raw computational power. But this is unlikely to remain true for long. The reason is Moore’s Law, which, in its 1990s formulation, states that the amount of information storable on a silicon chip grows exponentially, doubling roughly once every two years. Moore’s Law will eventually play out, as microchip components reach the atomic scale and conventional lithography falters. But radical new technologies, such as optical computers, DNA computers, or even quantum computers, could conceivably usurp silicon’s place. Exponential growth in computing power can’t continue forever, but it may continue long enough for computers — at least in processing power — to surpass human brains.
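The $10^{17}$ figure is just the essay’s two estimates multiplied together, as this one-liner of a sketch shows (my illustration; the inputs are the rough counts quoted above):

```python
# Back-of-the-envelope brain capacity: ~10^14 synapses, each involved in
# activity up to ~10^3 times per second.
synapses = 1e14
firings_per_second = 1e3
ops_per_second = synapses * firings_per_second
print(f"{ops_per_second:.0e}")  # 1e+17
```

A five-order-of-magnitude gap over a teraflops machine, from two numbers and one multiplication.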

To prognosticators of artificial intelligence, Moore’s Law is a glorious herald of exponential growth. But exponentials have a drearier side as well. The human population recently passed six billion and is doubling about once every forty years. At this exponential rate, if an average person weighs seventy kilograms, then by the year 3750 the entire Earth will be composed of human flesh. But before you invest in deodorant, realize that the population will stop increasing long before this — either because of famine, epidemic disease, global warming, mass species extinctions, unbreathable air, or, entering the speculative realm, birth control. It’s not hard to fathom why physicist Albert Bartlett asserted “the greatest shortcoming of the human race” to be “our inability to understand the exponential function.” Or why Carl Sagan advised us to “never underestimate an exponential.” In his book Billions & Billions, Sagan gave some other depressing consequences of exponential growth. At an inflation rate of five percent a year, a dollar is worth only thirty-seven cents after twenty years. If a uranium nucleus emits two neutrons, both of which collide with other uranium nuclei, causing them to emit two neutrons, and so forth — well, did I mention nuclear holocaust as a possible end to population growth?
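Two of these exponential claims are easy to check numerically. Here is a sketch of my own; the population, body-weight, and Earth-mass figures are rough assumptions, not data from the essay’s sources.

```python
import math

# Claim 1: at a 40-year doubling time, when does total human mass reach
# the Earth's mass?  Rough figures: 6e9 people, 70 kg each,
# Earth mass ~5.97e24 kg, counting from the year 2000.
doublings = math.log2((5.97e24 / 70) / 6e9)
year = 2000 + doublings * 40
print(round(year))          # roughly the year 3750

# Claim 2: a dollar after twenty years of 5% annual inflation.
cents = int(100 / 1.05**20)
print(cents)                # 37
```

About 44 doublings suffice to turn six billion people into a planet’s worth of flesh; that is the whole menace of the exponential.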

If each operation were a candy flavor, then the Ackermann sequence would be the sampler pack, mixing one number of each flavor. First in the sequence is 1 + 1, or (don’t hold your breath) 2. Second is 2 × 2, or 4. Third is 3 raised to the 3rd power, or 27. Hey, these numbers aren’t so big!

Fee. Fi. Fo. Fum.

Fourth is 4 tetrated to the 4, or $4^{4^{4^4}}$, which has roughly $10^{154}$ digits. If you’re planning to write this number out, better start now. Fifth is 5 pentated to the 5: a power tower of 5’s with ‘5 tetrated to the 4’ 5’s in the stack. This number is too colossal to describe in any ordinary terms. And the numbers just get bigger from there.
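The whole candy-flavor ladder can be written as one recursive function. Here is a sketch of my own (the name `hyper` and the exact recursion are illustrative; several inequivalent definitions of the Ackermann sequence exist, as the essay notes below):

```python
def hyper(level, a, b):
    """Hyperoperation ladder: level 1 is addition, 2 is multiplication,
    3 is exponentiation, 4 is tetration, and so on.  Each level is the
    previous one iterated b times."""
    if level == 1:
        return a + b
    if b == 1:
        return a
    return hyper(level - 1, a, hyper(level, a, b - 1))

# The diagonal hyper(n, n, n) is one common form of the Ackermann sequence.
print(hyper(1, 1, 1))  # 2   (1 + 1)
print(hyper(2, 2, 2))  # 4   (2 * 2)
print(hyper(3, 3, 3))  # 27  (3 to the 3rd power)
# hyper(4, 4, 4) would be 4 tetrated to the 4 -- don't try to print it.
```

The code is four lines; the fourth diagonal value already outruns the universe. That mismatch between program size and output size is exactly the essay’s theme.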

Wielding the Ackermann sequence, we can clobber unschooled opponents in the biggest-number contest. But we need to be careful, since there are several definitions of the Ackermann sequence, not all identical. Under the fifteen-second time limit, here’s what I might write to avoid ambiguity:

A(150) — Ackermann seq — A(1)=1+1, A(2)=2×2, A(3)=3^3, etc.

Recondite as it seems, the Ackermann sequence does have some applications. A problem in an area called Ramsey theory asks for the minimum dimension of a hypercube satisfying a certain property. The true dimension is thought to be 6, but the lowest dimension anyone’s been able to prove is so huge that it can only be expressed using the same ‘weird arithmetic’ that underlies the Ackermann sequence. Indeed, the Guinness Book of World Records once listed this dimension as the biggest number ever used in a mathematical proof. (Another contender for the title once was Skewes’ number, about $10^{10^{10^{34}}}$, which arises in the study of how prime numbers are distributed. The famous mathematician G. H. Hardy quipped that Skewes’ was “the largest number which has ever served any definite purpose in mathematics.”) What’s more, Ackermann’s briskly-rising cavalcade performs an occasional cameo in computer science. For example, in the analysis of a data structure called ‘Union-Find,’ a term gets multiplied by the inverse of the Ackermann sequence — meaning, for each whole number X, the first number N such that the Nth Ackermann number is bigger than X. The inverse grows as slowly as Ackermann’s original sequence grows quickly; for all practical purposes, the inverse is at most 4.
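For the curious, the Union-Find structure itself fits in a few lines. This is a standard textbook version, sketched by me rather than taken from the analysis the essay cites:

```python
class UnionFind:
    """Disjoint sets with path compression and union by rank.  A sequence
    of M operations runs in roughly O(M * alpha(N)) time, where alpha is
    the inverse-Ackermann function -- at most 4 for any practical N."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx                            # union by rank
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

uf = UnionFind(10)
uf.union(0, 1)
uf.union(1, 2)
print(uf.find(2) == uf.find(0))  # True
print(uf.find(3) == uf.find(0))  # False
```

So a function too explosive to evaluate governs, through its inverse, one of the most practical data structures in computing.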

Ackermann numbers are pretty big, but they’re not yet big enough. The quest for still bigger numbers takes us back to the formalists. After Ackermann demonstrated that ‘primitive recursive’ isn’t what we mean by ‘computable,’ the question still stood: what *do* we mean by ‘computable’? In 1936, Alonzo Church and Alan Turing independently answered this question. While Church answered using a logical formalism called the lambda calculus, Turing answered using an idealized computing machine — the Turing machine — that, in essence, is equivalent to every Compaq, Dell, Macintosh, and Cray in the modern world. Turing’s paper describing his machine, “On Computable Numbers,” is rightly celebrated as the founding document of computer science.

Turing’s august insight was that we can program the tape head to carry out *any* computation. Turing machines can add, multiply, extract cube roots, sort, search, spell-check, parse, play Tic-Tac-Toe, list the Ackermann sequence. If we represented keyboard input, monitor output, and so forth as symbols on the tape, we could even run Windows on a Turing machine. But there’s a problem. Set a tape head loose on a sequence of symbols, and it might stop eventually, or it might run forever — like the fabled programmer who gets stuck in the shower because the instructions on the shampoo bottle read “lather, rinse, repeat.” If the machine’s going to run forever, it’d be nice to know this in advance, so that we don’t spend an eternity waiting for it to finish. But how can we determine, in a finite amount of time, whether something will go on endlessly? If you bet a friend that your watch will never stop ticking, when could you declare victory? But maybe there’s some ingenious program that can examine other programs and tell us, infallibly, whether they’ll ever stop running. We just haven’t thought of it yet.
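A Turing machine is simple enough to simulate in a dozen lines. Here is a sketch of my own: a simulator plus one concrete 2-state machine (the classic two-state busy beaver, which halts after 6 steps). The rule format and state names are my conventions, not Turing’s.

```python
def run_turing_machine(rules, max_steps):
    """Simulate a one-tape Turing machine started on a blank (all-0) tape.
    rules maps (state, symbol) -> (new_symbol, move, new_state); 'H' halts.
    Returns the step count at halting, or None if still running."""
    tape = {}
    state, pos = 'A', 0
    for step in range(1, max_steps + 1):
        symbol = tape.get(pos, 0)
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol
        pos += 1 if move == 'R' else -1
        if state == 'H':
            return step
    return None  # didn't halt within max_steps -- and maybe never will

# The 2-state busy beaver: writes four 1's and halts on its 6th step.
rules = {
    ('A', 0): (1, 'R', 'B'), ('A', 1): (1, 'L', 'B'),
    ('B', 0): (1, 'L', 'A'), ('B', 1): (1, 'R', 'H'),
}
print(run_turing_machine(rules, 100))  # 6
```

Notice the simulator’s weakness: when it returns None, we learn nothing. Deciding whether None means “slow” or “forever” is precisely the Halting Problem.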

“Very nice,” you say (or perhaps you say, “not nice at all”). But what does all this have to do with big numbers? Aha! The connection wasn’t published until May of 1962. Then, in the Bell System Technical Journal, nestled between pragmatically-minded papers on “Multiport Structures” and “Waveguide Pressure Seals,” appeared the modestly titled “On Non-Computable Functions” by Tibor Rado. In this paper, Rado introduced the biggest numbers anyone had ever imagined.

His idea was simple. Just as we can classify words by how many letters they contain, we can classify Turing machines by how many rules they have in the tape head. Some machines have only one rule, others have two rules, still others have three rules, and so on. But for each fixed whole number N, just as there are only finitely many distinct words with N letters, so too are there only finitely many distinct machines with N rules. Among these machines, some halt and others run forever when started on a blank tape. Of the ones that halt, asked Rado, what’s the maximum number of steps that any machine takes before it halts?

Rado called this maximum the Nth “Busy Beaver” number. (Ah yes, the early 1960s were a more innocent age.) He visualized each Turing machine as a beaver bustling busily along the tape, writing and erasing symbols. The challenge, then, is to find the busiest beaver with exactly N rules, albeit not an infinitely busy one. We can interpret this challenge as one of finding the “most complicated” computer program N bits long: the one that does the most amount of stuff, but not an infinite amount.

Now, suppose we knew the Nth Busy Beaver number, which we’ll call BB(N). Then we could decide whether any Turing machine with N rules halts on a blank tape. We’d just have to run the machine: if it halts, fine; but if it doesn’t halt within BB(N) steps, then we know it never will halt, since BB(N) is the maximum number of steps it could make before halting. Similarly, if you knew that all mortals died before age 265, then if Sally lived to be 265, you could conclude that Sally was immortal. So no Turing machine can list the Busy Beaver numbers — for if it could, it could solve the Halting Problem, which we already know is impossible.

But here’s a curious fact. Suppose we could name a number *greater* than the Nth Busy Beaver number BB(N). Call this number D for dam, since like a beaver dam, it’s a roof for the Busy Beaver below. With D in hand, computing BB(N) itself becomes easy: we just need to simulate all the Turing machines with N rules. The ones that haven’t halted within D steps — the ones that bash through the dam’s roof — never will halt. So we can list exactly which machines halt, and among these, the maximum number of steps that any machine takes before it halts is BB(N).
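The dam argument is short enough to express as code. In this sketch of mine the “machines” are toy stand-ins (each is just its halting time, with None for a machine that runs forever), since enumerating real N-rule Turing machines would bury the idea in bookkeeping:

```python
def busy_beaver_given_bound(machines, simulate, D):
    """Given an upper bound D > BB(N), recover BB(N) itself: run every
    N-rule machine for D steps; any machine still going will never halt,
    so the longest halting run among the rest is exactly BB(N).
    simulate(m, D) returns m's halting step count, or None past D steps."""
    halting_times = [t for m in machines
                     if (t := simulate(m, D)) is not None]
    return max(halting_times)

# Toy stand-ins: three halting "machines" and two that run forever.
machines = [3, None, 6, 1, None]
simulate = lambda m, D: m if m is not None and m <= D else None
print(busy_beaver_given_bound(machines, simulate, D=100))  # 6
```

The catch, of course, is the premise: naming any D above BB(N) is itself as hard as the Halting Problem, which is why the function is uncomputable.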

Conclusion? The sequence of Busy Beaver numbers, BB(1), BB(2), and so on, grows faster than any computable sequence: faster than exponentials, stacked exponentials, even the Ackermann sequence.

In 1984, A. K. Dewdney devoted a Scientific American column to Busy Beavers, which inspired amateur mathematician George Uhing to build a special-purpose device for simulating Turing machines. The device, which cost Uhing less than $100, found a five-rule machine that runs for over two million steps before halting — establishing that BB(5) must be at least as high. Then, in 1990, Heiner Marxen and Jürgen Buntrock discovered that BB(5) is at least 47,176,870. To this day, BB(5) hasn’t been pinned down precisely, and it could turn out to be much higher still. As for BB(6), Marxen and Buntrock set another record by proving that it’s at least 8,690,333,381,690,951. A formidable accomplishment, yet Marxen, Buntrock, and the other Busy Beaver hunters are merely wading along the shores of the unknowable. Humanity may never know the value of BB(6) for certain, let alone that of BB(7) or any higher number in the sequence.

Suppose we could endow a Turing machine with a magical ability to solve the Halting Problem. What would we get? We’d get a ‘super Turing machine’: one with abilities beyond those of any ordinary machine. But now, how hard is it to decide whether a *super* machine halts? Hmm. It turns out that not even super machines can solve this ‘super Halting Problem,’ for the same reason that ordinary machines can’t solve the ordinary Halting Problem. To solve the Halting Problem for super machines, we’d need an even *more* powerful machine: a ‘super duper machine.’ And to solve the Halting Problem for super duper machines, we’d need a ‘super duper pooper machine.’ And so on endlessly. This infinite hierarchy of ever more powerful machines was formalized by the logician Stephen Kleene in 1943 (although he did not use the term ‘super duper pooper’).

Imagine a novel, which is embedded in a longer novel, which itself is embedded in an even longer novel, and so on ad infinitum. Within each novel, the characters can debate the literary merits of any of the sub-novels. But, by analogy with classes of machines that can’t analyze themselves, the characters can never critique the novel that they themselves are in.

And there’s no escape. Suppose a Turing machine had a magical ability to solve the Halting Problem, *and* the super Halting Problem, *and* the super duper Halting Problem, *and* the super duper pooper Halting Problem, and so on endlessly. Surely this would be the Queen of Turing machines? Not quite. As soon as we want to decide whether a ‘Queen of Turing machines’ halts, we need a still more powerful machine: an ‘Empress of Turing machines.’ And Kleene’s hierarchy continues.


So here we are, at the frontier of big number knowledge. As Euclid’s disciple supposedly asked, “What is the use of all this?” We’ve seen that progress in notational systems for big numbers mirrors progress in broader realms: mathematics, logic, computer science. And yet, though a mirror reflects reality, it doesn’t necessarily influence it. Even within mathematics, big numbers are often considered trivialities, their study an idle amusement with no broader implications. I want to argue a contrary view: that understanding big numbers is a key to understanding the world.

How do you respond? Archimedes has never heard of computers, those cantankerous devices that, twenty-three centuries from his time, will transact the world’s affairs. So you can’t claim practical application. Nor can you appeal to Hilbert and the formalist program, since Archimedes hasn’t heard of those either. But then it hits you: the Busy Beaver sequence. You define the sequence for Archimedes, convince him that BB(1000) is more than his $10^{63}$ grains of sand filling the universe, more even than $10^{63}$ raised to its own power $10^{63}$ times. You defy him to name a bigger number without invoking Turing machines or some equivalent. And as he ponders this challenge, the power of the Turing machine concept dawns on him. Though his intuition may never apprehend the Busy Beaver numbers, his reason compels him to acknowledge their immensity. Big numbers have a way of imbuing abstract notions with reality.

Indeed, one could define science as reason’s attempt to compensate for our inability to perceive big numbers. If we could run at 280,000,000 meters per second, there’d be no need for a special theory of relativity: it’d be clear to everyone that the faster we go, the heavier and squatter we get, and the faster time elapses in the rest of the world. If we could live for 70,000,000 years, there’d be no theory of evolution, and *certainly* no creationism: we could watch speciation and adaptation with our eyes, instead of painstakingly reconstructing events from fossils and DNA. If we could bake bread at 27,000,000 degrees Kelvin, nuclear fusion would be not the esoteric domain of physicists but ordinary household knowledge. But we can’t do any of these things, and so we have science, to deduce about the gargantuan what we, with our infinitesimal faculties, will never sense. If people fear big numbers, is it any wonder that they fear science as well and turn for solace to the comforting smallness of mysticism?

But *do* people fear big numbers? Certainly they do. I’ve met people who don’t know the difference between a million and a billion, and don’t care. We play a lottery with ‘six ways to win!,’ overlooking the twenty million ways to lose. We yawn at six billion tons of carbon dioxide released into the atmosphere each year, and speak of ‘sustainable development’ in the jaws of exponential growth. Such cases, it seems to me, transcend arithmetical ignorance and represent a basic unwillingness to grapple with the immense.

A study by Dehaene and his collaborators, reported in Science, suggests that two separate brain systems contribute to mathematical thinking. The group trained Russian-English bilinguals to solve a set of problems, including two-digit addition, base-eight addition, cube roots, and logarithms. Some subjects were trained in Russian, others in English. When the subjects were then asked to solve problems approximately — to choose the closer of two estimates — they performed equally well in both languages. But when asked to solve problems exactly, they performed better in the language of their training. What’s more, brain-imaging evidence showed that the subjects’ parietal lobes, involved in spatial reasoning, were more active during approximation problems; while the left inferior frontal lobes, involved in verbal reasoning, were more active during exact calculation problems. Studies of patients with brain lesions paint the same picture: those with parietal lesions sometimes can’t decide whether 9 is closer to 10 or to 5, but remember the multiplication table; whereas those with left-hemispheric lesions sometimes can’t decide whether 2 + 2 is 3 or 4, but know that the answer is closer to 3 than to 9. Dehaene et al. conjecture that humans represent numbers in two ways. For approximate reckoning we use a ‘mental number line,’ which evolved long ago and which we likely share with other animals. But for exact computation we use numerical symbols, which evolved recently and which, being language-dependent, are unique to humans. This hypothesis neatly explains the experiment’s findings: the reason subjects performed better in the language of their training for exact computation but not for approximation problems is that the former calls upon the verbally-oriented left inferior frontal lobes, and the latter upon the spatially-oriented parietal lobes.

If Dehaene et al.’s hypothesis is correct, then which representation do we use for big numbers? Surely the symbolic one — for nobody’s mental number line could be long enough to contain $9^{9^9}$, 5 pentated to the 5, or BB(1928). And here, I suspect, is the problem. When thinking about 3, 4, or 7, we’re guided by our spatial intuition, honed over millions of years of perceiving 3 gazelles, 4 mates, 7 members of a hostile clan. But when thinking about BB(1928), we have only language, that evolutionary neophyte, to rely upon. The usual neural pathways for representing numbers lead to dead ends. And this, maybe, is why people are afraid of big numbers.

Could early intervention mitigate our big number phobia? What if second-grade math teachers took an hour-long hiatus from stultifying busywork to ask their students, “How do you name really, *really* big numbers?” And then told them about exponentials and stacked exponentials, tetration and the Ackermann sequence, maybe even Busy Beavers: a cornucopia of numbers vaster than any they’d ever conceived, and ideas stretching the bounds of their imaginations.

References

Petr Beckmann, A History of Pi, Golem Press, 1971.

Allan H. Brady, “The Determination of the Value of Rado’s Noncomputable Function Σ(k) for Four-State Turing Machines,” Mathematics of Computation, vol. 40, no. 162, April 1983, pp. 647–665.

Gregory J. Chaitin, “The Berry Paradox,” Complexity, vol. 1, no. 1, 1995. At http://www.umcs.maine.edu/~chaitin/unm2.html.

A. K. Dewdney, The New Turing Omnibus: 66 Excursions in Computer Science, W. H. Freeman, 1993.

S. Dehaene, E. Spelke, P. Pinel, R. Stanescu, and S. Tsivkin, “Sources of Mathematical Thinking: Behavioral and Brain-Imaging Evidence,” Science, vol. 284, May 7, 1999, pp. 970–974.

Douglas Hofstadter, Metamagical Themas: Questing for the Essence of Mind and Pattern, Basic Books, 1985. Chapter 6, “On Number Numbness.”

Stephen C. Kleene, “Recursive predicates and quantifiers,” Transactions of the American Mathematical Society, vol. 53, 1943, pp. 41–73.

Donald E. Knuth, Selected Papers on Computer Science, CSLI Publications, 1996. Chapter 2, “Mathematics and Computer Science: Coping with Finiteness.”

Dexter C. Kozen, Automata and Computability, Springer-Verlag, 1997.

Shen Lin and Tibor Rado, “Computer studies of Turing machine problems,” Journal of the Association for Computing Machinery, vol. 12, no. 2, April 1965, pp. 196–212.

Heiner Marxen, Busy Beaver, at http://www.drb.insel.de/~heiner/BB/

Heiner Marxen and Jürgen Buntrock, “Attacking the Busy Beaver 5,” Bulletin of the European Association for Theoretical Computer Science, no. 40, February 1990, pp. 247–251.

Tibor Rado, “On Non-Computable Functions,” Bell System Technical Journal, vol. XLI, no. 2, May 1962, pp. 877–884.

Rudy Rucker, Infinity and the Mind, Princeton University Press, 1995.

Carl Sagan, Billions & Billions: Thoughts on Life and Death at the Brink of the Millennium, Random House, 1997.

Alan Turing, “On computable numbers, with an application to the Entscheidungsproblem,” Proceedings of the London Mathematical Society, Series 2, vol. 42, 1936, pp. 230–265. Reprinted in Martin Davis (ed.), The Undecidable, Raven, 1965.

Ilan Vardi, “Archimedes, the Sand Reckoner,” at http://www.ihes.fr/~ilan/sand_reckoner.ps

Eric W. Weisstein, CRC Concise Encyclopedia of Mathematics, CRC Press, 1999. Entry on “Large Number” at http://www.treasure-troves.com/math/LargeNumber.html.
