Numerical analysts played a leading role in creating the field of computer science in the 1960s. There were Gautschi and Rice at Purdue, Forsythe and Golub at Stanford, Bauer in Munich, Stiefel and Rutishauser in Zurich, Bennett in Sydney, Fox at Oxford, Gear at Illinois, Dahlquist in Stockholm,….
Half a century on, there have been 70 winners of the Turing Award, computer science’s highest honor. How many have been numerical analysts? The answer is three, if you include Richard Hamming (1968). Wilkinson (1970) and Kahan (1989) were the other two, 30+ years ago.
[4 June 2019]
Every floating-point rounding error, Wilkinson teaches us, can be interpreted by backward error analysis: it’s the exact result, but for slightly perturbed data.
Every quantum mechanical measurement, Everett teaches us, can be interpreted as a splitting into two universes: one where this outcome happens, another where the other does.
I like to imagine a kind of Everett–Wilkinson interpretation. Every floating-point operation is exact, hooray! — but each time one happens, we are jiggled ever so slightly, just one part in 10¹⁶, into a neighboring universe.
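That one part in 10¹⁶ is the unit roundoff of IEEE double precision, u = 2⁻⁵³. A minimal Python sketch of the jiggle (my illustration, not Wilkinson's):

```python
# The jiggle per operation: IEEE double precision rounds each result
# with relative error at most u = 2^-53, about one part in 10^16.
u = 2.0**-53
print(u)  # 1.1102230246251565e-16

# Backward error view: the computed 0.1 + 0.2 is the exact sum of
# data perturbed by a few parts in 10^16.
s = 0.1 + 0.2
print(abs(s - 0.3) / 0.3)  # relative discrepancy on the order of u
```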
[17 February 2015]
Graham Farmelo gave a lecture here at Oxford yesterday about his book The Universe Speaks in Numbers. His theme, to quote from the web site, is that physicists these days seek the laws of nature “with the help of cutting-edge mathematics.”
This situation is controversial. Everyone knows that Newton and Gauss did real physics, but what about the two modern heroes Farmelo focused on, Atiyah and Witten? Is string theory real physics?
An unexpected twist in the discussion turned up at coffee just now with André Weideman, Yuji Nakatsukasa, and Trond Steihaug. Steihaug is studying Newton’s notebooks, and he was telling us how they show Newton carrying out systematic numerical calculations to many digits of accuracy. Gauss, too, was known for his extensive calculations.
Suddenly we noticed the oddity in Farmelo’s title, displayed on the poster behind us. The universe speaks in “numbers”? Of course, the word is intended as shorthand for mathematics. But that shorthand is long obsolete, for in fact, most mathematicians of recent generations have very little interest in the real numbers Newton and Gauss were calculating! I just checked and found that the only numbers that appear in the 106-page joint paper by Atiyah and Witten are the integers 0, 1,…, 10, 11, 24, 27, 36, 48, 72, and 144, together with i, π, and e.
All this suggests a possibility Farmelo cannot have intended. Maybe the universe indeed speaks in numbers; and maybe physics lost its way when it came under the spell of a kind of mathematics that is number-free?
[17 May 2019]
Since writing Approximation Theory and Approximation Practice a decade ago, I’ve found myself determined to know where each idea comes from. No matter what I am working on, I need to know: who did this first, and when?
Lately I have realized that this predilection of mine springs from two strongly held views. One (moral) was expressed in “The diameter of intellectual space” in 2004. The other (intellectual) is my growing concern that mathematics has lost its way. Over and over again we see the pattern that one mathematician introduces a fundamental idea, and then others advance the topic. Obsessively, at the expense of all other developments, they focus on bringing the idea to its most general and technically difficult limit. Thus Runge’s theorem of 1885, which shows that certain analytic functions can be approximated by polynomials, ends up viewed as a special case of Mergelyan’s theorem of 1951. What is Mergelyan’s theorem? Well, it’s the same as Runge’s! — but with weaker smoothness assumptions. Impressive and important, yes. Yet how sad it is that if you try to look up this subject, you’ll probably find yourself reading about the technicalities, not about the actual point of it all.
I believe that because of mathematicians’ habit of making topics ever more technical, the original version of a mathematical idea is often more to the point than what followed. That’s why the originals mean so much to me.
[13 May 2019]
27 years ago I wrote,
There are some 10¹⁰ people in the world. Roughly speaking, I am among the top 10⁵ most successful. Accounting for this improbably high position in the hierarchy is a troubling problem for me. A tantalizing idea suggests itself: can one argue somehow that in a random population of n people, the expected position to find an introspective person of my sort is on the order of √n from the top?
Well, here’s an argument that gets exactly that answer. The person in position #1 will be pretty interested in his or her good fortune. Suppose we imagine that person #2 is half as interested, person #3 is 1/3 as interested, and so on: essentially, each person’s interest in their luck is proportional to their luck. Then the total amount of human interest in positional luck, integrated over the whole of humanity, is about log n.
Now pick a human being at random, weighted by their interest in their luck. The middle position is ½log n, which is the logarithm of √n. So there you have it. If I am person number 10⁵, then half the population’s interest in their luck is to be found in people higher up than I am, and half in people lower down, making me entirely typical. Throwing in a few trillion dogs, squirrels, and mosquitoes, with their lower levels of introspection, won’t change the result.
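The argument checks out numerically. With a stand-in population of a million and interest weights 1/k, the position splitting the total interest in half lands within a constant factor of √n (a minimal Python sketch; n and the weighting are illustrative):

```python
import math

n = 10**6     # stand-in population (the text uses 10^10)
# total interest: sum of 1/k, which is about ln n + Euler's gamma
total = sum(1.0 / k for k in range(1, n + 1))

# walk down the ranking until half the total interest is accounted for
acc, k = 0.0, 0
while acc < total / 2:
    k += 1
    acc += 1.0 / k

print(k, int(math.sqrt(n)))   # k is within a constant factor of sqrt(n)
```

With n = 10⁶ the halfway point falls in the mid-hundreds, against √n = 1000: same order of magnitude, as the ½log n argument predicts.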
[12 April 2011]
Here is a curious symmetry. To achieve randomness in science or technology, our best strategy is exponentials. You can toss a coin, but the outcome isn’t so random because it is sensitive only algebraically to the details of the throw. For truer randomness you need a chaotic system with exponential sensitivities, like a pinball machine or the Lorenz equations. Run such a system for a moment and your randomness might be 99%. If that’s not enough, run it a little longer to get 99.99%. The point is that with each new step, your knowledge about the system shrinks by a constant factor, soon reaching zero for practical purposes.
And to achieve certainty, our best strategy is exponentials again! At the level of fundamental physics, anything can happen because of quantum tunnelling. But some things “never” happen in practice, such as the radioactive decay of an iron-56 atom. Why? Because the frequency of quantum events shrinks exponentially with the width of a potential barrier. Thickening up that barrier in a physics experiment is like adding another level of error correction in an electronic circuit or taking another step of a random algorithm or making another compressed sensing measurement. With each new step, your uncertainty about the system shrinks by a constant factor, soon reaching zero for practical purposes.
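The first half of the symmetry can be watched in a few lines. Under the chaotic logistic map x → 4x(1−x) (my stand-in for the pinball machine or the Lorenz equations), a perturbation of one part in 10¹² grows by roughly a factor of 2 per step, so knowledge of the initial throw is gone after a few dozen iterations:

```python
# Two "throws" differing in the twelfth digit, iterated under the
# chaotic logistic map x -> 4x(1-x).
x, y = 0.3, 0.3 + 1e-12
gaps = []
for step in range(1, 41):
    x, y = 4*x*(1 - x), 4*y*(1 - y)
    gaps.append(abs(x - y))
    if step % 10 == 0:
        print(step, abs(x - y))  # the gap grows roughly like 2^step
```

By step 40 the gap has saturated at order one: the two trajectories are, for practical purposes, unrelated.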
[1 September 2011]
Men being bigger and stronger than women, I was musing how in the old days, a man used his strength to protect his woman. However evolved you may be, it’s hard not to feel a tinge of pride in being associated with this classic role.
These fine feelings were dimmed a little by the thought that, realistically speaking, probably about 80% of that protection was against other men.
[15 June 2018]