The greatest change in the applied mathematical landscape in my career, apart from the advance of computers, has been the penetration of probabilistic ideas into every corner of our work. Differential equations become stochastic differential equations; deterministic algorithms become randomized; simulation gives way to uncertainty quantification. When I was a graduate student, hardly anyone outside of statistics departments worked on stochastic problems — certainly none of us numerical analysts in Serra House at Stanford. Nowadays if you don’t, you are old-fashioned.
I got a chance to quantify this trend when I gave the keynote lecture yesterday at the annual meeting of the MIT Center for Computational Science and Engineering. At the end of the question period I asked the audience — about 75 graduate students and postdocs, the CSE leaders of the future — how many of you work on problems with a probabilistic component? We did a Zoom poll, and the answer was 63%.
[16 March 2021]
Professor Trefethen, I agree with your observation. You might be interested in a recent Quanta Magazine article about randomness, and in two quotes in particular, pasted below. I also believe that randomness in numerical linear algebra specifically is over-hyped and unnecessary unless the originating problem (say, the continuous one) is naturally random.
1. “But in two papers published in the 1990s, Wigderson and his collaborators proved that under certain assumptions, it’s always possible to convert a fast random algorithm into a fast deterministic algorithm.”
2. “I think that today almost anybody you asked would tell you randomness is weak rather than that randomness is powerful, because, under assumptions we strongly believe, randomness can be eliminated,” said Wigderson.
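For readers unfamiliar with what "randomness in numerical linear algebra" refers to, here is a minimal sketch of one of its best-known techniques, the randomized range finder of Halko, Martinsson, and Tropp for low-rank approximation. The function name, the choice of Gaussian test matrix, and the oversampling parameter are illustrative choices, not a definitive implementation.

```python
import numpy as np

def randomized_low_rank(A, k, p=5, seed=None):
    """Sketch of the randomized range finder (Halko-Martinsson-Tropp).

    Samples the range of A with a Gaussian test matrix, orthonormalizes
    the sample, and computes an SVD of the resulting small matrix.
    k is the target rank; p is a small oversampling parameter.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for the sampled range
    B = Q.T @ A                               # small (k+p) x n projected matrix
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small                           # lift left singular vectors back up
    return U[:, :k], s[:k], Vt[:k, :]

# Example: a matrix of exact rank 10 is recovered to machine precision.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))
U, s, Vt = randomized_low_rank(A, 10, seed=1)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
print(err)
```

Whether one regards this as essentially random is a fair question in the spirit of the quotes above: the randomness here is only a device for sampling the range of A, and with high probability the output is as good as a deterministic computation.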