Shang-Hua Teng, a University Professor and Seely G. Mudd Professor of Computer Science and Mathematics, has been honored with a Symposium on Theory of Computing (STOC) Test of Time Award. Teng, with Daniel A. Spielman of Yale University, received the award from the ACM Special Interest Group on Algorithms and Computation Theory for a paper on smoothed analysis of algorithms originally presented at the STOC conference in 2001.
In the paradigm-shifting paper, “Smoothed analysis of algorithms: why the simplex algorithm usually takes polynomial time,” Teng and Spielman introduce smoothed analysis, a framework that gives a more realistic picture of an algorithm’s performance, such as its running time.
The concept helps to explain a long-debated phenomenon: why do some algorithms work better in practice than theory predicts? Teng and Spielman showed that many algorithms, most notably the widely used simplex algorithm for linear programming, perform well whenever the input contains even a small amount of noise, and real-world data almost always does.
The study’s findings have since been applied to practical algorithms across a wide range of applications, including faster internet communications, deep learning, data mining, differential privacy, game theory, and personalized recommendation systems.
An internationally renowned theoretical computer scientist, Teng has earned numerous accolades throughout his career. For his work on smoothed analysis of algorithms, he previously received the Gödel Prize as well as the Fulkerson Prize, a prestigious honor awarded once every three years by the American Mathematical Society and the Mathematical Optimization Society.
For his work on nearly-linear-time Laplacian solvers, he was awarded the Gödel Prize a second time in 2015. A Simons Investigator, a Fellow of the Association for Computing Machinery and of the Society for Industrial and Applied Mathematics, and an Alfred P. Sloan Fellow, Teng has been described by the Simons Foundation as “one of the most original theoretical scientists in the world.”
We sat down with Teng to find out why this groundbreaking paper continues to make waves, and how he rediscovered “math for fun” during the pandemic. Answers have been edited for style and clarity.
What were the key findings of this paper? What problem were you trying to solve?
A long-standing challenge in computing, then and now, has been the following: many algorithms work well in practice even though they perform poorly in the worst case, as measured by the traditional theory of computation.
It has been commonly believed that practical inputs are usually more favorable than worst-case instances. So, Dan Spielman and I were aiming to develop a framework that captures this popular belief and real-world observation, moving theory a step closer to practice.
Smoothed analysis is our attempt to understand the practical behavior of algorithms. It captures the following: in the real world, inputs have some degree of randomness, noise, imprecision, or uncertainty. Our theory demonstrates that these properties can in fact be helpful to algorithms in practice, because under these conditions, worst-case scenarios are much less likely to arise.
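Roughly speaking, smoothed analysis interpolates between worst-case and average-case analysis: an adversary chooses an input, and nature then perturbs it slightly before the algorithm runs. One common way to write the smoothed running time of an algorithm A on inputs of size n with perturbation magnitude σ (a paraphrase of the standard definition, not a quotation from the paper) is

\[
\mathrm{Smoothed}_A(n,\sigma) \;=\; \max_{\|x\| \le 1} \; \mathbb{E}_{g \sim \mathcal{N}(0,\,\sigma^2 I)}\big[\, T_A(x + g) \,\big],
\]

where \(T_A\) denotes the running time, the maximum ranges over (normalized) size-n inputs, and \(g\) is a Gaussian perturbation. An algorithm has polynomial smoothed complexity when this quantity is bounded by a polynomial in \(n\) and \(1/\sigma\).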
How has this area of research changed in the past 20 years? Why do you think it is still relevant today?
During the past 20 years, we entered the age of “big data,” massive networks, and ubiquitous data-driven AI and machine learning techniques. Understanding practical algorithmic behavior has become crucial in applications ranging from human-machine interaction and pandemic modeling to drug design, financial planning, climate modeling, and more.
Data and models from all of these areas continue to have randomness, noise, imprecision, and uncertainty, which are exactly the properties our research addresses. I hope our work will continue to inspire new theoretical models and practical algorithms for these vast data-driven applications.
How has this research on smoothed analysis impacted the “real world”?
In computing, algorithms are commonly used in practice before comprehensive theoretical analyses are conducted. In other words, practitioners are usually on the frontiers of methodology development. In this context, Dan and I were more like theoretical physicists, aiming to develop theory to explain and model practical observations.
For example, the first algorithm to which we applied smoothed analysis—the simplex method for linear programming—was invented in the 1940s for military planning and economic modeling. The simplex method was widely used in industry for optimization, even though mathematicians had discovered worst-case examples in the 1970s suggesting that, by the standards of traditional computing theory, it could not be an efficient algorithm. This is a prime example of the gap between theory and practice in the world of computing.
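As a concrete illustration of the idea (not code from the paper), the sketch below solves a small, made-up linear program with SciPy after adding a tiny Gaussian perturbation to the constraint data—the kind of noisy input that smoothed analysis reasons about:

```python
# A minimal sketch of a perturbed linear program, assuming NumPy and SciPy are
# available. The problem data are hypothetical and chosen only for illustration.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
sigma = 0.01  # magnitude of the Gaussian perturbation

# Hypothetical LP: minimize c^T x subject to A x <= b, x >= 0.
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 6.0])

# Perturb the inputs slightly, mimicking the noise present in real-world data.
A_noisy = A + sigma * rng.standard_normal(A.shape)
b_noisy = b + sigma * rng.standard_normal(b.shape)

# Solve the perturbed program; smoothed analysis studies the expected running
# time of such solvers over random perturbations of worst-case inputs.
result = linprog(c, A_ub=A_noisy, b_ub=b_noisy, bounds=(0, None))
print("optimal x:", result.x, "objective:", result.fun)
```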
Over the years, some researchers from operations research, network systems, data mining, and machine learning told me that they used methods inspired by smoothed analysis in their work. Of course, practical algorithmic behaviors are far more complex than what our theory can capture, which is why we and others are continuing to look for ways to develop better theories for practice.
How did you and Professor Spielman meet?
I first met Dan in 1990 when he—then a junior at Yale—gave a seminar at CMU (where I was a PhD student). I was his student host. We reconnected in 1992 in the MIT Math department, where he arrived as a PhD student and I joined as an instructor, and we have been lifelong friends ever since.
When you were both working on this paper, did you have any idea it would have such an enormous and long-lasting impact?
Twenty years ago, like many in our field, Dan and I recognized the significance of the challenge that motivated our paper: closing the theory-practice gap for algorithms. The simplex method was often mentioned as an example where practical performance defies theoretical prediction. We believed that the theory-practice gap would continue to be a fundamental subject for computing.
We were also encouraged by the responses to our initial work from scientists and researchers who were closer to practical algorithm design and optimization than we were. Their feedback reassured us that we were taking meaningful steps toward capturing the practical behavior of algorithms.
As theoreticians, Dan and I enjoyed both the conceptual formulation of smoothed analysis and the technical components of our work drawn from probability, high-dimensional geometry, and mathematical programming. It is exciting to develop a theory that is relevant to some aspect of practice, and a great honor indeed to have my work recognized by my peers.
Coming back to the present day, what have you been working on recently? Has the pandemic impacted your research?
During this historical moment, I did find one area of mathematics soothing: recreational mathematics. When I was a student, I used to read Scientific American, and always enjoyed the mathematical puzzles and games in the magazine. When I was teaching at Boston University, one of my PhD students, Kyle Burke, was super passionate and gifted in puzzles and games. He wrote a thesis in 2009 with a cool title: “Science for Fun: New Impartial Board Games.”
Three years ago, he recommended a talented undergraduate, Matt Ferland, to be a PhD student in our department. In the Covid Zoom world, Matt, Kyle, and I have been studying several fundamental problems in Combinatorial Game Theory (a more studious name for recreational mathematics), including board games that incorporate quantum-inspired elements.
We also designed new board games based on mathematical and computer science problems. In a recent paper, we solved two long-standing problems in this field that had been open since the 1980s and 1990s. These results involve a mathematical extension of the word-chain game we used to play as kids. I have also started playing these games with my 8-year-old daughter. (One of Teng’s games is playable here.)
Published on July 20th, 2021
Last updated on May 16th, 2024