For more than two decades David Donoho has been a leading figure in mathematical statistics. His introduction of novel mathematical tools and ideas has helped shape both the theoretical and applied sides of modern statistics. His work is characterized by the development of fast computational algorithms together with rigorous mathematical analysis for a wide range of statistical and engineering problems.
A central problem in statistics is to devise optimal and efficient methods for estimating (possibly non-smooth) functions from observed data that have been corrupted by (often unknown) noise. Optimality here means that, as the sample size increases, the estimation error should decrease as fast as it would for an optimal interpolation of the underlying function. The widely used least squares regression method is known to be non-optimal for many classes of functions and noise encountered in important applications, for example non-smooth functions and non-Gaussian noise. Together with Iain Johnstone, Donoho developed provably almost optimal (that is, optimal up to a factor of a power of the logarithm of the sample size) algorithms for function estimation in wavelet bases. Their “soft thresholding” algorithm is now one of the most widely used algorithms in statistical applications.
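The soft-thresholding rule itself is simple: each coefficient is shrunk toward zero by a threshold t, and coefficients smaller than t in magnitude are set exactly to zero. A minimal sketch follows, using the Donoho–Johnstone universal threshold σ√(2 log n); the coefficient values and noise level σ here are illustrative assumptions, not data from any actual application.

```python
import math

def soft_threshold(x, t):
    """Soft-thresholding: shrink x toward zero by t; values with |x| <= t become 0."""
    return math.copysign(max(abs(x) - t, 0.0), x)

# Illustrative noisy coefficient vector (e.g. wavelet coefficients):
# a few large "signal" entries among many small "noise" entries.
coeffs = [4.0, -3.5, 0.2, -0.1, 0.3, 5.0, -0.05, 0.15]
sigma = 0.2                                # assumed known noise level
n = len(coeffs)
t = sigma * math.sqrt(2 * math.log(n))     # universal threshold
denoised = [soft_threshold(c, t) for c in coeffs]
```

Small coefficients are zeroed out while large ones survive (slightly shrunk), which is why the rule adapts well to sparse representations such as wavelet expansions of non-smooth functions.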