Abstract:
Commenting on the development of statistics early in the 20th century, the UCLA historian Theodore Porter wrote that "the foundations of mathematical statistics were laid between 1890 and 1930", and argued that "the principal families of techniques for analyzing numerical data were established during the same period." He was right on the first point: there was a revolution in quantitative data analysis in the early part of the last century, leading to the development of the subject we know today as statistics. And at the time Porter wrote, in 1986, he would also have been correct in his second assertion. However, it would be difficult to justify the same remarks today. The speed and memory of computers have increased a thousandfold since 1986, and the second revolution in statistics, certainly motivated and perhaps driven by developments in computing, has begun to change statistical methodology fundamentally. This revolution is a long way from running its course. Over the next few decades it will transform the subject into something quite different, in terms of both its range and its emphasis on the types of problems it treats, from the statistics we know today. If the development of statistics had taken place in the environment of contemporary advances in computing, then the subject would most likely be less mathematical, and more of an experimental science, than it is today. The present talk discusses some of these changes, in the areas of resampling and Monte Carlo methods, and outlines new directions for at least the near future.