**Speaker's Page:** https://www.isye.gatech.edu/users/jeff-wu.

C. F. Jeff Wu is Professor and Coca-Cola Chair in Engineering Statistics at the School of Industrial and Systems Engineering, Georgia Institute of Technology.

He was elected a Member of the National Academy of Engineering (2004) and a Member (Academician) of Academia Sinica (2000). He is a Fellow of the Institute of Mathematical Statistics (1984), the American Statistical Association (1985), the American Society for Quality (2002), and the Institute for Operations Research and the Management Sciences (2009). He received the COPSS (Committee of Presidents of Statistical Societies) Presidents' Award in 1987, given annually to an outstanding researcher under the age of 40 and sponsored jointly by five statistical societies. His other major awards include the 2011 COPSS Fisher Lecture, the 2012 Deming Lecture (both plenary lectures during the annual Joint Statistical Meetings), the Shewhart Medal (2008) from ASQ, and the Pan Wenyuan Technology Award (2008). In 2016 he received the inaugural Akaike Memorial Lecture Award. He has won numerous other awards, including the Wilcoxon Prize, the Brumbaugh Award, and the Jack Youden Prize (twice), and he was the Honoree of the 2008 Quality and Productivity Research Conference.

He was the 1998 P. C. Mahalanobis Memorial Lecturer at the Indian Statistical Institute and an Einstein Visiting Professor at the Chinese Academy of Sciences (CAS). He is an Honorary Professor at several institutions, including the CAS and National Tsinghua University, and received an honorary doctorate (honoris causa) in mathematics from the University of Waterloo in 2008.

***

**Abstract:**

Because of advances in complex mathematical models and fast computer codes, computer experiments have become popular in engineering and scientific investigations. Statisticians have worked on the design, modeling, and computation aspects of computer experiments. Applied mathematicians have approached a closely related class of problems called uncertainty quantification (UQ). This talk builds an interface between the two approaches, illustrated by two problems on the statistical side.

1. Consider deterministic computer experiments with tuning parameters that determine the accuracy of the numerical algorithm (e.g., mesh density in finite element analysis). To efficiently integrate computer outputs obtained with different tuning parameters, a class of nonstationary Gaussian process models, consistent with knowledge from numerical analysis, is proposed to model the integrated output. Estimation is performed using Bayesian computation. Numerical studies show the advantages of the proposed method over existing methods. A related problem is given to illustrate the interplay between modeling and design. For this and a broader class of models with multiple levels of fidelity, nested space-filling designs are most suitable. Some examples are given and the underlying mathematics is discussed.
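The idea of trusting fine-mesh runs more than coarse ones can be sketched with a toy Gaussian process surrogate. This is an illustrative sketch only, not the speaker's actual model: the kernel, the names, and the assumed discrepancy variance `sigma2(t) = c * t**(-p)` (shrinking as the tuning parameter `t`, here mesh density, grows) are all assumptions for the example.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ls ** 2))

def fit_predict(x, y, t, x_new, c=1.0, p=2.0, jitter=1e-8):
    """GP posterior mean at x_new. Each run's discrepancy variance is
    c * t**(-p), so coarse-mesh outputs (small t) are down-weighted
    relative to fine-mesh outputs when the runs are integrated."""
    K = rbf(x, x) + np.diag(c * t ** (-p)) + jitter * np.eye(len(x))
    k_star = rbf(x_new, x)
    return k_star @ np.linalg.solve(K, y)

# Toy data: a smooth truth observed through alternating coarse (t=4)
# and fine (t=64) mesh runs with fidelity-dependent error.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
t = np.where(np.arange(20) % 2 == 0, 4.0, 64.0)
truth = np.sin(2 * np.pi * x)
y = truth + rng.normal(0.0, np.sqrt(1.0 * t ** (-2.0)))
pred = fit_predict(x, y, t, x)
```

The design choice mirrored here is that fidelity enters through the covariance rather than the mean: refining the mesh shrinks the discrepancy variance toward zero, which is the kind of behavior a nonstationary model consistent with numerical-analysis error rates should capture.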

2. Calibration parameters in deterministic computer experiments are those attributes that cannot be measured or are unavailable in physical experiments or observations. Kennedy and O'Hagan (2001) suggested estimating them using data from physical experiments and computer simulations. We show that a simplified version of the original KO method leads to asymptotically inconsistent calibration. This inconsistency can be remedied by modifying the original estimation procedure. A novel calibration method, called the L_{2} calibration, is proposed and shown to be consistent and to enjoy the optimal convergence rate. A numerical example and some mathematical analysis illustrate the source of the inconsistency.
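The L_{2} calibration idea can be written schematically as follows (a sketch using notation not in the abstract: \(\hat{\zeta}\) denotes an estimate of the true physical process, \(f_s\) the computer model, \(\Theta\) the parameter space, and \(\Omega\) the input domain). The calibration estimate minimizes the L_{2} distance between the estimated physical process and the computer-model output:

```latex
\hat{\theta}_{L_2}
  \;=\; \operatorname*{arg\,min}_{\theta \in \Theta}
        \bigl\| \hat{\zeta}(\cdot) - f_s(\cdot,\theta) \bigr\|_{L_2(\Omega)}
  \;=\; \operatorname*{arg\,min}_{\theta \in \Theta}
        \left( \int_{\Omega} \bigl( \hat{\zeta}(x) - f_s(x,\theta) \bigr)^2 \, dx \right)^{1/2}
```

In contrast, the simplified KO-style formulation estimates the calibration parameter jointly with a Gaussian process discrepancy term, and the prior on the discrepancy can pull the estimate away from this L_{2}-optimal value, which is one way to view the source of the inconsistency.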