[1] Cressie N. Statistics for spatial data[M]. New York: Wiley, 1993.
[2] Rasmussen C E, Williams C K I. Gaussian processes for machine learning[M]. Cambridge: MIT Press, 2006.
[3] Santner T J, Williams B J, Notz W I. The design and analysis of computer experiments[M]. New York: Springer, 2003.
[4] Boyle P K, Frean M R. Dependent Gaussian processes[C]//Advances in Neural Information Processing Systems. Cambridge: MIT Press, 2004: 17.
[5] Andriluka M, Weizsäcker L, Hofmann T. Multi-class classification with dependent Gaussian processes: technical report[R/OL]. 2008[2013-07-15]. http://www.gkmm.informatik.tu-darmstadt.de/publications/files/andriluka07asmda.pdf.
[6] Kennedy M C, O'Hagan A. Predicting the output from a complex computer code when fast approximations are available[J]. Biometrika, 2000, 87: 1-13.
[7] Qian P Z G, Wu C F J. Bayesian hierarchical modeling for integrating low-accuracy and high-accuracy experiments[J]. Technometrics, 2008, 50: 192-204.
[8] Higdon D, Gattiker J, Williams B, et al. Computer model calibration using high-dimensional output[J]. Journal of the American Statistical Association, 2008, 103: 570-583.
[9] Qian P Z G, Wu H, Wu C F J. Gaussian process models for computer experiments with qualitative and quantitative factors[J]. Technometrics, 2008, 50: 383-396.
[10] Han G, Santner T J, Notz W I, et al. Prediction for computer experiments having quantitative and qualitative input variables[J]. Technometrics, 2009, 51: 278-288.
[11] Zhou Q, Qian P Z G, Zhou S. A simple approach to emulation for computer models with qualitative and quantitative factors[J]. Technometrics, 2011, 53: 266-273.
[12] Bayarri M J, Berger J O, Cafeo J, et al. Computer model validation with functional output[J]. Annals of Statistics, 2007, 35: 1874-1906.