Civil-Comp Proceedings
ISSN 1759-3433
CCP: 89
PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY
Edited by: M. Papadrakakis and B.H.V. Topping
Paper 134

Bayesian Emulators and the Stochastic Finite Element Method

F.A. Díaz De la O and S. Adhikari

School of Engineering, Swansea University, United Kingdom

Full Bibliographic Reference for this paper
, "Bayesian Emulators and the Stochastic Finite Element Method", in M. Papadrakakis, B.H.V. Topping, (Editors), "Proceedings of the Sixth International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 134, 2008. doi:10.4203/ccp.89.134
Keywords: stochastic finite element method, stochastic partial differential equations, Karhunen-Loève expansion, Gaussian stochastic process, Monte Carlo simulation, Bayesian statistics.

Summary
This paper proposes an approach for efficiently representing realizations of computationally expensive random fields in the context of the stochastic finite element method (SFEM). The strategy, known as emulation, consists of building a statistical approximation to realizations of such random fields, based on a few runs of a code that discretises the random field via the Karhunen-Loève expansion.
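
As an illustration, the following minimal sketch (not the authors' code) discretises a Gaussian random field with a truncated Karhunen-Loève expansion; a one-dimensional field with an exponential covariance kernel is assumed for brevity, and all parameter values are illustrative.

import numpy as np

def kl_realization(x, correlation_length=0.5, sigma=1.0, n_terms=10, seed=0):
    """Sample one realization of a zero-mean Gaussian field on the grid x
    via a truncated (discrete) Karhunen-Loeve expansion."""
    rng = np.random.default_rng(seed)
    # Covariance matrix of an exponential kernel on the grid
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / correlation_length)
    # Discrete KL: eigenpairs of the covariance matrix, largest modes first
    eigvals, eigvecs = np.linalg.eigh(C)
    idx = np.argsort(eigvals)[::-1][:n_terms]
    lam, phi = eigvals[idx], eigvecs[:, idx]
    # Truncated expansion: H(x) = sum_k sqrt(lam_k) * xi_k * phi_k(x)
    xi = rng.standard_normal(n_terms)          # independent N(0,1) coefficients
    return phi @ (np.sqrt(lam) * xi)

x = np.linspace(0.0, 1.0, 200)
field = kl_realization(x)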

Uncertainty quantification is unavoidable in the modelling and prediction of engineering systems, if such modelling is to have credibility. Uncertainty can be incorporated into the partial differential equations that govern the system's response. However, when a system with a large number of degrees of freedom is investigated, running an uncertainty quantification code can become prohibitively expensive. Such codes, as well as the underlying mathematical models, are called simulators. Several methods designed to reduce the execution cost of an expensive simulator already exist, such as local polynomial regression and neural networks. Another possible approach is the Bayesian analysis of computer code outputs [1]. This technology is based on the design and analysis of computer experiments [2] and uses concepts from Bayesian statistics. It essentially consists of constructing an approximation to the simulator, called an emulator. More precisely, an emulator is a statistical approximation to the simulator, in the sense that it provides a probability distribution for it. The main idea behind emulation is the following: given some prior beliefs about the simulator, an initial set of runs is treated as training data with which to update those beliefs. The prior information is modelled as a Gaussian stochastic process. The number of training runs is kept small relative to the size of the simulator's input domain, since the simulator is by assumption computationally intensive. Upon updating, the emulator interpolates and extrapolates the available data at unsampled inputs, while returning the known value of the simulator at each of the initial runs.
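
The following sketch illustrates this idea with a zero-mean Gaussian-process emulator; the squared-exponential correlation, its length-scale, the nugget term, and the stand-in simulator function are all assumptions made for the example, not details taken from the paper.

import numpy as np

def sq_exp(A, B, ell=0.1):
    """Squared-exponential correlation between 1D input sets A and B."""
    return np.exp(-0.5 * (A[:, None] - B[None, :])**2 / ell**2)

def emulate(x_train, y_train, x_new, ell=0.1, nugget=1e-8):
    """Posterior mean and variance of a zero-mean, unit-variance GP emulator."""
    K = sq_exp(x_train, x_train, ell) + nugget * np.eye(len(x_train))
    k_star = sq_exp(x_new, x_train, ell)
    mean = k_star @ np.linalg.solve(K, y_train)   # reproduces y_train at x_train
    v = np.linalg.solve(K, k_star.T)
    var = 1.0 - np.sum(k_star * v.T, axis=1)      # predictive uncertainty
    return mean, var

# Treat an expensive simulator (here a cheap stand-in) as a black box
simulator = lambda x: np.sin(3.0 * x) + 0.5 * x
x_train = np.linspace(0.0, 1.0, 8)       # few training runs
x_new = np.linspace(0.0, 1.0, 200)       # many unsampled inputs
mean, var = emulate(x_train, simulator(x_train), x_new)

At each training input the posterior variance collapses to (numerically) zero and the mean equals the simulator output, while elsewhere the emulator quantifies its own approximation error.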

A realization of a homogeneous two-dimensional Gaussian random field is emulated, using the information provided by a set of training runs one tenth the size of the set of points to be evaluated. The exercise is repeated with an increasing number of evaluation points. Good agreement between the original and the emulated values is obtained. More importantly, the computational time of emulation is shown to be less than that of direct evaluation of the random field.
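
A hypothetical combination of the two sketches above mimics this experiment in one dimension: a Karhunen-Loève realization is evaluated directly on every tenth grid point, and the emulator predicts the remaining values; the grid size and kernel parameters are again illustrative.

n_eval = 200
x_eval = np.linspace(0.0, 1.0, n_eval)
realization = kl_realization(x_eval)      # "expensive" reference values
train_idx = np.arange(0, n_eval, 10)      # training runs: a tenth of the grid
mean, _ = emulate(x_eval[train_idx], realization[train_idx], x_eval)
print("max abs error:", np.max(np.abs(mean - realization)))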

References
1
O'Hagan A., "Bayesian analysis of computer code outputs: A tutorial", Reliability Engineering & System Safety, Vol. 91, No. 10-11, pp. 1290-1300, 2006. doi:10.1016/j.ress.2005.11.025
2
Santner T., Williams B., Notz W., "The Design and Analysis of Computer Experiments", Springer Series in Statistics, Springer, New York, USA, 2003.
