Computational & Technology Resources
an online resource for computational,
engineering & technology publications
Civil-Comp Proceedings
ISSN 1759-3433
CCP: 79
Edited by: B.H.V. Topping and C.A. Mota Soares
Paper 113

Sampling Techniques for Sequential Kriging Metamodels in Robust Design Optimisation

F. Jurecka+, M. Ganser* and K.-U. Bletzinger+

+Chair of Structural Analysis, Technische Universität München, Munich, Germany
*Research and Innovation Center, BMW Group, Munich, Germany

Full Bibliographic Reference for this paper
F. Jurecka, M. Ganser, K.-U. Bletzinger, "Sampling Techniques for Sequential Kriging Metamodels in Robust Design Optimisation", in B.H.V. Topping, C.A. Mota Soares, (Editors), "Proceedings of the Seventh International Conference on Computational Structures Technology", Civil-Comp Press, Stirlingshire, UK, Paper 113, 2004. doi:10.4203/ccp.79.113
Keywords: optimisation, robust design, design of experiments, Latin hypercube, sampling, surrogate model, metamodel, kriging.

Probabilistic design optimisation, which accounts for variability in design variables and environmental parameters, has been discussed in various communities pursuing different goals. In this context, optimising a structure for a robust design means optimising (minimising, maximising or bringing on target) an objective function that depends on stochastically varying variables.

In addition to the original formulation of robust parameter design by Taguchi [1], several other approaches based on decision-theoretic formulations have been introduced to attain a robust design, e.g. the minimax principle, which minimises the worst case caused by variability, and the Bayes principle, which optimises the expectation of the objective. Both approaches require many function evaluations, either to find the worst case or to compute the expectation integral numerically.
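As a rough illustration (not taken from the paper), the two principles can be compared on a hypothetical objective f(x, z) with one design variable x and one noise variable z; note how both criteria need many evaluations of f:

```python
import numpy as np

def f(x, z):
    # hypothetical toy objective: design variable x, noise variable z
    return (x - 1.0) ** 2 + 0.5 * x * z

rng = np.random.default_rng(0)
z_samples = rng.normal(0.0, 1.0, 1000)   # realisations of the noise
xs = np.linspace(-2.0, 3.0, 101)         # candidate design values

# minimax principle: minimise the worst case over the noise realisations
worst = np.array([f(x, z_samples).max() for x in xs])
x_minimax = xs[worst.argmin()]

# Bayes principle: minimise the expected objective (Monte Carlo estimate)
expected = np.array([f(x, z_samples).mean() for x in xs])
x_bayes = xs[expected.argmin()]
```

Each candidate design already costs a thousand evaluations of f here; with an expensive simulation in place of f this quickly becomes prohibitive, which motivates the metamodel approach below.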

In cases where a large number of function evaluations is prohibitive, e.g. for extensive and time-consuming computer simulations, an optimisation method using metamodels is suggested. Within this framework, a surrogate model of the objective function is constructed from the results of finite element analyses at selected sampling points [2]. Evaluations on such a metamodel, which represent estimates of the true function, are very cheap, so the criteria mentioned above can be evaluated easily.

In this paper a spatial correlation method called kriging is used as the metamodelling technique. Originating in geostatistics, it has emerged as well suited for the analysis of computer experiments because it interpolates the output of the simulations. This is an important feature given that deterministic computer simulations are not subject to random error: equal input parameters yield equal responses up to floating-point precision. Hence we expect the metamodel to reproduce the output data exactly at the sampled points [3].
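A minimal sketch of this interpolation property, assuming a simple kriging predictor with a Gaussian correlation function and a constant trend (the model details used in the paper may differ):

```python
import numpy as np

def kriging_predict(x_new, X, y, theta=10.0):
    """Kriging predictor with Gaussian correlation and constant trend.

    At the sampled points X the prediction reproduces y exactly --
    the interpolation property needed for deterministic simulations.
    """
    def corr(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-theta * d ** 2)      # Gaussian correlation

    R = corr(X, X)                          # correlation matrix of the samples
    ones = np.ones(len(X))
    mu = ones @ np.linalg.solve(R, y) / (ones @ np.linalg.solve(R, ones))
    w = np.linalg.solve(R, y - mu * ones)   # kriging weights
    r = corr(np.atleast_1d(x_new), X)       # correlation to the new point
    return mu + r @ w
```

Predicting at a sampled point makes r a row of R, so r @ w collapses to the stored response minus the trend, and the sample value is returned exactly.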

The sampling points used to build the surrogate model can be selected via classical design of experiments (DOE), including crossed or combined arrays. In the context of kriging models, Latin hypercube designs have been widely used, for instance because of their space-filling property, which ensures a balanced predictive performance of the kriging model throughout the investigated design space.
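A Latin hypercube design can be sketched in a few lines: each axis is split into n equal strata and every stratum receives exactly one sample, which gives the space-filling behaviour mentioned above.

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """n samples in [0, 1)^d: per dimension, each of the n equal
    strata is hit exactly once, in random order and at a random
    position within the stratum."""
    rng = np.random.default_rng(rng)
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                 # random stratum order
        samples[:, j] = (perm + rng.random(n)) / n
    return samples
```

Projecting such a design onto any single axis recovers one point per stratum, whereas a purely random design may leave large gaps.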

Using the procedure presented above yields an estimate of the robust design. Verifying this design with the underlying computer simulation shows whether the claimed accuracy of the metamodel relative to the computer code is met or whether further improvements of the surrogate model are necessary. For the latter case we present an approach for selecting additional sampling points to sequentially improve the accuracy of the metamodel, with the main focus on robust design optimisation.

In this paper we adapt an update criterion for successive kriging metamodels, the generalised expected improvement criterion suggested by Schonlau [4]. In this approach the next point is chosen to maximise a criterion that balances searching where the predicted value of the objective function improves on the current best with searching where the uncertainty of the prediction is large.
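For reference, the standard expected improvement (the g = 1 case of Schonlau's generalised criterion) can be written as a small function of the kriging prediction y_hat, its standard error s, and the current best sample f_min; the generalised form raises the improvement to a power g before taking the expectation:

```python
import math

def expected_improvement(y_hat, s, f_min):
    """Standard expected improvement for minimisation (g = 1 case).

    Large where the prediction y_hat undercuts the current best
    f_min, and large where the prediction uncertainty s is high.
    """
    if s <= 0.0:                       # no predictive uncertainty
        return max(f_min - y_hat, 0.0)
    u = (f_min - y_hat) / s
    Phi = 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))          # normal CDF
    phi = math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)   # normal PDF
    return (f_min - y_hat) * Phi + s * phi
```

At a point predicted equal to the current best, the criterion reduces to s * phi(0), so among equally promising candidates the most uncertain one wins.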

In contrast to the referenced literature, we split the update into two parts: one deals with the design space, the other addresses the noise. Using the generalised expected improvement criterion, we find the coordinates of the design variables that are most promising for further runs of the expensive computer simulation. The point sampled there enhances the significance of the model and might yield a new optimum. In the noise space, the new input settings are determined by the robustness principle in use: for the minimax principle one can apply the criterion introduced by Schonlau, while for integral-type principles, e.g. the Bayes principle, maximising the model uncertainty is suggested. Together the design and noise values form the input vector for a new computer simulation, which is then launched. Based on the extended set of sampling points and their response values, an improved metamodel is built. This sequence is repeated until convergence is achieved.

[1] G. Taguchi, "Introduction to Quality Engineering", Asian Productivity Organisation, Tokyo, 1986.
[2] T.W. Simpson, J.D. Peplinski, P.N. Koch, J.K. Allen, "Metamodels for Computer-based Engineering Design: Survey and Recommendations", Engineering with Computers, 17:129-150, 2001. doi:10.1007/PL00007198
[3] J. Sacks, W.J. Welch, T.J. Mitchell, H.P. Wynn, "Design and Analysis of Computer Experiments", Statistical Science, 4(4):409-435, 1989. doi:10.1214/ss/1177012413
[4] M. Schonlau, "Computer Experiments and Global Optimization", PhD Thesis, University of Waterloo, 1997.
