PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON SOFT COMPUTING TECHNOLOGY IN CIVIL, STRUCTURAL AND ENVIRONMENTAL ENGINEERING
Edited by: Y. Tsompanakis, B.H.V. Topping
Optimal Polynomial Regression Models by using a Genetic Algorithm
M. Hofwing, N. Strömberg and M. Tapankov
Department of Mechanical Engineering, University of Jönköping, Sweden
M. Hofwing, N. Strömberg, M. Tapankov, "Optimal Polynomial Regression Models by using a Genetic Algorithm", in Y. Tsompanakis, B.H.V. Topping, (Editors), "Proceedings of the Second International Conference on Soft Computing Technology in Civil, Structural and Environmental Engineering", Civil-Comp Press, Stirlingshire, UK, Paper 39, 2011. doi:10.4203/ccp.97.39
Keywords: regression model, polynomial, genetic algorithm, metamodel, design of experiments, surrogate model.
For a highly nonlinear response, the accuracy of a linear or quadratic polynomial regression model is in general poor, so higher-order regression models might be used instead. However, a fully expanded high-order regression model usually cannot be fitted in practice, whereas a model containing only some of the potential high-order terms might suffice. Such a regression model is proposed here. Determining which high-order terms yield the most accurate regression model is an optimization problem in itself, and the aim of this work is to solve that problem in order to find the optimal polynomial regression model.

Regression models are commonly used to approximate the behavior of an unknown response in a given design domain. They are usually obtained from a design of experiments, the corresponding responses, and the chosen constitution of the regression model. In this work a new approach is proposed, in which the constituents of a polynomial regression model are of arbitrary order. A genetic algorithm is used to find the optimal terms to include in the so-called optimal polynomial regression model. The genetic algorithm applies the genetic operators rank selection, elitism, multi-point crossover and two methods of mutation: one in which single genes are changed randomly, and one in which segments of genes change place within the individual. One generation of the genetic algorithm consists of 100 individuals, and the algorithm stops when it has reached 1000 generations. The objective of the genetic algorithm is to minimize the sum of squared errors of the predicted responses; thus, the fitness measure is the sum of squared errors. In practice, the genetic algorithm generates an optimal set of exponents (a chromosome) of the design variables for the specified number of terms in the regression model, where each term is a product of a regression coefficient and powers of the design variables.
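The scheme above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the chromosome is an integer matrix of exponents (one row per term), fitness is the sum of squared errors after a least-squares fit of the coefficients, and selection, elitism and single-gene mutation follow the description in the abstract. All function names and parameter values (apart from the fitness definition) are illustrative assumptions, and the demo uses a far smaller population and generation count than the paper's 100 individuals over 1000 generations so that it runs quickly.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_matrix(X, exponents):
    # Design matrix: column k is prod_j x_j**e_kj for term k.
    # X: (n_samples, n_vars); exponents: (n_terms, n_vars)
    return np.stack([np.prod(X**e, axis=1) for e in exponents], axis=1)

def fitness(X, y, exponents):
    # Sum of squared errors of the least-squares fit for this term set.
    A = build_matrix(X, exponents)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return float(r @ r)

def evolve(X, y, n_terms=4, max_exp=4, pop=30, gens=60, elite=2):
    n_vars = X.shape[1]
    # Population: integer exponent matrices (chromosomes).
    P = [rng.integers(0, max_exp + 1, size=(n_terms, n_vars))
         for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda e: fitness(X, y, e))
        nxt = P[:elite]                        # elitism: keep the best
        ranks = np.arange(pop, 0, -1, dtype=float)
        probs = ranks / ranks.sum()            # rank selection weights
        while len(nxt) < pop:
            a, b = rng.choice(pop, size=2, p=probs)
            cut = rng.integers(1, n_terms)     # crossover on term rows
            child = np.vstack([P[a][:cut], P[b][cut:]])
            if rng.random() < 0.3:             # random single-gene mutation
                i, j = rng.integers(n_terms), rng.integers(n_vars)
                child[i, j] = rng.integers(0, max_exp + 1)
            nxt.append(child)
        P = nxt
    P.sort(key=lambda e: fitness(X, y, e))
    return P[0], fitness(X, y, P[0])

# Demo: approximate y = x0^3 + 2*x0*x1 with a 4-term model.
X = rng.uniform(-1, 1, size=(40, 2))
y = X[:, 0]**3 + 2 * X[:, 0] * X[:, 1]
best, sse = evolve(X, y)
```

The segment-swap mutation operator mentioned in the abstract is omitted here for brevity; it would exchange two row blocks within the same chromosome.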
Several example problems are presented to show the performance and accuracy of the optimal polynomial regression model. The examples consider explicit functions as well as practical engineering problems, such as finite element models, that are approximated using the optimal polynomial regression model. For all examples the optimal polynomial regression model is compared with ordinary regression models. The results show greatly improved performance for optimal polynomial regression models compared with traditional regression models. Furthermore, the optimal polynomial regression model is also beneficial because its derivatives are trivial to compute, which is of great importance for subsequently applied optimization algorithms.
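The point about trivial derivatives can be made concrete: since each term is a coefficient times a product of powers of the design variables, the gradient follows directly from the power rule. The sketch below is illustrative only; the function names are assumptions, not the paper's code.

```python
import numpy as np

def predict(x, coef, exponents):
    # Model value: sum_k coef_k * prod_j x_j**e_kj
    return float(coef @ np.prod(x**exponents, axis=1))

def gradient(x, coef, exponents):
    # Analytic gradient via the power rule:
    # d/dx_i of x_i**e is e * x_i**(e-1).
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = exponents.astype(float).copy()
        factor = e[:, i].copy()                  # the exponent e_ki
        e[:, i] = np.maximum(e[:, i] - 1, 0)     # clamp is safe: factor=0 then
        g[i] = (coef * factor) @ np.prod(x**e, axis=1)
    return g

# Example: f(x) = x0^3 + 2*x0*x1, so grad f = (3*x0^2 + 2*x1, 2*x0).
coef = np.array([1.0, 2.0])
exponents = np.array([[3, 0], [1, 1]])
g = gradient(np.array([2.0, 1.0]), coef, exponents)  # expect (14, 4)
```

This kind of closed-form gradient is what makes the metamodel convenient for gradient-based optimizers applied afterwards.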