Civil-Comp Proceedings
ISSN 1759-3433 CCP: 93
PROCEEDINGS OF THE TENTH INTERNATIONAL CONFERENCE ON COMPUTATIONAL STRUCTURES TECHNOLOGY Edited by:
Paper 174
Propagation of Uncertainties using the Method of Moments M. Beckers^{1} and U. Naumann^{2}
^{1}German Research School for Simulation Sciences, Jülich, Germany
M. Beckers, U. Naumann, "Propagation of Uncertainties using the Method of Moments", in , (Editors), "Proceedings of the Tenth International Conference on Computational Structures Technology", Civil-Comp Press, Stirlingshire, UK, Paper 174, 2010. doi:10.4203/ccp.93.174
Keywords: uncertainty propagation, algorithmic differentiation, Taylor series, stochastic moments, higher-order derivatives, validation of algorithms.
Summary
With the growth of computer simulation, propagating the uncertainties contained in such, possibly very complex, codes is becoming increasingly important. These uncertainties often stem from measurements that serve as inputs to the calculation, and measurements in practice are almost never exact. Such imprecision in the inputs of a program representing a function y = F(x) obviously leads to imprecision in its outputs.
The topic of this paper is the quantification of these output uncertainties based on our knowledge of the uncertainties present in the inputs. We therefore model the input values as a random variable X, which induces an output random variable Y = F(X). Based on our knowledge of the distribution of X, we approximate the mean and variance of Y. Our approach is the method of moments [1], which uses a Taylor expansion of the function F about the mean of the input. Because of the Taylor expansion, derivatives of F enter the computations.

Since F is given by a computer program, algorithmic differentiation (AD) [2] is used for the differentiation part. AD allows the differentiation of programs with machine accuracy, in contrast to finite differences. An introduction to AD is included in the paper. Prior work on the application of AD to uncertainty propagation includes [3,4,5].

We want to use this approach in the context of optimization algorithms, for example Newton's method [6]. A method is developed that allows the differentiation of a Newton optimization in the context of a one-dimensional regression problem. It breaks the computation of the sensitivities of the whole optimization, from input to output, down to an iterative computation of simpler derivatives that are combined using the chain rule of differential calculus. Based on these calculations, the variance contained in the result of the optimization can be approximated; this variance can be interpreted as a measure of the reliability of the result obtained by Newton's method. The results of this paper form the basis for an extension of the approach to higher-dimensional regression and alternative optimization algorithms.
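The method-of-moments approximations referenced above are standard; for a scalar input X with mean μ_X and variance σ_X², the usual textbook forms (stated here for orientation, not quoted from the paper) are:

```latex
\mathbb{E}[Y] \approx F(\mu_X) + \tfrac{1}{2} F''(\mu_X)\,\sigma_X^2,
\qquad
\mathrm{Var}[Y] \approx \big(F'(\mu_X)\big)^2 \sigma_X^2 .
```

Both expressions arise from truncating the Taylor expansion of F about μ_X and taking expectations term by term, which is why derivatives of F (obtained here via AD) are required.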
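To illustrate the machine-accuracy claim for AD versus finite differences, here is a minimal forward-mode AD sketch using dual numbers. This is an illustrative toy, not the AD tooling used by the authors; the function F below is an arbitrary example.

```python
import math

class Dual:
    """Number carrying a value and its derivative w.r.t. one chosen input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def sin(self):
        # Chain rule: (sin u)' = cos(u) * u'
        return Dual(math.sin(self.val), math.cos(self.val) * self.dot)

def F(x):
    # Example function F(x) = x^2 + sin(x), usable with floats or Duals.
    return (x * x + x.sin()) if isinstance(x, Dual) else x * x + math.sin(x)

x = Dual(1.5, 1.0)                         # seed: dx/dx = 1
y = F(x)                                   # y.dot is F'(1.5), exact to machine precision
exact = 2 * 1.5 + math.cos(1.5)            # analytic derivative for comparison
fd = (F(1.5 + 1e-6) - F(1.5)) / 1e-6       # finite difference: truncation error ~1e-7
```

The dual-number derivative agrees with the analytic one to machine precision, while the finite-difference estimate carries a truncation error of roughly the step size times F''.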
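The paper's scheme of differentiating a Newton optimization iteration by iteration can be sketched for a tiny one-dimensional regression. All data values below are invented for illustration: the model is y_i ≈ a·x_i, Newton's method minimizes the least-squares objective, and the same update is applied (via the chain rule) to the sensitivities da/dy_j, which then feed a first-order method-of-moments variance estimate.

```python
xs = [1.0, 2.0, 3.0]           # fixed regressors (assumed example data)
ys = [1.1, 1.9, 3.2]           # noisy measurements, the uncertain inputs
sigma2 = 0.01                  # assumed variance of each y_j

# Objective f(a) = 1/2 * sum (a*x_i - y_i)^2, so f''(a) = sum x_i^2 (constant).
S = sum(x * x for x in xs)

a = 0.0                        # Newton iterate
da = [0.0] * len(ys)           # sensitivities da/dy_j, propagated alongside a
for _ in range(5):
    g = sum(x * (a * x - y) for x, y in zip(xs, ys))   # gradient f'(a)
    a = a - g / S                                      # Newton step on the value
    # Same step differentiated w.r.t. each y_j (chain rule):
    # dg/da = S and dg/dy_j = -x_j, so
    # d/dy_j [a - g/S] = da_j - (S*da_j - x_j)/S
    da = [d - (S * d - x) / S for d, x in zip(da, xs)]

# First-order method of moments: Var(a*) ≈ sum_j (da*/dy_j)^2 * sigma2
var_a = sum(d * d for d in da) * sigma2
```

Because the objective here is quadratic, both the iterate and its sensitivities converge immediately (a* = Σx_i y_i / Σx_i², da*/dy_j = x_j / Σx_i²), which makes the iterative chain-rule propagation easy to verify against the closed form.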