Civil-Comp Proceedings
ISSN 1759-3433
CCP: 93
PROCEEDINGS OF THE TENTH INTERNATIONAL CONFERENCE ON COMPUTATIONAL STRUCTURES TECHNOLOGY
Paper 209

Convergence of Successive Regression Approximations for Solving Noisy Equations

I. Deák

Department of Computer Science, Corvinus University of Budapest, Hungary

Full Bibliographic Reference for this paper
I. Deák, "Convergence of Successive Regression Approximations for Solving Noisy Equations", in "Proceedings of the Tenth International Conference on Computational Structures Technology", Civil-Comp Press, Stirlingshire, UK, Paper 209, 2010. doi:10.4203/ccp.93.209
Keywords: equation solving, noisy functions, convergence, optimization.

Summary
The basic problem we consider in this paper is finding the root of a one-dimensional function when only noisy function values are available. The iterative method we developed to solve this problem is called successive regression approximations (SRA), and it produces a sequence of points converging to the root. As the name indicates, the method relies on a series of regression approximations that are updated at each step of the iterative procedure. According to extensive numerical computations [1,2], the n-dimensional version of the method proved quite successful in solving different types of stochastic programming problems, including problems with probability constraints, two-stage problems and mixed stochastic problems.
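The one-dimensional iteration described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes a linear regression model fitted by ordinary least squares to all previously observed points, with the root of the fitted line taken as the next evaluation point; the function `sra_root` and its parameters are hypothetical names chosen for the sketch.

```python
import random

def sra_root(noisy_f, x0, x1, n_iter=200):
    """Sketch of 1-D successive regression approximations (SRA):
    at each step a regression line is fitted to ALL points observed
    so far, and its root becomes the next evaluation point."""
    xs = [x0, x1]
    ys = [noisy_f(x0), noisy_f(x1)]
    for _ in range(n_iter):
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        # Ordinary least-squares slope b and intercept a of y = a + b*x.
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx
        a = my - b * mx
        x_next = -a / b  # root of the fitted line
        xs.append(x_next)
        ys.append(noisy_f(x_next))
    return xs[-1]

# Example: root of f(x) = x - 2, observed with additive Gaussian noise.
random.seed(0)
f = lambda x: (x - 2.0) + random.gauss(0.0, 0.1)
est = sra_root(f, 0.0, 1.0)
```

Because every past observation enters the fit, the noise is averaged out as the iteration proceeds, which is the intuition behind the convergence result proved in the paper.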

The convergence of the basic method for solving one-dimensional root-finding problems when exact function values are available was proven in [3]. In this paper the stochastic convergence of the point sequence produced by SRA is proved in the case of noisy function values.

First we show that the step length (the distance between consecutive points) in our algorithm converges to zero, no matter what kind of point sequence the algorithm produces. Then we use a simple random walk (with discrete parameter space and continuous state space) to construct two composite random processes bracketing the point sequence generated by our algorithm. Since the two composite random processes are ergodic, the bracketed process possesses this property as well, from which the stochastic convergence of the SRA point sequence to the root follows easily.

Successively applying regression functions is a natural extension of the celebrated result of Robbins and Monro [4]. SRA is also a simple extension of the so-called Response Surface Methodology [5]: we use all previous points and function values. The main conclusion of this paper is that using successively updated regression approximations leads to a convergent sequence of points.
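For comparison, the classical Robbins-Monro iteration [4] uses only the current point and a decreasing step size, rather than a regression over all past observations. A minimal sketch, with the standard choice of step sizes a_n = 1/n (so that the sum of a_n diverges while the sum of a_n squared converges); the function name and test problem are chosen for illustration only.

```python
import random

def robbins_monro(noisy_f, x0, n_iter=5000):
    """Classic Robbins-Monro stochastic approximation:
    x_{n+1} = x_n - a_n * Y_n, where Y_n is a noisy observation
    of f(x_n) and a_n = 1/n is the step-size sequence."""
    x = x0
    for n in range(1, n_iter + 1):
        y = noisy_f(x)
        x = x - y / n  # only the latest observation is used
    return x

# Same test problem: root of f(x) = x - 2 under additive noise.
random.seed(1)
f = lambda x: (x - 2.0) + random.gauss(0.0, 0.1)
est = robbins_monro(f, 0.0)
```

The contrast makes SRA's design choice visible: Robbins-Monro discards past observations and damps the noise through its step sizes, while SRA retains all points and lets the regression average the noise out.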

References
1
I. Deák, "Solving stochastic programming problems by successive regression approximations - Numerical results", in K. Marti, Y. Ermoliev, G. Pflug, (Editors), "Dynamic Stochastic Optimization", Springer, Lecture Notes in Economics and Mathematical Systems, 532, 209-224, 2003.
2
I. Deák, "Testing successive regression approximations by large-scale two-stage problems", Annals of Operations Research (Proceedings of the Xth International Conference on Stochastic Programming, Tucson, AZ, 2004; to appear), 2010.
3
I. Deák, "Successive regression approximations for solving equations", Pure Mathematics and Applications, 12, 25-50, 2001.
4
H. Robbins, S. Monro, "A stochastic approximation method", Ann. Math. Stat., 22, 400-407, 1951. doi:10.1214/aoms/1177729586
5
G.E.P. Box, N.R. Draper, "Empirical Model-Building and Response Surfaces", John Wiley and Sons, 1987.
