Civil-Comp Proceedings
ISSN 1759-3433, CCP: 84
PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY
Edited by: B.H.V. Topping, G. Montero and R. Montenegro
Paper 89
Inexact Constraint Preconditioners for Optimization Problems
L. Bergamaschi^{1}, M. Venturin^{2} and G. Zilli^{1}
^{1}Department of Mathematical Methods and Models for Scientific Applications,
L. Bergamaschi, M. Venturin, G. Zilli, "Inexact Constraint Preconditioners for Optimization Problems", in B.H.V. Topping, G. Montero, R. Montenegro (Editors), "Proceedings of the Fifth International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 89, 2006. doi:10.4203/ccp.84.89
Keywords: interior-point methods, iterative solvers, preconditioners, approximate Jacobian.
Summary
In this paper we are concerned with the solution of large-scale minimization problems subject to equality constraints via interior point methods with iterative solvers. In this setting, each iteration of an interior point method requires the solution of a linear system of the form

\[
\begin{pmatrix} H & A^T \\ A & 0 \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta y \end{pmatrix}
=
\begin{pmatrix} b_1 \\ b_2 \end{pmatrix},
\qquad (1)
\]

where $H$ is the Hessian of the Lagrangian and $A$ is the Jacobian of the constraints. The matrix $H$ arising in interior point applications has the form $H = Q + \Theta^{-1}$, where $\Theta^{-1}$ is a diagonal scaling matrix with strictly positive elements (due to the barrier terms for the primal variables). For linear programs $Q = 0$. In general we assume that $Q$ is positive semidefinite (which implies that $H$ is positive definite) and that $A$ has full row rank. There has been growing interest in recent years in the use of iterative methods to solve the linear system (1). Although these methods involve significantly sparser factorizations than those used in direct approaches, they still capture most of the numerical properties of the system, while requiring fewer memory resources.
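As a small illustration of the saddle-point system above, the sketch below assembles and solves it for a tiny dense example; the dimensions, random data, and seed are assumptions of the sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3                      # primal variables, equality constraints

# H = Q + Theta^{-1}: positive semidefinite Q plus a positive diagonal barrier term
B = rng.standard_normal((n, n))
Q = B.T @ B                      # positive semidefinite
Theta_inv = np.diag(rng.uniform(0.5, 2.0, n))
H = Q + Theta_inv                # positive definite

A = rng.standard_normal((m, n))  # full row rank with probability one

# KKT matrix M = [[H, A^T], [A, 0]]
M = np.block([[H, A.T], [A, np.zeros((m, m))]])

rhs = rng.standard_normal(n + m)
sol = np.linalg.solve(M, rhs)
dx, dy = sol[:n], sol[n:]

# the solution satisfies both block equations
assert np.allclose(H @ dx + A.T @ dy, rhs[:n])
assert np.allclose(A @ dx, rhs[n:])
```

For realistic problem sizes the blocks are sparse and the dense solve above is exactly what one wants to avoid, which motivates the preconditioned iterative methods discussed next.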
A variety of preconditioners have been proposed for such matrices. They share the common feature of constructing an approximation to the matrix of (1) by simplifying its upper left block, namely by applying a preconditioner of the form

\[
P = \begin{pmatrix} D & A^T \\ A & 0 \end{pmatrix},
\]

where $D$ is a sparse symmetric positive definite approximation to $H$. This is a well studied problem with many possibilities [2,4]. Usual choices for the matrix $D$ are block-diagonal or diagonal. The latter situation has been studied, for example, in [2], where $D = \mathrm{diag}(H)$. The Hessian matrix cannot be approximated by anything simpler than a diagonal matrix; consequently, the factorization of $P$ (with a diagonal $D$) yields the least expensive constraint preconditioner among those which "respect" the constraints of the optimization problem. However, in certain situations such a preconditioner is still too expensive to compute. In this paper we look for a further simplification of the preconditioner: we replace the Jacobian matrix $A$ with a (sparse) approximation $\tilde{A}$. In the same direction, incomplete preconditioners are also used [3,1].
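The clustering produced by an exact constraint preconditioner can be observed numerically. The sketch below (invented data, dense linear algebra for simplicity only) builds the preconditioner with a diagonal $D = \mathrm{diag}(H)$ and counts the eigenvalues of the preconditioned matrix that equal one; the classical result for exact constraint preconditioners is that at least $2m$ of them do.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 3
B = rng.standard_normal((n, n))
H = B.T @ B + np.diag(rng.uniform(0.5, 2.0, n))    # SPD Hessian block
A = rng.standard_normal((m, n))

M = np.block([[H, A.T], [A, np.zeros((m, m))]])

D = np.diag(np.diag(H))                             # diagonal approximation of H
P = np.block([[D, A.T], [A, np.zeros((m, m))]])     # exact constraint preconditioner

eig = np.sort(np.linalg.eigvals(np.linalg.solve(P, M)).real)

# at least 2m eigenvalues of P^{-1}M cluster at 1 (loose tolerance because
# the unit eigenvalue comes with Jordan blocks and splits slightly in floating point)
num_unit = int(np.sum(np.isclose(eig, 1.0, atol=1e-4)))
assert num_unit >= 2 * m
```

In an actual interior point code $P$ would of course be applied through a sparse factorization rather than formed and inverted densely.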
We propose to use a preconditioner of the form

\[
\tilde{P} = \begin{pmatrix} D & \tilde{A}^T \\ \tilde{A} & 0 \end{pmatrix},
\]

where $D$ is an approximation of $H$ and $\tilde{A}$ is an approximation of $A$. We extend the analysis of [2,1] to this case and provide a complete characterisation of the spectrum of the preconditioned matrix $\tilde{P}^{-1}M$. Moreover, we give a sufficient condition on the norm of the dropping matrix $E = A - \tilde{A}$ to guarantee that $\tilde{A}$ has maximum rank. Dropping some of the elements of the Jacobian matrix $A$ produces a significant reduction of the fill-in of the Cholesky factor of the preconditioner, thus reducing the cost of a single iteration of the Krylov subspace method of choice. The spectral analysis of the preconditioned matrix reveals that a large number of its eigenvalues are either equal to one or positive and bounded by the eigenvalues of $D^{-1}H$. The distance of the remaining eigenvalues from unity is proven to be bounded in terms of the norm of the dropping matrix $E$. Numerical results for a number of large quadratic problems demonstrate that the new approach is an attractive alternative to the direct approach and to exact constraint preconditioners.

References
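A toy version of the inexact variant can be sketched by dropping small entries of $A$; the threshold rule, data, and sizes below are illustrative assumptions, not the paper's dropping strategy. The sketch forms the dropping matrix $E$, checks that the sparsified Jacobian keeps full row rank, and reports how far the eigenvalues of the inexactly preconditioned matrix move from unity.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 8, 3
B = rng.standard_normal((n, n))
H = B.T @ B + np.diag(rng.uniform(0.5, 2.0, n))
A = rng.standard_normal((m, n))
M = np.block([[H, A.T], [A, np.zeros((m, m))]])

D = np.diag(np.diag(H))

# sparsify the Jacobian: drop entries below a threshold (hypothetical rule)
tau = 0.25
A_tilde = np.where(np.abs(A) >= tau, A, 0.0)
E = A - A_tilde                                   # dropping matrix

assert np.linalg.matrix_rank(A_tilde) == m        # A_tilde keeps full row rank here

P_tilde = np.block([[D, A_tilde.T], [A_tilde, np.zeros((m, m))]])
eig = np.linalg.eigvals(np.linalg.solve(P_tilde, M))

# eigenvalues are no longer exactly one, but their distance from unity
# is controlled by the norm of the dropping matrix E
spread = float(np.max(np.abs(eig - 1.0)))
print("||E|| =", np.linalg.norm(E), " max |lambda - 1| =", spread)
```

Shrinking the threshold `tau` toward zero shrinks `E` and drives the spectrum back toward the exact constraint preconditioner's clustering at one.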
