
Civil-Comp Proceedings
ISSN 1759-3433, CCP: 94
PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY Edited by: B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru and M.L. Romero
Paper 34
Inexact Jacobian Constraint Preconditioners in Optimization
L. Bergamaschi^{1}, M. Venturin^{2} and G. Zilli^{1}
^{1}Department of Mathematical Methods and Models for Scientific Applications, University of Padua, Italy
L. Bergamaschi, M. Venturin, G. Zilli, "Inexact Jacobian Constraint Preconditioners in Optimization", in B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru, M.L. Romero, (Editors), "Proceedings of the Seventh International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 34, 2010. doi:10.4203/ccp.94.34
Keywords: interior-point methods, iterative solvers, preconditioners, approximate Jacobian.
Summary
In this paper we are concerned with the solution of large-scale minimization problems subject to equality constraints via interior-point methods with iterative solvers. For such problems, each iteration k of an interior-point method requires the solution of an indefinite linear system of equations of saddle-point type.
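This indefinite system has the standard saddle-point block structure common to equality-constrained interior-point methods; the right-hand-side symbols below are illustrative, not taken from the paper:

```latex
\begin{pmatrix} H & A^{T} \\ A & 0 \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta \lambda \end{pmatrix}
=
\begin{pmatrix} b \\ c \end{pmatrix}
```

Here H is the Hessian of the Lagrangian, A is the Jacobian of the equality constraints, and (Δx, Δλ) are the Newton corrections to the primal variables and the Lagrange multipliers.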
Analogous to the well-known constraint preconditioners, we propose to solve these linear systems using Krylov subspace methods accelerated by preconditioners having the same block structure as the original matrix. Usually in constraint preconditioners only the (1,1) block (the Hessian of the Lagrangian) is approximated, while the other blocks are kept unchanged. In this paper we propose to approximate the (1,1) block by its diagonal and also to approximate the off-diagonal blocks (namely the Jacobian matrix and its transpose). This idea follows that reported in [1,2], where the Jacobian matrix is sparsified by a drop-tolerance-based technique. Here, instead, we compute the Jacobian approximation selectively, dynamically dropping elements from the i-th row of the Jacobian matrix depending on the magnitude of the corresponding element of the diagonal matrix approximating the Hessian.

Dropping some of the elements of the Jacobian matrix A produces a significant reduction of the fill-in in the Cholesky factor of the preconditioner, thus reducing the cost of a single iteration of the Krylov subspace method of choice. A spectral analysis shows that the eigenvalues of the preconditioned matrix are bounded in terms of the norms of two matrices that measure the error introduced by the matrix approximations. Numerical results for a number of large quadratic problems demonstrate that the new approach is an attractive alternative to the direct approach and to exact constraint preconditioners.
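The construction described above can be sketched as follows. This is a minimal dense NumPy illustration, not the authors' implementation: the Hessian H is replaced by its diagonal D, and entries of the Jacobian A are dropped row by row using a tolerance scaled by the corresponding diagonal entries of D. The specific dropping criterion (comparing |A[i,j]| against tau * sqrt(D[j,j])) is an assumed rule for illustration; the paper's exact dynamic criterion may differ.

```python
import numpy as np

def inexact_constraint_preconditioner(H, A, tau=0.1):
    """Build the inexact constraint preconditioner [[D, At^T], [At, 0]].

    H   : (n, n) Hessian of the Lagrangian (dense, for illustration)
    A   : (m, n) Jacobian of the equality constraints
    tau : drop tolerance (hypothetical scaling rule, see lead-in)
    """
    d = np.diag(H)                     # diagonal approximation of the Hessian
    D = np.diag(d)
    A_tilde = A.copy()
    # Drop A[i, j] when it is small relative to the Hessian diagonal entry d[j]
    # (one possible realization of "dropping depending on the magnitude of the
    # corresponding diagonal element"; not the authors' exact criterion).
    for i in range(A.shape[0]):
        mask = np.abs(A[i, :]) < tau * np.sqrt(d)
        A_tilde[i, mask] = 0.0
    m = A.shape[0]
    # Assemble the block preconditioner with the sparsified Jacobian.
    P = np.block([[D, A_tilde.T],
                  [A_tilde, np.zeros((m, m))]])
    return P, A_tilde
```

In practice one would store A_tilde in sparse format and apply P via a (sparse) factorization rather than forming the dense block matrix; the point of the sparsification is precisely to reduce fill-in in that factorization.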
