PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY
Edited by: B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru and M.L. Romero
Inexact Jacobian Constraint Preconditioners in Optimization
L. Bergamaschi1, M. Venturin2 and G. Zilli1
1Department of Mathematical Methods and Models for Scientific Applications, University of Padua, Italy
L. Bergamaschi, M. Venturin, G. Zilli, "Inexact Jacobian Constraint Preconditioners in Optimization", in B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru, M.L. Romero, (Editors), "Proceedings of the Seventh International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 34, 2010. doi:10.4203/ccp.94.34
Keywords: interior-point methods, iterative solvers, preconditioners, approximate Jacobian.
In this paper we are concerned with the solution of large-scale minimization problems subject to equality constraints via interior point methods with iterative solvers. Each iteration k of an interior point method requires the solution of an indefinite linear system of equations of saddle point type.
In analogy with the well-known constraint preconditioners, we propose to solve these linear systems by Krylov subspace methods accelerated with preconditioners having the same block structure as the original matrix. In the usual constraint preconditioners only the (1,1) block (the Hessian of the Lagrangian) is approximated, while the other blocks are kept unchanged.
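In standard saddle point notation (a sketch in our own symbols, not necessarily the paper's), the system matrix and a constraint preconditioner share the block form:

```latex
\mathcal{K} =
\begin{pmatrix}
H & A^{T} \\
A & 0
\end{pmatrix},
\qquad
\mathcal{P} =
\begin{pmatrix}
G & A^{T} \\
A & 0
\end{pmatrix},
\qquad G \approx H,
```

where H is the Hessian of the Lagrangian and A the Jacobian of the constraints; in the classical constraint preconditioner only G differs from H, with the off-diagonal blocks left exact.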
In this paper we propose to approximate the (1,1) block (the Hessian of the Lagrangian) by its diagonal, and also to approximate the off-diagonal blocks, namely the Jacobian matrix and its transpose. This idea follows that of [1,2], where the Jacobian matrix is sparsified by a drop-tolerance-based technique. Here, instead, we compute the Jacobian approximation selectively, dynamically dropping elements from the i-th row of the Jacobian depending on the magnitude of the corresponding element of the diagonal matrix approximating the Hessian.
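The selective dropping can be sketched as follows. This is a minimal illustration, not the authors' exact criterion: the function name, the threshold `tau`, and the rule "drop row i when the corresponding diagonal Hessian entry is large" are assumptions made for the example.

```python
import numpy as np
from scipy.sparse import csr_matrix


def drop_jacobian_rows(A, d, tau):
    """Hypothetical selective-dropping sketch (rule and names are assumptions).

    A   : Jacobian matrix (dense array or sparse), rows indexed consistently
          with the diagonal Hessian approximation d.
    d   : 1-D array, diagonal of the matrix approximating the Hessian.
    tau : drop threshold (assumed parameter).

    All entries in row i are dropped when d[i] exceeds tau, mimicking the
    idea of removing Jacobian elements whose variables are strongly damped
    by the diagonal (1,1) block.
    """
    A = csr_matrix(A, copy=True)
    for i in range(A.shape[0]):
        if d[i] > tau:
            # Zero out the stored entries of row i in CSR storage.
            A.data[A.indptr[i]:A.indptr[i + 1]] = 0.0
    A.eliminate_zeros()  # actually remove the zeroed entries
    return A
```

The sparser approximation replaces the exact Jacobian block inside the preconditioner only; the Krylov method still operates on the original saddle point matrix.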
Dropping some of the elements of the Jacobian matrix A significantly reduces the fill-in of the Cholesky factor used to apply the preconditioner, thus lowering the cost of each iteration of the chosen Krylov subspace method.
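Why a sparser Jacobian approximation reduces factorization cost can be seen from how such preconditioners are typically applied (a sketch under standard assumptions, with D the diagonal Hessian approximation and Ã the sparsified Jacobian): each application of the preconditioner reduces to a solve with the Schur complement

```latex
S = \tilde{A}\, D^{-1} \tilde{A}^{T},
```

whose Cholesky factor inherits its sparsity pattern from Ã, so fewer nonzeros in à generally mean fewer fill-in entries in the factor.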
A spectral analysis is performed, showing that the eigenvalues of the preconditioned matrix are bounded in terms of the norms of two matrices that measure the error introduced by the approximations.
Numerical results on a number of large quadratic problems demonstrate that the new approach is an attractive alternative to the direct approach and to exact constraint preconditioners.