Civil-Comp Proceedings
ISSN 1759-3433
CCP: 94
PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY
Edited by: B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru and M.L. Romero
Paper 34

Inexact Jacobian Constraint Preconditioners in Optimization

L. Bergamaschi1, M. Venturin2 and G. Zilli1

1Department of Mathematical Methods and Models for Scientific Applications, University of Padua, Italy
2Department of Computer Science, University of Verona, Italy

Full Bibliographic Reference for this paper
L. Bergamaschi, M. Venturin, G. Zilli, "Inexact Jacobian Constraint Preconditioners in Optimization", in B.H.V. Topping, J.M. Adam, F.J. Pallarés, R. Bru, M.L. Romero, (Editors), "Proceedings of the Seventh International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 34, 2010. doi:10.4203/ccp.94.34
Keywords: interior-point methods, iterative solvers, preconditioners, approximate Jacobian.

Summary
In this paper we are concerned with the solution of large-scale minimization problems subject to equality constraints via interior point methods with iterative solvers. For such problems, each iteration k of an interior point method requires the solution of an indefinite linear system of equations of saddle point type.
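In the notation commonly used for such methods (the symbols here are generic, not taken from the paper), the system at iteration k has the block form

```latex
\begin{bmatrix} H_k & A_k^{T} \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} \Delta x \\ \Delta \lambda \end{bmatrix}
=
\begin{bmatrix} b_1 \\ b_2 \end{bmatrix}
```

where H_k denotes the Hessian of the Lagrangian and A_k the Jacobian of the equality constraints.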

Analogously to the well-known constraint preconditioners, we propose to solve these linear systems with Krylov subspace methods accelerated by preconditioners that have the same block structure as the original matrix. In the usual constraint preconditioners, only the (1,1) block (the Hessian of the Lagrangian) is approximated, while the other blocks are kept unchanged.
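A minimal dense sketch of this block structure, using NumPy on a small random problem (the matrices and sizes here are illustrative assumptions, not data from the paper): the saddle-point matrix K keeps the Hessian H, while the constraint preconditioner P replaces it with its diagonal but keeps the Jacobian A exact. A standard property of constraint preconditioners with an exact Jacobian is that the preconditioned matrix has the eigenvalue 1 with algebraic multiplicity at least 2m, where m is the number of constraints.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 2                                  # variables, equality constraints

Q = rng.standard_normal((n, n))
H = Q @ Q.T + n * np.eye(n)                  # SPD stand-in for the Hessian
A = rng.standard_normal((m, n))              # full-rank constraint Jacobian
Z = np.zeros((m, m))

K = np.block([[H, A.T], [A, Z]])             # saddle-point matrix
D = np.diag(np.diag(H))                      # diagonal approximation of H
P = np.block([[D, A.T], [A, Z]])             # constraint preconditioner

# Count eigenvalues of P^{-1} K clustered at 1: at least 2*m of them.
eigs = np.linalg.eigvals(np.linalg.solve(P, K))
n_unit = int(np.sum(np.abs(eigs - 1.0) < 1e-4))
print(n_unit)
```

The remaining n - m eigenvalues depend on how well D approximates H on the null space of A, which is what the dropping strategy below must not destroy.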

In this paper we propose to approximate the (1,1) block (the Hessian of the Lagrangian) by its diagonal and also to approximate the off-diagonal blocks (namely the Jacobian matrix and its transpose). This idea follows that reported in [1,2], where the Jacobian matrix is sparsified by a drop-tolerance-based technique. Here, instead, we compute the Jacobian approximation selectively, dynamically dropping elements from the ith row of the Jacobian depending on the magnitude of the corresponding element of the diagonal matrix approximating the Hessian.
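One plausible reading of this dropping rule can be sketched as follows; the precise criterion is given in the paper, and the scaling by the square root of the diagonal, the column-wise indexing, and the tolerance `tol` are all illustrative assumptions. The intuition is that Jacobian entries paired with large diagonal entries of the Hessian approximation contribute little to the preconditioner and can be dropped more aggressively.

```python
import numpy as np

def drop_jacobian(A, d, tol=1e-2):
    """Return a sparsified copy of the Jacobian A (m x n).

    Illustrative rule (an assumption, not the paper's exact criterion):
    A[i, j] is zeroed when |A[i, j]| <= tol * sqrt(d[j]), where d holds
    the diagonal approximation of the Hessian and tol is a hypothetical
    tuning parameter.
    """
    A_tilde = A.copy()
    mask = np.abs(A_tilde) <= tol * np.sqrt(d)[np.newaxis, :]
    A_tilde[mask] = 0.0
    return A_tilde

# Toy data: the larger d[j] is, the more entries in column j are dropped.
d = np.array([1.0, 100.0, 1e4])
A = np.array([[0.5, 0.5, 0.5],
              [2.0, 0.05, 3.0]])
A_tilde = drop_jacobian(A, d, tol=0.1)
print(A_tilde)
```

With these toy values only the first column survives, since the thresholds tol * sqrt(d) grow with the diagonal entries.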

Dropping some of the elements of the Jacobian matrix A significantly reduces the fill-in of the Cholesky factor of the preconditioner, thus reducing the cost of a single iteration of the chosen Krylov subspace method.

A spectral analysis is performed, showing that the eigenvalues of the preconditioned matrix are bounded in terms of the norms of two matrices that measure the error introduced by the matrix approximations.

Some numerical results for a number of large quadratic problems demonstrate that the new approach is an attractive alternative to the direct approach and to exact constraint preconditioners.

References
[1] L. Bergamaschi, J. Gondzio, M. Venturin, G. Zilli, "Inexact constraint preconditioners for linear systems arising in interior point methods", Comput. Optim. Appl., 36, 136-147, 2007. doi:10.1007/s10589-006-9001-0
[2] L. Bergamaschi, M. Venturin, G. Zilli, "Inexact Constraint Preconditioners for Optimization Problems", in B.H.V. Topping, G. Montero, R. Montenegro, (Editors), "Proceedings of the Fifth International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 89, 2006. doi:10.4203/ccp.84.89
