| |
Methods defined here:
- __init__(self, g, **kwargs)
Create a new instance of a PyGltrContext object, representing a
context in which to solve the quadratic trust-region subproblem
    min   <g, d> + 1/2 <d, Hd>
    s.t.  ||d|| <= radius,
where either the Hessian matrix H, or a means to compute
matrix-vector products with H, is to be specified later.
Keyword arguments accepted at initialization are
g the gradient vector
radius the trust-region radius (default: 1.0)
stop_rel the relative stopping tolerance (default: sqrt( eps ))
stop_abs the absolute stopping tolerance (default: 0.0)
prec a function solving preconditioning systems.
If M is a preconditioner, prec(v) returns a solution
to the linear system of equations Mx = v (default: None)
itmax the maximum number of iterations (default: n)
litmax the maximum number of Lanczos iterations on the boundary
(default: n)
ST whether to use the Steihaug-Toint strategy (default: False)
boundary Indicates whether the solution is thought to lie on
the boundary of the trust region (default: False)
equality Require that the solution lie on the boundary (default: False)
fraction Fraction of optimality that is acceptable. A value smaller
than 1.0 results in a correspondingly sub-optimal solution.
(default: 1.0)
See the GLTR spec sheet for more information on these parameters.
Convergence of the iteration occurs as soon as
    Norm( Hd + l Md + g ) <= max( Norm( g ) * stop_rel, stop_abs )
where M    is the preconditioner,
      l    is an estimate of the Lagrange multiplier, and
      Norm is the M^{-1}-norm.
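The prec argument above can be illustrated with a minimal sketch. The data below (gradient g, diagonal preconditioner entries) and the helper make_diagonal_prec are hypothetical, chosen only to show the required call signature: prec(v) must return the solution x of M x = v.

```python
# Hypothetical helper: build a prec callable for a diagonal
# preconditioner M = diag(m). Solving M x = v is then just
# element-wise division x[i] = v[i] / m[i].

def make_diagonal_prec(m):
    def prec(v):
        # Return the solution x of M x = v for M = diag(m)
        return [vi / mi for vi, mi in zip(v, m)]
    return prec

g = [1.0, -2.0, 0.5]                     # gradient (hypothetical data)
prec = make_diagonal_prec([2.0, 4.0, 1.0])

prec([2.0, 4.0, 1.0])                    # returns [1.0, 1.0, 1.0]

# The context would then be created along the lines of:
#   context = PyGltrContext(g, radius=0.1, prec=prec, itmax=50)
```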
- explicit_solve(self, H)
- Solves the quadratic trust-region subproblem whose data was
specified upon initialization. During the reverse-communication
phase, matrix-vector products with the Hessian H are computed
explicitly using the matvec method of the object H. For
instance, if H is an ll_mat or csr_mat, products are evaluated
as H.matvec(x, y).
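A small stand-in (not an ll_mat or csr_mat) can illustrate the interface explicit_solve relies on: an object whose matvec(x, y) method writes the product H*x into the output array y. The DenseHessian class and its data below are hypothetical.

```python
# Hypothetical stand-in for a sparse matrix object: all that
# explicit_solve needs is a matvec(x, y) method computing y <- H*x.

class DenseHessian:
    def __init__(self, rows):
        self.rows = rows  # symmetric matrix stored as a list of rows

    def matvec(self, x, y):
        # Overwrite y in place with the product H*x
        for i, row in enumerate(self.rows):
            y[i] = sum(hij * xj for hij, xj in zip(row, x))

H = DenseHessian([[2.0, 0.0], [0.0, 3.0]])
y = [0.0, 0.0]
H.matvec([1.0, 1.0], y)   # y becomes [2.0, 3.0]

# context.explicit_solve(H) would then drive the iteration,
# calling H.matvec as needed.
```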
- implicit_solve(self, hessprod)
- Solves the quadratic trust-region subproblem whose data was
specified upon initialization. During the reverse communication
phase, matrix vector products with the Hessian H will be
computed implicitly using the supplied hessprod method.
Given an array v, hessprod must return an array of the same size
containing the result of the multiplication H*v.
For instance, if the problem is from an Ampl model called nlp,
the hessprod method could be
lambda v: nlp.hprod( z, v )
for some multiplier estimates z.
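The required hessprod signature can be sketched without an Ampl model. The make_hessprod helper and the matrix data below are hypothetical; the point is only that hessprod takes an array v and returns an array of the same size containing H*v.

```python
# Hypothetical hessprod for a fixed symmetric matrix H, playing the
# role that lambda v: nlp.hprod(z, v) plays for an Ampl model.

def make_hessprod(H):
    def hessprod(v):
        # Return H*v as a new array of the same size as v
        return [sum(hij * vj for hij, vj in zip(row, v)) for row in H]
    return hessprod

hessprod = make_hessprod([[4.0, 1.0], [1.0, 3.0]])
hessprod([1.0, 2.0])   # returns [6.0, 7.0]

# context.implicit_solve(hessprod) would then run the iteration,
# calling hessprod whenever a product with H is required.
```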
|