SESOP-TN: Combining Sequential Subspace Optimization with Truncated Newton method
SESOP-TN is a method for very large-scale unconstrained optimization of smooth functions. It combines ideas of Sequential Subspace Optimization (SESOP) [1, 4] with those of the Truncated Newton (TN) method. Replacing the TN line search with subspace optimization allows the Conjugate Gradient (CG) iterations to stay matched across consecutive TN steps. This resolves the problem of TN sensitivity to an early break of the CG process. For example, when the objective function is quadratic, the SESOP-TN trajectory coincides with the trajectory of CG applied directly to the objective; standard TN lacks this property and converges more slowly. Numerical experiments illustrate the effectiveness of the method.
The SESOP-TN MATLAB tool implements the following methods:
Brief description of the SESOP-TN algorithm
We minimize a function f(x) with gradient g(x) and Hessian H(x). Note that the Hessian is never computed explicitly: only Hessian-vector products are used.
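As a minimal illustration of this matrix-free use of the Hessian (a Python/NumPy sketch for exposition; the actual tool is MATLAB), a Hessian-vector product can be approximated by a finite difference of gradients, so H(x) is never formed:

```python
import numpy as np

def hess_vec(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v by a forward finite difference of the
    gradient of f; the Hessian matrix itself is never formed."""
    return (grad(x + eps * v) - grad(x)) / eps

# Check on f(x) = 0.5 x^T A x, whose gradient is A x and Hessian is A.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, 2.0])
v = np.array([1.0, 0.0])
print(hess_vec(grad, x, v))  # close to A @ v = [4, 1]
```

For a quadratic the gradient is linear, so the finite difference is exact up to rounding; for general smooth f it is a first-order approximation, and an analytic Hessian-vector routine can be substituted when available.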
Iteration of SESOP-TN
1. TN step: approximately solve the Newton system H(x)d = -g(x) with several CG iterations.
2. Optimize f(x) over the affine subspace spanned by:
3. Go to Step 1 (TN), feeding the resulting subspace-optimization step into the CG starting condition.
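The loop above can be sketched as follows (an illustrative Python/NumPy version on a small quadratic test problem; here the subspace is spanned by the truncated-CG direction, the current gradient, and the previous step, and all names are placeholders rather than the tool's actual interface):

```python
import numpy as np

def cg_truncated(hv, b, iters):
    """A few CG iterations for H d = b, using only Hessian-vector products hv."""
    d = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    for _ in range(iters):
        if np.linalg.norm(r) < 1e-12:
            break
        Hp = hv(p)
        alpha = (r @ r) / (p @ Hp)
        d = d + alpha * p
        r_new = r - alpha * Hp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x (illustration only).
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
grad = lambda x: A @ x - b

x = np.zeros(3)
prev_step = np.zeros(3)
for k in range(6):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    d = cg_truncated(lambda v: A @ v, -g, iters=2)   # Step 1: truncated TN
    P = np.column_stack([d, g, prev_step])           # Step 2: subspace basis
    # For a quadratic, min over a of f(x + P a) has this closed form:
    a = np.linalg.lstsq(P.T @ A @ P, -P.T @ g, rcond=None)[0]
    prev_step = P @ a
    x = x + prev_step                                # Step 3: next TN round

print(np.linalg.norm(grad(x)))  # gradient norm is tiny after a few iterations
```

For a general smooth f the closed-form inner solve would be replaced by a small-dimensional optimization over the subspace coefficients. On this quadratic the trajectory matches CG applied to f, as stated above.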
This procedure resolves the typical TN problem related to an early break of the CG process. For example, when f(x) is quadratic, the SESOP-TN trajectory is the same as that of CG applied directly to f(x); standard TN converges more slowly in this case.
Note that the subspace optimization in Step 2 can be carried out very efficiently if f(x) = f1(Ax) + f2(x), where the product Ax is computationally heavy and f1(.) and f2(.) are inexpensive (see  for the details).
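The saving can be sketched as follows (Python/NumPy, with f1(u) = 0.5||u||^2 and f2(x) = 0.5*lam*||x||^2 chosen purely for illustration): the heavy products Ax and AP are formed once per outer iteration, after which every evaluation of f(x + Pa) during the inner subspace optimization costs only small matrix-vector products.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 1000, 800, 3             # dimensions are illustrative
A = rng.standard_normal((m, n))    # the "heavy" operator
x = rng.standard_normal(n)
P = rng.standard_normal((n, s))    # subspace basis: s spanning directions

Ax, AP = A @ x, A @ P              # heavy products, computed once

def f_sub(a, lam=0.1):
    """f(x + P a) with f1(u) = 0.5||u||^2, f2(x) = 0.5*lam*||x||^2
    (example choices); no further products with A are needed."""
    u = Ax + AP @ a                # (m,) vector: only an m-by-s product
    xa = x + P @ a                 # (n,) vector: only an n-by-s product
    return 0.5 * (u @ u) + 0.5 * lam * (xa @ xa)
```

The inner optimization over a in R^s therefore touches only the precomputed m-by-s and n-by-s arrays, so even many inner iterations add little cost on top of the single application of A to x and to the s basis vectors.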
My collaborators at various stages of the work on SESOP and its applications