# Numerical Optimization for Large Scale Problems

Implementation and benchmarking of Modified and Truncated Newton methods for second-order optimization.
## Project Description
Final project for the Numerical Optimization for Large Scale Problems course at Politecnico di Torino (January 2026). The project implements and compares two advanced second-order optimization algorithms, benchmarking them on large-scale problems up to dimension N = 100,000.
## Algorithms Implemented
- Modified Newton Method: Uses Modified Cholesky Factorization to ensure descent in non-convex regions.
- Truncated Newton (Newton-CG): Approximates the Newton step with an inner Conjugate Gradient loop, so no exact Hessian factorization is needed.
Both methods are globalized using a Backtracking Line Search strategy.
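The globalization step can be illustrated with a minimal backtracking line search enforcing the Armijo sufficient-decrease condition. This is a generic sketch, not the project's actual implementation; the function name and default constants (`rho=0.5`, `c1=1e-4`) are illustrative choices.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    """Shrink the step length until the Armijo sufficient-decrease condition
    f(x + alpha*p) <= f(x) + c1*alpha*grad_f(x)^T p holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p  # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * p) <= fx + c1 * alpha * slope:
            break
        alpha *= rho
    return alpha
```

The same routine serves both methods: only the search direction `p` changes (modified-Newton step vs. CG-truncated step).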
## Test Problems
- Broyden Tridiagonal Function: Symmetric, pentadiagonal Hessian (bandwidth b=2).
- Banded Trigonometric Function: Diagonal Hessian after algebraic decoupling.
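As a reference point, a vectorized evaluation of one common formulation of the Broyden tridiagonal objective can be sketched as below. The exact formulation and scaling used in the project may differ; this assumes the sum-of-squares form with zero boundary values.

```python
import numpy as np

def broyden_tridiagonal(x):
    """f(x) = sum_i F_i(x)^2 with F_i = (3 - 2*x_i)*x_i - x_{i-1} - 2*x_{i+1} + 1
    and the boundary convention x_0 = x_{n+1} = 0 (one common formulation)."""
    xm = np.concatenate(([0.0], x[:-1]))   # x_{i-1}
    xp = np.concatenate((x[1:], [0.0]))    # x_{i+1}
    F = (3.0 - 2.0 * x) * x - xm - 2.0 * xp + 1.0
    return float(F @ F)
```

Since each residual F_i couples only three consecutive variables, the Hessian of the sum of squares has bandwidth 2, i.e. it is pentadiagonal, which is what makes O(N) banded storage and factorization possible.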
## Key Optimizations
- Sparse banded matrix operations: O(N) cost per iteration instead of the O(N³) of dense factorization.
- Dynamic τ-perturbation to maintain positive-definiteness of Hessian.
- Adaptive forcing sequence ηₖ for convergence near the optimum.
- Graph Coloring (Stride Strategy): the exact Hessian is recovered from a number of finite-difference evaluations that is independent of N (one per color), rather than one per coordinate.
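The τ-perturbation can be sketched as repeated Cholesky attempts on H + τI with τ doubled until the factorization succeeds (the idea behind "Cholesky with added multiple of the identity" in Nocedal & Wright). This dense sketch uses assumed names and constants (`beta`, `max_tries`) and stands in for the project's sparse version.

```python
import numpy as np

def cholesky_with_tau(H, beta=1e-3, max_tries=60):
    """Factor H + tau*I, increasing tau until H + tau*I is positive definite.
    Returns the Cholesky factor L and the perturbation tau actually used."""
    n = H.shape[0]
    min_diag = np.min(np.diag(H))
    tau = 0.0 if min_diag > 0 else -min_diag + beta  # initial guess from the diagonal
    for _ in range(max_tries):
        try:
            L = np.linalg.cholesky(H + tau * np.eye(n))
            return L, tau
        except np.linalg.LinAlgError:
            tau = max(2.0 * tau, beta)  # double the shift and retry
    raise RuntimeError("failed to find a positive-definite perturbation")
```

The returned factor then yields a guaranteed descent direction by solving L Lᵀ p = -∇f(x).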
## Key Findings
- Modified Newton outperforms on highly structured sparsity: the Broyden problem at N = 100,000 is solved in ≈ 0.45 seconds.
- Truncated Newton showed numerical instability on the Trigonometric problem due to negative curvature regions.
- Vectorized finite differences introduced negligible overhead once the increment was tuned to h = 10⁻⁸.
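A vectorized forward-difference gradient can be sketched as a single batched evaluation over all coordinate perturbations. This is an illustrative helper, not the project's code; it assumes `f_batch` accepts an (m, n) array of points and returns an (m,) array of values.

```python
import numpy as np

def fd_gradient(f_batch, x, h=1e-8):
    """Forward-difference gradient: build all n perturbed points at once
    (row i of X bumps coordinate i by h) and evaluate them in one batched call."""
    n = x.size
    X = x[None, :] + h * np.eye(n)
    return (f_batch(X) - f_batch(x[None, :])) / h
```

Batching the n perturbed evaluations is what keeps the finite-difference overhead negligible compared with a Python-level loop over coordinates.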
## Technologies Used

- Language: Python 3
- Libraries: NumPy, SciPy (sparse, linalg), Matplotlib
- Techniques: Modified Cholesky Factorization, Conjugate Gradient, Backtracking Line Search, Graph Coloring, Finite Differences
Authors: Lucio Baiocchi (s360244) and Leonardo Passafiume (s358616), Politecnico di Torino, January 2026.

Repository: GitHub