BFGS quasi-Newton optimizer
Finds the optimum using the BFGS algorithm, which builds an approximation to the inverse Hessian from gradient information alone. It is more efficient than Newton-Raphson when computing the exact Hessian is expensive.
Usage
bfgs(
  objective_fn,
  params,
  max_iter = 1000,
  tol = 1e-06,
  maximize = FALSE,
  line_search_fn = NULL,
  verbose = 0
)
Arguments
- objective_fn
A function that takes a list of value objects and returns a scalar.
- params
List of value objects giving the initial parameter values.
- max_iter
Maximum number of iterations; defaults to 1000.
- tol
Convergence tolerance on the gradient norm; defaults to 1e-6.
- maximize
If TRUE, maximize the objective; if FALSE (the default), minimize it.
- line_search_fn
Line search function. If NULL (the default), a backtracking line search is used.
- verbose
If positive, print progress every verbose iterations; 0 (the default) is silent.
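Details
BFGS maintains an approximation \(H_k\) to the inverse Hessian, updated at each step from gradient differences rather than second derivatives. For reference (the exact variant used here, e.g. damping or initial scaling, is not specified on this page), the standard update is

$$
H_{k+1} = \left(I - \rho_k s_k y_k^\top\right) H_k \left(I - \rho_k y_k s_k^\top\right) + \rho_k s_k s_k^\top,
\qquad \rho_k = \frac{1}{y_k^\top s_k},
$$

where \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\). Each search direction is \(p_k = -H_k \nabla f(x_k)\), with the step size along \(p_k\) chosen by the line search.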
Value
A list containing:
- params
List of value objects at the optimum
- value
Objective function value at the optimum
- gradient
Gradient at the optimum
- inv_hessian
Approximate inverse Hessian at the optimum
- iterations
Number of iterations performed
- converged
TRUE if the gradient norm fell below tol within max_iter iterations
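Examples
A minimal sketch of a call on a two-parameter quadratic. The value() constructor used below is hypothetical (this page does not document how value objects are created, or that arithmetic on them is overloaded); adapt it to the package's actual constructor.

# Minimize f(x, y) = (x - 3)^2 + 2 * (y + 1)^2; the optimum is at (3, -1).
objective <- function(p) {
  (p$x - 3)^2 + 2 * (p$y + 1)^2
}

# value() is a hypothetical constructor for the package's value objects.
fit <- bfgs(
  objective_fn = objective,
  params = list(x = value(0), y = value(0)),
  tol = 1e-8,
  verbose = 100  # report progress every 100 iterations
)

fit$converged  # TRUE once the gradient norm drops below tol
fit$params     # list of value objects near x = 3, y = -1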