General-purpose optimization algorithms.
- Steepest descent with a line search satisfying the Wolfe conditions (see the sketch at the end of this section);
- Newton direction (possibly with Hessian modification);
- Trust region;
- Conjugate gradient (Fletcher-Reeves);
- Quasi-Newton method (BFGS);
- Quasi-Newton method (SR1 with trust region);
- Newton method with conjugate gradient (Newton-CG).
PS: More algorithms are expected to be added.
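As a minimal illustration of the first item, the sketch below shows steepest descent driven by a Wolfe-condition line search, using SciPy's `line_search` helper. The function names, parameters, and the quadratic test problem are illustrative assumptions, not the implementation in this repository.

```python
import numpy as np
from scipy.optimize import line_search

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f by steepest descent with a Wolfe-condition line search.

    Illustrative sketch only; signature and defaults are assumptions.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop at (approximate) stationarity
            break
        p = -g                           # steepest-descent direction
        # SciPy's line_search enforces the strong Wolfe conditions.
        alpha = line_search(f, grad, x, p)[0]
        if alpha is None:                # line search failed: take a small fixed step
            alpha = 1e-3
        x = x + alpha * p
    return x

if __name__ == "__main__":
    # Convex quadratic test problem: the minimizer is A^{-1} b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(steepest_descent(f, grad, np.zeros(2)))  # approx. [0.2, 0.4]
```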