Newtonian Method (Optimizing Two Variable Functions)

This submission implements Newton's method for optimizing two-variable functions.
Updated 13 Mar 2017
Newton's method uses information from the Hessian and the gradient, i.e. convexity and slope, to compute optimum points. For quadratic functions it reaches the optimum in a single Newton step (one to two iterations), which is even faster than the conjugate gradient method; this can be verified by comparing the results with the Conjugate Gradient algorithm I posted previously. However, for higher-order or non-quadratic functions the method may diverge or converge to a non-minimum stationary point such as a saddle point or a maximum. To guarantee convergence to a minimum, preconditioners are often used: they limit the step size, which increases the number of iterations but ensures that the solution is a minimum.
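As an illustration, here is a minimal sketch of a Newton iteration on a simple two-variable quadratic. The test function, starting point, tolerance, and variable names are illustrative assumptions and are not taken from the submitted m-file.

    % Illustrative quadratic test function f(x,y) = (x-1)^2 + 2*(y+2)^2
    f    = @(v) (v(1)-1)^2 + 2*(v(2)+2)^2;
    grad = @(v) [2*(v(1)-1); 4*(v(2)+2)];   % gradient (slope information)
    hess = @(v) [2 0; 0 4];                 % Hessian (convexity information)

    v = [5; 5];                             % assumed starting point
    for k = 1:20
        g = grad(v);
        if norm(g) < 1e-8, break; end       % stop when the gradient vanishes
        v = v - hess(v) \ g;                % Newton step: solve H*p = g, then move by -p
    end
    disp(v)                                 % lands on the minimum [1; -2]

Because the Hessian of a quadratic is constant, the single solve hess(v) \ g lands exactly on the stationary point, which is the one- to two-iteration behaviour described above.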

Cite As

Soumitra Sitole (2024). Newtonian Method (Optimizing Two Variable Functions) (https://www.mathworks.com/matlabcentral/fileexchange/62012-newtonian-method-optimizing-two-variable-functions), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2016b; compatible with any release
Platform Compatibility
Windows, macOS, Linux

Version Release Notes
1.0.0.0: Update includes the m file.