Practical and theoretical aspects of the design and development of algorithms for the optimisation of functions of several variables.
This course looks at the minimization of smooth functions of several variables. The first part of the course examines gradient-based methods using line searches, including Newton, quasi-Newton, and conjugate gradient methods. A selection of other topics is then introduced, including trust region methods and methods for constrained optimization. Demonstration software is used to illustrate aspects of various algorithms in practice.

Topics:
• Gradient-based methods: steepest descent, conjugate gradients, Newton's method and quasi-Newton methods. Line searches and trust regions.
• Constrained optimization: Karush-Kuhn-Tucker conditions, quadratic penalty functions, augmented Lagrangians.
• Derivative-free methods: positive bases, Clarke's generalized derivative, frames.
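To give a flavour of the gradient-based methods with line searches listed above, here is a minimal sketch of steepest descent with a backtracking (Armijo) line search, applied to a convex quadratic. This is an illustrative example only, not the course's demonstration software; all function and parameter names are the author's own choices.

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, c=1e-4, rho=0.5,
                     tol=1e-8, max_iter=10000):
    """Minimise f via steepest descent with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break                     # gradient small enough: stop
        d = -g                        # steepest-descent direction
        alpha = alpha0
        # Shrink the step until the Armijo sufficient-decrease
        # condition f(x + a d) <= f(x) + c a g'd is satisfied.
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
    return x

# Example: f(x) = 0.5 x'Ax - b'x with A symmetric positive definite,
# whose unique minimiser solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = steepest_descent(f, grad, np.zeros(2))
print(np.allclose(x_star, np.linalg.solve(A, b), atol=1e-6))
```

The same loop becomes Newton's method if the direction `d` is replaced by the solution of the Newton system with the Hessian, which the course treats in detail.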
Subject to approval of the Head of Department.
Christopher Price
Recommended Texts:
• Numerical Optimization, Nocedal and Wright (2006).
• Practical Methods of Optimisation, Fletcher (1987).
• Practical Optimization, Gill, Murray, and Wright (1981).
MATH412 Homepage
Mathematics and Statistics Honours Booklet
Domestic fee $703.00
International Postgraduate fees
* All fees are inclusive of NZ GST or any equivalent overseas tax, and do not include any programme level discount or additional course-related expenses.
This course will not be offered if fewer than 5 people apply to enrol.
For further information see Mathematics and Statistics.