This course looks at the minimization of smooth functions of several variables. The first part of the course examines gradient-based methods using line searches, including Newton, quasi-Newton, and conjugate gradient methods. A selection of other topics is then introduced, including trust region methods and methods for constrained optimization. Demonstration software is used to illustrate aspects of various algorithms in practice.

Topics:
• Gradient-based methods: steepest descent, conjugate gradients, Newton's method, and quasi-Newton methods. Line searches and trust regions.
• Constrained optimization: Karush-Kuhn-Tucker conditions, quadratic penalty functions, augmented Lagrangians.
• Derivative-free methods: positive bases, Clarke's generalized derivative, frames.
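As a rough illustration of the gradient-based methods with line searches listed above, the sketch below implements steepest descent with a backtracking (Armijo) line search. This is not course material, just a minimal example; the function names and Armijo parameters are assumptions chosen for the demonstration.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by steepest descent with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # stop when the gradient is small
            break
        d = -g                               # steepest-descent direction
        t, c, rho = 1.0, 1e-4, 0.5           # step, sufficient-decrease, shrink factor
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho
        x = x + t * d
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 4*(y + 2)^2, minimizer (1, -2)
f = lambda x: (x[0] - 1)**2 + 4 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] + 2)])
x_star = steepest_descent(f, grad, [0.0, 0.0])
```

Newton and quasi-Newton methods follow the same template but replace the direction `-g` with a (approximate) Newton step, typically giving much faster local convergence.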
This course will provide students with an opportunity to develop the Graduate Attributes specified below:
Critically competent in a core academic discipline of their award
Students know and can critically evaluate and, where applicable, apply this knowledge to topics/issues within their majoring subject.
Subject to approval of the Head of Department.
Chris Price (MATH)
Recommended Texts:
• Numerical Optimization, Nocedal and Wright (2006).
• Practical Methods of Optimisation, Fletcher (1987).
• Practical Optimization, Gill, Murray, and Wright (1981).
Mathematics and Statistics Honours Booklet
Domestic fee $788.00
International Postgraduate fees
* Fees include New Zealand GST and do not include any programme level discount or additional course related expenses.
For further information see Mathematics and Statistics.