Techniques for optimising smooth functions, both with and without constraints.
This course covers unconstrained and constrained local minimisation of functions of several variables. The focus is largely on gradient-based methods, including steepest descent, Newton, quasi-Newton and conjugate gradient methods. Constrained methods include augmented Lagrangian and sequential quadratic programming techniques. Direct search methods for global optimisation are also covered; these have the advantage of not relying on the availability, or even the existence, of derivatives, at the cost of reduced speed.
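To illustrate the simplest of the gradient-based methods mentioned above, the following is a minimal sketch of steepest descent with a fixed step size; the objective function, step size and tolerance here are illustrative choices, not course material.

```python
def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Minimise a smooth function by repeatedly stepping in the
    direction of the negative gradient (fixed step size lr)."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        # stop once the gradient norm is small enough
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Example: f(x, y) = (x - 1)^2 + 2*(y + 0.5)^2, minimiser at (1, -0.5)
grad = lambda p: [2 * (p[0] - 1), 4 * (p[1] + 0.5)]
xmin = steepest_descent(grad, [0.0, 0.0])
```

In practice the fixed step size would be replaced by a line search, and Newton or quasi-Newton methods use curvature information to converge far faster near the minimiser.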
Subject to approval of the Head of School.
Students must attend one activity from each section.
School of Mathematics and Statistics Postgraduate Handbook
General information for students
Library portal
Domestic fee $989.00
International Postgraduate fees
* Fees include New Zealand GST and do not include any programme level discount or additional course related expenses.
For further information see Mathematics and Statistics.