This course looks at the minimization of smooth functions of several variables. The first part of the course examines gradient-based methods using line searches, including Newton, quasi-Newton, and conjugate gradient methods. A selection of other topics is then introduced, including trust region methods and methods for constrained optimization. Demonstration software is used to illustrate aspects of various algorithms in practice.
Topics:
• Gradient-based methods: steepest descent, conjugate gradients, Newton's method, and quasi-Newton methods; line searches and trust regions.
• Constrained optimization: Karush-Kuhn-Tucker conditions, quadratic penalty functions, augmented Lagrangians.
• Derivative-free methods: positive bases, Clarke's generalized derivative, frames.
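To give a flavour of the gradient-based methods listed above, the following is a minimal sketch of steepest descent with a backtracking (Armijo) line search. It is illustrative only, not the course's demonstration software; the objective function, starting point, and parameter values are arbitrary choices.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=10_000):
    """Minimize f from x0 by steepest descent with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # first-order stationarity test
            break
        d = -g                           # steepest-descent direction
        t, c, rho = 1.0, 1e-4, 0.5       # Armijo parameters (illustrative values)
        # Backtrack until the sufficient-decrease (Armijo) condition holds:
        #   f(x + t d) <= f(x) + c t g^T d
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= rho
        x = x + t * d
    return x

# Example: a simple convex quadratic with minimizer (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = steepest_descent(f, grad, [0.0, 0.0])
```

Newton and quasi-Newton methods covered in the course replace the direction d = -g with a (approximate) Newton direction, typically giving much faster local convergence.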
Subject to approval of the Head of Department.
Chris Price (MATH)
Recommended Texts:
• Numerical Optimization, Nocedal and Wright (2006).
• Practical Methods of Optimisation, Fletcher (1987).
• Practical Optimization, Gill, Murray, and Wright (1981).
Mathematics and Statistics Honours Booklet
Domestic fee $788.00
International fee $3,588.00
For further information see Mathematics and Statistics.