Second-order optimisation methods such as Newton's method are known for their fast convergence. However, the high computational cost of forming the Hessian matrix and inverting it has hindered the use of Newton's method in neural network optimisation.
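As a minimal sketch of the idea (not the method presented at this event), a single Newton step solves the linear system H d = -g, where H is the Hessian and g the gradient; the O(n³) solve is what becomes prohibitive when n is the number of network parameters. The quadratic objective, matrix A, and vector b below are illustrative assumptions.

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solving H d = -g costs O(n^3) for an n x n Hessian; storing H alone
    # costs O(n^2), which is what makes Newton's method expensive when n is
    # the number of neural-network parameters.
    return x - np.linalg.solve(hess(x), grad(x))

# Toy quadratic f(x) = 0.5 x^T A x - b^T x with hypothetical A, b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b   # gradient of the quadratic
hess = lambda x: A           # constant Hessian

x = newton_step(grad, hess, np.zeros(2))
# For a quadratic objective, one Newton step lands on the exact
# minimiser A^{-1} b, illustrating the method's fast convergence.
```

For a strictly convex quadratic, a single step suffices; for a neural network loss the Hessian changes with x, so the step is repeated and the per-step cost dominates.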
Find flyers, registration, and complete event details at https://crawford.anu.edu.au/news-events/events/18992/parallel-solution-hessian-matrix-neural-network-optimisation-case-mixture
Updated: 24 April, 2017/Responsible Officer: Dean, ANU College of Asia & the Pacific/Page Contact: CAP Web Team