Botev, A; Lever, G; Barber, D; (2017) Nesterov's accelerated gradient and momentum as approximations to regularised update descent. In: Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN). IEEE: Anchorage, AK, USA.
Abstract
We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods. As natural special cases we re-derive classical momentum and Nesterov's accelerated gradient method, lending a new intuitive interpretation to the latter algorithm. We show that a new algorithm, which we term Regularised Gradient Descent, can converge more quickly than either Nesterov's algorithm or the classical momentum algorithm.
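For reference, the two classical methods named in the abstract have standard formulations that differ only in where the gradient is evaluated. The Python sketch below illustrates classical momentum and Nesterov's accelerated gradient side by side; it is not the paper's regularised update descent derivation, and the quadratic toy objective, step size, and momentum coefficient are illustrative assumptions.

```python
import numpy as np

def f_grad(w):
    # Gradient of an assumed toy quadratic objective f(w) = 0.5 * w^T A w,
    # with A ill-conditioned to make the momentum effect visible.
    A = np.diag([1.0, 10.0])
    return A @ w

def momentum_step(w, v, lr=0.05, mu=0.9):
    # Classical momentum: velocity accumulates the gradient taken at w.
    v = mu * v - lr * f_grad(w)
    return w + v, v

def nesterov_step(w, v, lr=0.05, mu=0.9):
    # Nesterov's accelerated gradient: the gradient is taken at the
    # look-ahead point w + mu * v instead of at w.
    v = mu * v - lr * f_grad(w + mu * v)
    return w + v, v

w_m = w_n = np.array([10.0, 1.0])
v_m = v_n = np.zeros(2)
for _ in range(100):
    w_m, v_m = momentum_step(w_m, v_m)
    w_n, v_n = nesterov_step(w_n, v_n)
print("momentum:", w_m, "nesterov:", w_n)
```

The paper's contribution, per the abstract, is to recover both updates as approximations within a single regularised-update framework and to derive a new algorithm, Regularised Gradient Descent, from the same framework.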
| Type: | Proceedings paper |
|---|---|
| Title: | Nesterov's accelerated gradient and momentum as approximations to regularised update descent |
| Event: | 2017 International Joint Conference on Neural Networks (IJCNN) |
| Open access status: | An open access version is available from UCL Discovery |
| DOI: | 10.1109/IJCNN.2017.7966082 |
| Publisher version: | https://doi.org/10.1109/IJCNN.2017.7966082 |
| Language: | English |
| Additional information: | This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions. |
| UCL classification: | UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science |
| URI: | https://discovery.ucl.ac.uk/id/eprint/10062712 |