Analysing different optimization algorithms for training of neural networks

Authors
  • Nandi, Arnab
  • Simeonidis, Alexandros
Publication Date
Jan 01, 2021
Source
Cairn
Language
English
License
Green
Abstract

This thesis analyses four different optimization algorithms for training a convolutional neural network (CNN) on three different datasets. The algorithms studied were stochastic gradient descent (SGD), Polyak momentum, Nesterov momentum, and adaptive moment estimation (Adam). The datasets used were two image-based datasets, Fashion-MNIST and CIFAR-10, and Iris, a multivariate dataset describing flower species. Adam reached the highest accuracy across all three datasets, making it the most suitable of the four algorithms for these tasks.
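The four update rules compared in the thesis can be sketched as below. This is a minimal illustration on a toy quadratic objective, not the thesis's CNN setup; the hyperparameters (learning rate, momentum coefficient, Adam betas) are illustrative choices, not values taken from the thesis.

```python
import numpy as np

# Toy objective f(w) = 0.5 * ||w||^2, whose gradient is simply w.
def grad(w):
    return w

def sgd(w, lr=0.1, steps=100):
    # Plain stochastic gradient descent (here full-batch on the toy objective).
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def polyak(w, lr=0.1, mu=0.9, steps=100):
    # Polyak (heavy-ball) momentum: velocity accumulates past gradients.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v - lr * grad(w)
        w = w + v
    return w

def nesterov(w, lr=0.1, mu=0.9, steps=100):
    # Nesterov momentum: gradient is evaluated at the look-ahead point w + mu*v.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v - lr * grad(w + mu * v)
        w = w + v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=100):
    # Adam: per-coordinate step sizes from bias-corrected moment estimates.
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first moment (mean of gradients)
        v = b2 * v + (1 - b2) * g * g    # second moment (uncentered variance)
        m_hat = m / (1 - b1 ** t)        # bias correction
        v_hat = v / (1 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([5.0, -3.0])
for name, opt in [("SGD", sgd), ("Polyak", polyak),
                  ("Nesterov", nesterov), ("Adam", adam)]:
    print(f"{name:10s} final distance to optimum: {np.linalg.norm(opt(w0.copy())):.2e}")
```

On this convex toy problem all four rules converge; the thesis's finding that Adam performed best concerns the non-convex CNN training runs, where its adaptive per-parameter step sizes matter.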
