Subhashri Manohar, University of Bozen, Bolzano, Italy.
A Comparison of Gradient Descent and Stochastic Gradient Descent Algorithms
In the current decade of Machine Learning, neural networks are the most sought-after method, owing to their ability to solve complex problems with high accuracy. That accuracy is driven largely by the optimization algorithm, which reduces the error during training, and numerous optimization strategies exist to produce better and faster results. In this presentation we discuss the two most commonly used algorithms: standard Gradient Descent and Stochastic Gradient Descent. Both are executed on a Convolutional Neural Network using the MNIST dataset, and the results are evaluated on generalization, accuracy, convergence time, and error. Based on these results, we discuss the advantages and applications of each algorithm.
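The core difference between the two algorithms can be sketched on a toy problem: standard Gradient Descent computes the gradient over the full dataset before each update, while Stochastic Gradient Descent updates after every single sample. The sketch below uses linear least squares rather than the CNN-on-MNIST setup of the presentation; the data, model, and learning rates are illustrative assumptions, not the experimental configuration.

```python
import numpy as np

# Synthetic regression data (illustrative only; the presentation's
# experiments use a CNN on MNIST, not this toy problem).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

def batch_gd(X, y, lr=0.1, epochs=100):
    """Standard gradient descent: one update per pass over ALL samples."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        grad = (2.0 / n) * X.T @ (X @ w - y)   # exact gradient of the MSE
        w -= lr * grad
    return w

def sgd(X, y, lr=0.01, epochs=100, seed=1):
    """Stochastic gradient descent: one update per single random sample."""
    w = np.zeros(X.shape[1])
    order = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in order.permutation(len(y)):
            grad = 2.0 * X[i] * (X[i] @ w - y[i])  # noisy one-sample gradient
            w -= lr * grad
    return w

w_gd = batch_gd(X, y)
w_sgd = sgd(X, y)
```

Both variants recover weights close to `true_w` here; the trade-off the presentation examines is that full-batch updates are stable but expensive per step, while stochastic updates are cheap and noisy, which often speeds convergence on large datasets.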