SC Seminar: Angel Adrian Rojas Jimenez

Angel Adrian Rojas Jimenez, TU Kaiserslautern

Title: On the stochastic global optimization and numerical implementation for classification problems

Abstract:

Dynamic search trajectories such as B.T. Polyak's heavy ball method have been used to speed up the rate of convergence to a local minimizer compared to the steepest descent method. We focus here on the global optimization aspect and weaken the requirement on the objective from twice continuous differentiability to Lipschitz continuity. In this spirit, we develop a stochastic version of a variant of the heavy ball method, called the Savvy Ball method and previously referred to as TOAST. We theoretically analyze the non-smooth but convex case, in which the search trajectory is governed not by the usual ordinary differential equation but by an ordinary differential inclusion (ODI). Finally, we present numerical results for its implementation with neural networks on machine learning classification problems associated with the MNIST digits, CIFAR-10, and Fashion-MNIST datasets. We compare our results with momentum methods such as the Adam optimizer.
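
For reference, the classical (deterministic) heavy ball iteration that the talk builds on reads x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1}). The minimal Python sketch below illustrates this baseline only; the step size, momentum parameter, and quadratic test objective are illustrative assumptions and are not taken from the talk.

    import numpy as np

    def heavy_ball(grad, x0, alpha=0.01, beta=0.9, iters=500):
        # Polyak's heavy ball iteration:
        # x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
        x_prev = x0.copy()
        x = x0.copy()
        for _ in range(iters):
            x_next = x - alpha * grad(x) + beta * (x - x_prev)
            x_prev, x = x, x_next
        return x

    # Illustrative test: an ill-conditioned quadratic f(x) = 0.5 * x^T A x,
    # whose unique minimizer is the origin.
    A = np.diag([1.0, 100.0])
    grad_f = lambda x: A @ x
    print(heavy_ball(grad_f, np.array([1.0, 1.0])))

In the non-smooth convex setting discussed in the abstract, the gradient is replaced by an element of the subdifferential, which leads to the differential inclusion viewpoint for the search trajectory.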

How to join:

The talk is held online via Jitsi. You can join using the link https://jitsi.uni-kl.de/SciCompSeminar_11. Please follow the rules below:

  • Use a Chrome-based browser (a single participant using a different browser can crash the whole meeting).
  • Mute your microphone and disable your camera.
  • If you have a question, raise your hand.

More information is available at https://www.rhrk.uni-kl.de/dienstleistungen/netz-telefonie/konferenzdienste/jitsi/.