Dec 19, 2019
In this episode, we focus on deep learning optimizers: the different gradient descent variants, from vanilla GD to RAdam and Ranger. Levente tells the story of GD, from the simplest variants to the newest ones. This is part 1.
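For listeners who want the starting point of the story in code, here is a minimal sketch of vanilla gradient descent on a toy one-dimensional objective. The function names and the toy objective are illustrative, not taken from the episode.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Toy objective f(x) = (x - 3)^2, whose gradient is 2 * (x - 3);
# the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges toward 3.0
```

Every variant discussed in the episode, up to RAdam and Ranger, refines this same update rule.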
Levente's LinkedIn URL:
An overview of gradient descent optimization
Lookahead optimizer algorithm:
Ranger optimizer by Less Wright:
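Since Ranger combines RAdam with the Lookahead optimizer linked above, here is a hedged sketch of the Lookahead idea: an inner "fast" optimizer takes k steps, then the "slow" weights interpolate toward the fast weights by a factor alpha. The inner optimizer here is plain gradient descent for simplicity, and all names are illustrative assumptions, not code from the episode.

```python
def lookahead(grad, x0, inner_lr=0.1, alpha=0.5, k=5, sync_steps=20):
    """Lookahead wrapper around plain gradient descent (toy 1-D version)."""
    slow = fast = x0
    for _ in range(sync_steps):
        # Inner loop: k steps of the fast optimizer.
        for _ in range(k):
            fast = fast - inner_lr * grad(fast)
        # Synchronize: slow weights move partway toward the fast weights.
        slow = slow + alpha * (fast - slow)
        # Restart the fast weights from the slow weights.
        fast = slow
    return slow

# Same toy objective f(x) = (x - 3)^2 as before; minimum at x = 3.
x_min = lookahead(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges toward 3.0
```

Ranger uses RAdam as the inner optimizer instead of plain GD, but the slow/fast synchronization step is the same.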
Music is from https://filmmusic.io. The first part of the intro is by Miklos Toth plus some free GarageBand loops. :) Second part of the intro: "Aces High" by Kevin MacLeod; outro: "Acid Trumpet" by Kevin MacLeod (https://incompetech.com). License: CC BY (http://creativecommons.org/licenses/by/4.0/)