🚀 1965: Development of the First Deep Learning Algorithm
Alexey Ivakhnenko and Valentin Lapa developed the first deep learning algorithm for multilayer perceptrons in the Soviet Union.

Tags: Deep Learning, Alexey Ivakhnenko, Valentin Lapa, Multilayer Perceptrons, Neural Networks, Early Deep Learning

🧠 1967: Stochastic Gradient Descent for Deep Learning
Shun'ichi Amari was the first to use stochastic gradient descent for deep learning in multilayer perceptrons. In computer experiments conducted by his student Saito, a five-layer MLP with two modifiable layers learned useful internal representations to classify nonlinearly separable pattern classes.

Tags: Deep Learning, Stochastic Gradient Descent, Multilayer Perceptrons, Amari, Neural Networks, Pattern Recognition
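The core of stochastic gradient descent is taking a small gradient step after each individual example rather than after the whole dataset. A minimal sketch, using a single linear neuron on a hypothetical toy dataset rather than Amari's original multilayer formulation:

```python
# Minimal sketch of stochastic gradient descent (SGD): per-example
# gradient steps on squared error. Illustrative only; a single linear
# unit, not Amari's multilayer perceptron experiments.

def sgd_train(data, lr=0.1, epochs=100):
    """Fit w, b so that y ~ w*x + b via per-example gradient steps."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:          # one example at a time: "stochastic"
            err = (w * x + b) - y  # prediction error on this example
            w -= lr * err * x      # gradient of 0.5*err^2 w.r.t. w
            b -= lr * err          # gradient of 0.5*err^2 w.r.t. b
    return w, b

# Hypothetical noiseless data drawn from y = 2x + 1
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = sgd_train(data)
```

Because each update uses only one example, the method scales to large datasets, which is why it became the standard way to train deep networks.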

🧮 1970: Linnainmaa Publishes Reverse-Mode Automatic Differentiation
Seppo Linnainmaa published the reverse mode of automatic differentiation. This method later became known as backpropagation and is heavily used to train artificial neural networks.

Tags: Backpropagation, Automatic Differentiation, Neural Networks, Machine Learning, Deep Learning, Optimization, Training Algorithms, Seppo Linnainmaa
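Reverse-mode automatic differentiation records the operations that produce an output, then sweeps backward through them, multiplying local derivatives by the chain rule. A minimal sketch with a hypothetical scalar `Var` class (not Linnainmaa's notation):

```python
# Minimal sketch of reverse-mode automatic differentiation. Each Var
# remembers its parents and the local derivative of itself w.r.t. each
# parent; backward() sweeps these records in reverse, accumulating
# exact gradients. Illustrative only.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Chain rule: push the incoming derivative to every parent,
        # scaled by the local derivative recorded at construction time.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x   # z = x*y + x, so dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

One backward sweep yields the derivative of the output with respect to every input, which is exactly the property that makes backpropagation efficient for training networks with many parameters.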

➕ 2015: Highway and Residual Networks Developed for Deep Learning
Two techniques were developed concurrently to train very deep networks: the Highway network and the residual neural network (ResNet). They allowed networks more than 1,000 layers deep to be trained.

Tags: Deep Learning, Neural Networks, Highway Networks, Residual Networks, ResNet, Training Techniques
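The shared idea behind both techniques is a shortcut path: instead of learning a full transformation, a layer learns a correction F(x) that is added to its input, y = x + F(x), so gradients can flow through the identity path even in very deep stacks. A minimal sketch with a hypothetical toy transformation (the learned gating of Highway networks is omitted):

```python
# Minimal sketch of a residual (skip) connection, the core idea behind
# ResNets: the block outputs its input plus a learned correction,
# y = x + F(x). Highway networks add a gate over the two paths, which
# this sketch omits. Illustrative only; plain lists, no framework.

def residual_block(x, f):
    """Apply transformation f while keeping an identity shortcut."""
    return [xi + fi for xi, fi in zip(x, f(x))]

# Hypothetical tiny transformation standing in for a learned layer.
def f(x):
    return [0.5 * xi for xi in x]

y = residual_block([1.0, 2.0], f)   # -> [1.5, 3.0]
```

Because the identity path passes gradients through unchanged, stacking many such blocks does not suffer the vanishing-gradient problem that limited earlier deep networks.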