This week's lectures covered several optimization algorithms for neural networks. Key topics:

- Stochastic gradient descent
- Momentum
- RMSProp
- Adam
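
As a quick reference, here is a minimal NumPy sketch of the per-parameter update rules for the three adaptive methods above. The function names and default hyperparameters are illustrative choices, not taken from the lectures:

```python
import numpy as np

def momentum_update(w, grad, v, beta=0.9, lr=0.01):
    # Momentum: keep an exponentially weighted average of past gradients
    # and step in that smoothed direction.
    v = beta * v + (1 - beta) * grad
    w = w - lr * v
    return w, v

def rmsprop_update(w, grad, s, beta=0.9, lr=0.01, eps=1e-8):
    # RMSProp: scale the step by a moving average of squared gradients,
    # damping oscillations along steep directions.
    s = beta * s + (1 - beta) * grad ** 2
    w = w - lr * grad / (np.sqrt(s) + eps)
    return w, s

def adam_update(w, grad, v, s, t, beta1=0.9, beta2=0.999, lr=0.01, eps=1e-8):
    # Adam: combine Momentum (first moment) and RMSProp (second moment)
    # with bias correction for the early iterations (t starts at 1).
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    v_hat = v / (1 - beta1 ** t)   # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)   # bias-corrected second moment
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s
```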
Learning objectives

- Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
- Use random minibatches to accelerate convergence and improve the optimization
- Know the benefits of learning rate decay and apply it to your optimization
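
A minimal sketch of the minibatch and learning-rate-decay mechanics named in the objectives; `random_minibatches`, `decayed_lr`, and the column-stacked data layout (one example per column) are assumptions for illustration:

```python
import numpy as np

def random_minibatches(X, Y, batch_size=64, seed=0):
    # Shuffle the examples, then slice them into minibatches of batch_size
    # (the last minibatch may be smaller if m is not a multiple of batch_size).
    rng = np.random.default_rng(seed)
    m = X.shape[1]                      # assumes examples stacked as columns
    perm = rng.permutation(m)
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    return [(X_shuf[:, k:k + batch_size], Y_shuf[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

def decayed_lr(lr0, epoch, decay_rate=1.0):
    # 1/t-style learning rate decay: shrink the step size as epochs advance.
    return lr0 / (1 + decay_rate * epoch)
```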