A Summary of Optimization Algorithms for Deep Learning
Table of Contents
1. What Are Optimization Algorithms?
2. Summary of Optimization Algorithms
・Stochastic Gradient Descent (SGD)
・Adam
・Adamax
・Nadam
・AMSGrad
・AdamW
3. Others
AdaDelta, AdaGrad, A2GradExp, A2GradInc, A2GradUni, AccSGD, AdaBelief, AdaMod, Adafactor, Adahessian, AdamP, AggMo, Apollo, DiffGrad, RMSProp, AveragedOptimizerWrapper, ConditionalGradient, CyclicalLearningRate, ExponentialCyclicalLearningRate, extend_with_decoupled_weight_decay, LAMB, LazyAdam, Lookahead, MovingAverage, NovoGrad, ProximalAdagrad, RectifiedAdam, SGDW, SWA, Triangular2CyclicalLearningRate, TriangularCyclicalLearningRate, Yogi, AdaBound, AMSBound, Shampoo, SWATS, SGDP, Ranger, RangerQH, RangerVA, PID, QHAdam, QHM, …
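Before going through the entries above, here is a minimal sketch of how a few of the listed optimizers (SGD, Adam, AdamW) are typically instantiated and stepped in PyTorch's `torch.optim`; the tiny linear model, the input data, and the hyperparameter values are placeholder assumptions for illustration, not part of the original article.

```python
import torch
import torch.nn as nn

# Placeholder model used only to provide parameters for the optimizers.
model = nn.Linear(10, 1)

# Stochastic Gradient Descent with momentum (torch.optim.SGD).
sgd = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Adam with its commonly used default hyperparameters (torch.optim.Adam).
adam = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))

# AdamW: Adam with decoupled weight decay (torch.optim.AdamW).
adamw = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

# One training step with Adam: compute a loss, backpropagate, update.
x, y = torch.randn(4, 10), torch.randn(4, 1)
loss = nn.functional.mse_loss(model(x), y)
adam.zero_grad()
loss.backward()
adam.step()
```

All three optimizers share the same `zero_grad()` / `backward()` / `step()` loop, so switching between the algorithms covered below usually amounts to changing a single constructor call.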