
ICLR 2025

A Comprehensive Framework for Analyzing the Convergence of Adam: Bridging the Gap with Stochastic Gradient Descent

OpenReview · PDF
Submitted: 2024-09-26 · Updated: 2024-10-23
TL;DR

We present a novel framework for analyzing Adam, which demonstrates that Adam's convergence guarantees can be aligned with those of SGD.

Abstract

Adaptive Moment Estimation (Adam) is a cornerstone optimization algorithm in deep learning, widely recognized for its flexibility with adaptive learning rates and efficiency in handling large-scale data. However, despite its practical success, the theoretical understanding of Adam's convergence has been constrained by stringent assumptions, such as almost surely bounded stochastic gradients or uniformly bounded gradients, which are more restrictive than those typically required for analyzing stochastic gradient descent (SGD). In this paper, we introduce a novel and comprehensive framework for analyzing the convergence properties of Adam. This framework offers a versatile approach to establishing Adam's convergence. Specifically, we prove that Adam achieves asymptotic (last-iterate) convergence both almost surely and in the \(L_1\) sense under the relaxed assumptions typically used for SGD, namely \(L\)-smoothness and the ABC inequality. Moreover, under the same assumptions, we show that Adam attains non-asymptotic sample-complexity bounds similar to those of SGD.
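
For context, the two assumptions named in the abstract are commonly stated as follows; this is the standard formulation from the SGD literature, and the paper's exact constants and conditions may differ. Here \(f\) is the objective, \(\nabla f(x;\xi)\) is a stochastic gradient of \(f\) at \(x\), and \(f^{*} = \inf_x f(x)\):

\[
\|\nabla f(x) - \nabla f(y)\| \le L\,\|x - y\| \quad \text{for all } x, y \qquad \text{(\(L\)-smoothness)},
\]
\[
\mathbb{E}_{\xi}\!\left[\|\nabla f(x;\xi)\|^{2}\right] \le 2A\,\bigl(f(x) - f^{*}\bigr) + B\,\|\nabla f(x)\|^{2} + C \qquad \text{(ABC inequality)}.
\]

For the algorithm itself, the textbook Adam recursion (with stochastic gradient \(g_t = \nabla f(x_t;\xi_t)\), coordinate-wise square, square root, and division, and bias-correction terms omitted) reads

\[
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{2}, \qquad
x_{t+1} = x_t - \eta_t\, \frac{m_t}{\sqrt{v_t} + \epsilon},
\]

where \(\beta_1, \beta_2 \in [0,1)\) are the momentum parameters, \(\eta_t\) is the step size, and \(\epsilon > 0\) is a small stabilizer; the exact variant analyzed in the paper may differ in these details.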
Keywords
Adam, almost sure convergence, \(L_1\) convergence, sample complexity, analysis framework

Reviews and Discussion

Withdrawal Notice

Some tiny typos need to be fixed.