AdaMeZO: Adam-style Zeroth-Order Optimizer for LLM Fine-tuning Without Maintaining the Moments

arXiv cs.LG / 5/4/2026

📰 News · Models & Research

Key Points

  • The paper introduces AdaMeZO, an Adam-style zeroth-order optimizer designed for LLM fine-tuning using only forward passes, aimed at reducing GPU memory usage.
  • Unlike MeZO, which is insensitive to the loss landscape and may converge more slowly, AdaMeZO uses first- and second-moment estimates to better navigate curvature without storing those moments in memory.
  • The authors show through theoretical analysis and extensive experiments that AdaMeZO can outperform MeZO while improving efficiency, reducing the number of required forward passes by up to 70%.
  • Visualization of optimization trajectories suggests AdaMeZO can adapt to different loss landscapes, supporting the claim that the moment-based guidance improves optimization behavior.
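To make the "forward passes only" idea concrete, the sketch below shows the standard two-point zeroth-order (SPSA-style) gradient estimate that MeZO builds on: perturb the parameters along a random direction, evaluate the loss twice, and project. The function name `zo_grad_estimate`, the toy quadratic loss, and the fixed seed are illustrative assumptions, not code from the paper; MeZO's in-place resampling trick is only hinted at via the shared seed.

```python
import numpy as np

def zo_grad_estimate(loss_fn, theta, eps=1e-3, seed=0):
    """Two-point SPSA-style gradient estimate using only forward passes,
    in the spirit of MeZO. Regenerating the perturbation z from a shared
    seed (rather than storing it) is the memory trick MeZO relies on.
    Hypothetical helper for illustration -- not the paper's code."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(theta.shape)       # random perturbation direction
    loss_plus = loss_fn(theta + eps * z)       # forward pass 1
    loss_minus = loss_fn(theta - eps * z)      # forward pass 2
    proj = (loss_plus - loss_minus) / (2 * eps)  # scalar directional slope
    return proj * z                            # gradient estimate along z

# Toy quadratic loss, purely for illustration.
loss = lambda w: float(np.sum(w ** 2))
w = np.array([1.0, -2.0, 0.5])
g = zo_grad_estimate(loss, w)
```

Note that the estimate is a scaled copy of `z`: the optimizer only ever moves along the sampled direction, which is one reason a vanilla zeroth-order method can be blind to curvature in the way the paper describes.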

Abstract

Fine-tuning LLMs is necessary for various dedicated downstream tasks, but classic backpropagation-based fine-tuning methods require substantial GPU memory. To this end, a recent work, MeZO, which relies solely on forward passes to fine-tune LLMs, significantly reduces GPU requirements at the cost of slower convergence due to its indifference to loss landscapes. Standard solutions, such as Adam, explore loss landscapes by estimating the first- and second-order moments and storing them in memory to guide the model's movement through dimensions with lower curvature and vice versa. However, directly applying Adam negates MeZO's advantage as it will triple the memory requirement. In light of this, we propose AdaMeZO, a zeroth-order optimizer that leverages Adam-style first- and second-moment estimates without maintaining them in memory. We present a theoretical analysis of AdaMeZO, corroborated by extensive experiments demonstrating AdaMeZO's performance, showing that AdaMeZO can outperform MeZO while requiring up to 70% fewer forward passes. Trajectory visualizations affirm AdaMeZO's ability to adapt to diverse loss landscapes.
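For contrast with AdaMeZO's moment-free approach, here is a minimal sketch of a standard Adam update. The two moment buffers `m` and `v` are each the same size as the parameters, so Adam keeps three parameter-sized arrays resident (`theta`, `m`, `v`) — the tripled memory the abstract refers to. This is textbook Adam on a toy quadratic, not AdaMeZO; how the paper recomputes moment-style signals without storing these buffers is not detailed in the abstract.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One standard Adam update (Kingma & Ba). Note that m and v must be
    carried between calls and each match theta's size, which is the
    memory cost AdaMeZO aims to avoid."""
    m = b1 * m + (1 - b1) * grad            # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2       # second-moment estimate
    m_hat = m / (1 - b1 ** t)               # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy quadratic loss sum(theta^2), whose exact gradient is 2*theta.
theta = np.array([1.0, -2.0, 0.5])
m = np.zeros_like(theta)                    # extra parameter-sized buffer
v = np.zeros_like(theta)                    # another parameter-sized buffer
for t in range(1, 51):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
```

Because `m_hat / sqrt(v_hat)` normalizes the step by recent gradient magnitude, Adam takes larger steps along low-curvature directions — the landscape-aware behavior the abstract credits for faster convergence than plain MeZO.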