Local learning for stable backpropagation-free neural network training towards physical learning

arXiv cs.LG / 3/27/2026


Key Points

  • The paper motivates backpropagation-free training by pointing to physical constraints of chip fabrication and environmental costs, arguing for more physically realizable learning paradigms like physical neural networks.
  • It proposes FFzero, a forward-only framework that avoids both backpropagation and automatic differentiation by using layer-wise local learning, prototype-based representations, and optimization via directional derivatives computed through forward evaluations.
  • The authors report that local learning can remain effective under forward-only optimization, specifically in settings where backpropagation-based methods fail.
  • FFzero is demonstrated to generalize to multilayer perceptrons and convolutional neural networks for both classification and regression tasks.
  • A simulated photonic neural network experiment is used to suggest that FFzero could support viable in-situ physical learning without traditional gradient backpropagation.
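The core optimization idea above — estimating gradients from forward evaluations alone, with no backpropagation or automatic differentiation — can be sketched as a zeroth-order method: probe the loss along a random direction, measure the slope with two forward passes, and step against it. This is a minimal illustrative sketch of that general technique on a toy quadratic, not the paper's actual FFzero implementation; all function names and hyperparameters here are assumptions.

```python
import numpy as np

def directional_derivative(loss_fn, theta, v, eps=1e-4):
    """Estimate the slope of loss_fn at theta along direction v using
    two forward evaluations (central difference) -- no autodiff needed."""
    return (loss_fn(theta + eps * v) - loss_fn(theta - eps * v)) / (2 * eps)

def forward_only_step(loss_fn, theta, lr, rng):
    """One forward-only update: sample a random unit direction, estimate
    the directional derivative with forward passes, step against it."""
    v = rng.standard_normal(theta.shape)
    v /= np.linalg.norm(v)
    g = directional_derivative(loss_fn, theta, v)
    return theta - lr * g * v

# Toy quadratic objective with minimum at theta = [1, -2].
target = np.array([1.0, -2.0])
loss = lambda t: np.sum((t - target) ** 2)

rng = np.random.default_rng(0)
theta = np.zeros(2)
for _ in range(500):
    theta = forward_only_step(loss, theta, lr=0.1, rng=rng)
```

Because every quantity is obtained from forward evaluations of the loss, the same update rule could in principle be driven by measurements of a physical system's output rather than a differentiable software model — which is the appeal for physical neural networks.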

Abstract

While backpropagation and automatic differentiation have driven deep learning's success, the physical limits of chip manufacturing and rising environmental costs of deep learning motivate alternative learning paradigms such as physical neural networks. However, most existing physical neural networks still rely on digital computing for training, largely because backpropagation and automatic differentiation are difficult to realize in physical systems. We introduce FFzero, a forward-only learning framework enabling stable neural network training without backpropagation or automatic differentiation. FFzero combines layer-wise local learning, prototype-based representations, and directional-derivative-based optimization through forward evaluations only. We show that local learning is effective under forward-only optimization, where backpropagation fails. FFzero generalizes to multilayer perceptrons and convolutional neural networks across classification and regression tasks. Using a simulated photonic neural network as an example, we demonstrate that FFzero provides a viable path toward backpropagation-free in-situ physical learning.
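The abstract's combination of layer-wise local learning with prototype-based representations can be illustrated in miniature: train each layer greedily on a local objective that pulls a sample's activation toward the mean activation (prototype) of its own class and pushes the class prototypes apart, using the same forward-difference estimator in place of backpropagated gradients. The data, loss shape, and hyperparameters below are invented for illustration and are not drawn from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs in 2-D (assumed data, for illustration).
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.repeat([0, 1], 50)

def local_loss(W, h, y):
    """Prototype-style local objective for one layer: pull each sample's
    activation toward its class prototype, push the prototypes apart."""
    z = np.tanh(h @ W)
    protos = np.stack([z[y == c].mean(axis=0) for c in (0, 1)])
    pull = np.mean(np.sum((z - protos[y]) ** 2, axis=1))
    push = np.sum((protos[0] - protos[1]) ** 2)
    return pull - 0.5 * push

# Greedy layer-wise training: each layer optimizes only its own local loss
# via random-direction forward differences; no error signal flows backward.
h, weights = X, []
for _ in range(2):
    W = rng.normal(0, 0.5, (h.shape[1], 4))
    for _ in range(300):
        v = rng.standard_normal(W.shape)
        v /= np.linalg.norm(v)
        eps = 1e-4
        g = (local_loss(W + eps * v, h, y)
             - local_loss(W - eps * v, h, y)) / (2 * eps)
        W -= 0.05 * g * v
    weights.append(W)
    h = np.tanh(h @ W)  # freeze this layer; its output feeds the next
```

Because each layer's objective depends only on that layer's own forward output, no global backward pass is required — which is what makes this family of schemes compatible with hardware where only forward evaluation is physically available.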