SLE-FNO: Single-Layer Extensions for Task-Agnostic Continual Learning in Fourier Neural Operators
arXiv cs.LG · 2026-03-24
Tags: Opinion / Ideas & Deep Analysis / Models & Research
Key Points
- The paper proposes SLE-FNO, an architecture-based continual learning method that extends Fourier Neural Operators (FNO) using Single-Layer Extensions (SLE) to adapt to distribution shifts without reusing past training data.
- SLE-FNO is evaluated on an image-to-image regression fluid dynamics problem: mapping transient concentration fields to time-averaged wall shear stress (TAWSS) in pulsatile aneurysmal blood flow.
- Using 230 CFD simulations split into four sequential, out-of-distribution task configurations, the authors compare SLE-FNO against established continual learning baselines including EWC, LwF, replay methods, OGD, GEM, PiggyBack, and LoRA.
- Results indicate replay-based methods and architecture-based approaches (PiggyBack, LoRA, and SLE-FNO) yield the best retention, with SLE-FNO achieving the strongest overall trade-off between plasticity and stability.
- The study reports that SLE-FNO delivers strong accuracy with zero forgetting and minimal added parameters, positioning it as a promising way to update baseline models when extrapolation is required.
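The core mechanism described above — freezing the base FNO and training only a small appended layer per task — can be sketched in a few lines. The code below is a hypothetical NumPy illustration, not the authors' implementation: the class and function names (`fourier_layer`, `SLEFNOSketch`, `add_extension`) and the ones-initialization are assumptions made for clarity.

```python
import numpy as np

def fourier_layer(x, weights, modes):
    """One FNO-style spectral layer: FFT the 1D signal, multiply the
    lowest `modes` frequency coefficients by learned complex weights,
    zero the rest, and inverse FFT back to physical space."""
    x_hat = np.fft.rfft(x)
    out_hat = np.zeros_like(x_hat)
    out_hat[:modes] = x_hat[:modes] * weights  # weights: shape (modes,)
    return np.fft.irfft(out_hat, n=x.shape[0])

class SLEFNOSketch:
    """Hypothetical sketch of the SLE idea: a stack of frozen base
    spectral layers, plus one trainable Single-Layer Extension appended
    per new task. Since base weights and earlier extensions are never
    updated, performance on earlier tasks cannot degrade (zero
    forgetting by construction)."""
    def __init__(self, base_weights, modes):
        self.base_weights = base_weights  # frozen after pre-training
        self.extensions = []              # one extra layer per task
        self.modes = modes

    def add_extension(self):
        # All-ones init passes the retained Fourier modes through
        # unchanged, so appending a fresh extension does not perturb
        # the model's outputs before training begins.
        w = np.ones(self.modes, dtype=complex)
        self.extensions.append(w)
        return w  # the caller trains only this array

    def forward(self, x):
        for w in self.base_weights:
            x = fourier_layer(x, w, self.modes)
        for w in self.extensions:
            x = fourier_layer(x, w, self.modes)
        return x
```

The added parameter count is just `modes` complex numbers per task, which matches the paper's claim of minimal parameter overhead relative to retraining or replay.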

