SLE-FNO: Single-Layer Extensions for Task-Agnostic Continual Learning in Fourier Neural Operators
arXiv cs.LG / 3/24/2026
Key Points
- The paper proposes SLE-FNO, an architecture-based continual learning method that extends Fourier Neural Operators (FNO) with Single-Layer Extensions (SLE) to adapt to distribution shifts without reusing past training data (see the sketch after these key points).
- SLE-FNO is evaluated on an image-to-image regression problem from fluid dynamics: mapping transient concentration fields to time-averaged wall shear stress (TAWSS) in pulsatile aneurysmal blood flow.
- Using 230 CFD simulations split into four sequential, out-of-distribution task configurations, the authors compare SLE-FNO against established continual learning baselines including EWC, LwF, replay methods, OGD, GEM, PiggyBack, and LoRA.
- Results indicate replay-based methods and architecture-based approaches (PiggyBack, LoRA, and SLE-FNO) yield the best retention, with SLE-FNO achieving the strongest overall trade-off between plasticity and stability.
- The study reports that SLE-FNO delivers strong accuracy with zero forgetting and minimal added parameters, positioning it as a promising way to update baseline models when extrapolation is required.
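To make the architecture-based idea concrete, here is a minimal PyTorch sketch of how a single-layer extension scheme for an FNO could work: a frozen base stack of spectral convolution layers, plus one small residual spectral layer appended per new task, so earlier tasks are untouched (zero forgetting) and only a few parameters are added. The class names (`SLEFNO`, `SpectralConv1d`, `add_task`) and the exact wiring of the extension are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Standard 1-D FNO spectral block: FFT -> learned weights on low modes -> inverse FFT."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x):                        # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, : self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, : self.modes], self.weight
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))

class SLEFNO(nn.Module):
    """Hypothetical sketch: frozen base FNO plus one single-layer extension per task."""
    def __init__(self, channels=32, modes=16, base_depth=4):
        super().__init__()
        self.base = nn.ModuleList(SpectralConv1d(channels, modes) for _ in range(base_depth))
        self.extensions = nn.ModuleList()        # one extension layer per new task

    def add_task(self, channels=32, modes=16):
        for p in self.parameters():              # freeze everything learned so far
            p.requires_grad_(False)
        self.extensions.append(SpectralConv1d(channels, modes))  # only this layer trains

    def forward(self, x, task_id=None):
        for layer in self.base:
            x = torch.relu(layer(x))
        if task_id is not None:                  # route through that task's extension
            x = x + self.extensions[task_id](x)  # residual: base behaviour stays intact
        return x

model = SLEFNO()
model.add_task()  # a new out-of-distribution task arrives: freeze base, add one layer
# ...then train only model.extensions[0] on the new task's data
```

Because earlier parameters are frozen and each task routes through its own extension, old-task predictions are bitwise unchanged, which is one way an architecture-based method can report zero forgetting at the cost of a small per-task parameter budget.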