FEAT: A Linear-Complexity Foundation Model for Extremely Large Structured Data
arXiv cs.LG / 3/18/2026
News · Ideas & Deep Analysis · Models & Research
Key Points
- FEAT is a new linear-complexity foundation model designed for extremely large structured data across domains such as healthcare, finance, e-commerce, and scientific data management.
- It replaces quadratic self-attention with a hybrid linear encoding in a multi-layer dual-axis architecture, combining adaptive-fusion bi-Mamba-2 for local dependencies and convolutional gated linear attention for global memory.
- The model uses a hybrid structural causal model pipeline and a stable reconstruction objective to improve robustness beyond synthetic-only pre-training.
- In experiments on 11 real-world datasets, FEAT outperforms baselines in zero-shot performance, scales linearly, and delivers up to 40x faster inference.
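The linear-scaling claim above rests on replacing quadratic self-attention with linear-attention-style operators. As a rough illustration only (FEAT's actual feature maps, gating, and Mamba-2 components are not specified in this summary), kernelized linear attention uses associativity to compute attention in time linear in sequence length: instead of forming the n-by-n score matrix, it first contracts keys against values.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized linear attention sketch: O(n * d^2) instead of O(n^2 * d).

    The feature map phi = elu(x) + 1 is a common choice from the
    linear-attention literature; it is an assumption here, not FEAT's design.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, strictly positive
    Qf, Kf = phi(Q), phi(K)
    # Associativity trick: phi(Q) @ (phi(K)^T V) avoids the n x n score matrix.
    KV = Kf.T @ V               # (d, d_v), independent of sequence length n
    Z = Qf @ Kf.sum(axis=0)     # per-row normalizer, shape (n,)
    return (Qf @ KV) / Z[:, None]
```

By associativity, this produces exactly the same output as the quadratic form `(phi(Q) @ phi(K).T) @ V` with row normalization, while the cost of the key-value contraction stays constant in `n`, which is the property that makes linear-time scaling to extremely large inputs possible.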