LAtte: Hyperbolic Lorentz Attention for Cross-Subject EEG Classification
arXiv cs.LG / 3/12/2026
Key Points
- LAtte introduces a Lorentz Attention Module integrated with an InceptionTime-based encoder to enable robust cross-subject EEG classification.
- It learns a shared baseline across subjects via pretraining and uses Lorentz low-rank adapters, conditioned on subject embeddings, to capture subject-specific differences.
- The model targets generalization to unseen subjects and can also be fine-tuned per subject, addressing inter-subject variability and the low signal-to-noise ratio of EEG.
- Evaluation on three EEG datasets shows substantial performance improvements over current state-of-the-art methods.
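The summary does not spell out the Lorentz Attention Module itself. As a rough sketch of the underlying geometry, one common construction lifts Euclidean features onto the hyperboloid (Lorentz model) via the exponential map at the origin and scores attention by negative geodesic distance. All function names, the distance-based scoring, and the temperature `tau` below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product <x, y>_L = -x0*y0 + <x_rest, y_rest>
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def exp_map_origin(v):
    # Lift Euclidean vectors v (shape [..., d]) onto the hyperboloid in R^{d+1},
    # using the exponential map at the origin o = (1, 0, ..., 0)
    n = np.clip(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9, None)
    return np.concatenate([np.cosh(n), np.sinh(n) * v / n], axis=-1)

def lorentz_attention(q, k, v, tau=1.0):
    # q, k: Euclidean queries/keys [n, d]; v: values [m, dv]
    Q, K = exp_map_origin(q), exp_map_origin(k)
    # Geodesic distance on the hyperboloid: d(x, y) = arccosh(-<x, y>_L)
    ip = -lorentz_inner(Q[:, None, :], K[None, :, :])
    dist = np.arccosh(np.clip(ip, 1.0, None))
    # Softmax over negative distances: closer points get higher weight
    scores = -dist / tau
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v
```

The key design point is that similarity is measured along the curved manifold rather than by a Euclidean dot product, which hyperbolic-learning work argues suits hierarchical or tree-like feature structure.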
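The adapter parameterization is likewise not detailed in the summary. A minimal sketch, assuming a LoRA-style low-rank update whose factor `A` is generated from a learned per-subject embedding (the generator `G` and all dimensions are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, e = 16, 4, 8                       # feature dim, adapter rank, subject-embedding dim

W = rng.normal(size=(d, d))              # shared weight from cross-subject pretraining
B = rng.normal(size=(r, d)) * 0.01       # shared low-rank factor
G = rng.normal(size=(e, d * r)) * 0.1    # hypothetical: maps subject embedding -> factor A

def adapted_forward(x, subj_emb):
    # Subject-specific low-rank factor, generated from the subject embedding
    A = (subj_emb @ G).reshape(d, r)
    # Shared weight plus a rank-r, subject-conditioned correction
    return x @ (W + A @ B)

x = rng.normal(size=(2, d))
y = adapted_forward(x, rng.normal(size=(e,)))
```

Under this scheme, adapting to a new subject only requires fitting a small embedding (and optionally the low-rank factors), leaving the shared backbone frozen.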