LAtte: Hyperbolic Lorentz Attention for Cross-Subject EEG Classification
arXiv cs.LG / 3/12/2026
Key Points
- LAtte introduces a Lorentz Attention Module integrated with an InceptionTime-based encoder to enable robust cross-subject EEG classification.
- It learns a shared baseline across subjects via pretraining and captures subject-specific differences with Lorentz low-rank adapters conditioned on subject embeddings.
- The model aims to generalize to unseen subjects and can also be fine-tuned per subject, addressing inter-subject variability and the low SNR of EEG signals.
- Evaluation on three EEG datasets shows substantial performance improvements over current state-of-the-art methods.
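The core idea behind Lorentz attention is to compute attention scores with the Lorentzian inner product on the hyperboloid model of hyperbolic space rather than the Euclidean dot product. The sketch below is illustrative only, not the paper's LAtte implementation: the helper names (`lift`, `lorentz_inner`, `lorentz_attention_weights`) and the specific score-to-weight mapping are assumptions made for this example.

```python
import math

# Illustrative sketch of Lorentz-model attention scoring (NOT the paper's code).
# Points x = (x0, x1, ..., xn) lie on the hyperboloid <x, x>_L = -1, x0 > 0,
# where the Lorentzian inner product negates the time-like coordinate x0.

def lorentz_inner(x, y):
    """Lorentzian inner product: -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + sum(a * b for a, b in zip(x[1:], y[1:]))

def lift(v):
    """Lift a Euclidean vector v onto the hyperboloid by solving <x, x>_L = -1."""
    x0 = math.sqrt(1.0 + sum(c * c for c in v))
    return [x0] + list(v)

def lorentz_attention_weights(query, keys):
    """Softmax attention weights from Lorentzian scores.

    For hyperboloid points, <q, k>_L <= -1, with equality iff q == k,
    so using the inner product directly as the score gives the largest
    weight to the key closest to the query in hyperbolic distance.
    """
    scores = [lorentz_inner(query, k) for k in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

q = lift([0.1, 0.2])
keys = [lift([0.1, 0.2]), lift([1.0, -0.5])]
w = lorentz_attention_weights(q, keys)
# The key identical to the query receives the larger weight.
```

A real implementation would additionally project learned query/key features onto the manifold and aggregate values with a Lorentzian centroid, but the scoring step above captures the geometric intuition.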
Related Articles
Reducing the burden on veterans of training junior engineers: generating PLC-control "ladder diagrams" with AI
日経XTECH

14 Best Self-Hosted Claude Alternatives for AI and Coding in 2026
Dev.to
Top Web Development Trends in 2026
Dev.to
[P] Finetuned small LMs to VLM adapters locally and wrote a short article about it
Reddit r/MachineLearning
Experiment: How far can a 28M model go in business email generation?
Reddit r/LocalLLaMA