Mantis: Mamba-native Tuning is Efficient for 3D Point Cloud Foundation Models
arXiv cs.CV / 5/6/2026
Key Points
- The paper presents Mantis, a new Mamba-native parameter-efficient fine-tuning (PEFT) framework specifically for pre-trained 3D point cloud foundation models (PFMs).
- It argues that existing PEFT methods for Transformer backbones do not transfer well to frozen Mamba models due to a mismatch between token-level adaptation and Mamba’s state-level sequence dynamics.
- Mantis introduces a State-Aware Adapter (SAA) that injects lightweight, task-conditioned control signals into selected state-space updates, enabling stable state-level adaptation while the backbone stays frozen (see the first sketch after these key points).
- It also proposes Dual-Serialization Consistency Distillation (DSCD), which regularizes the model's predictions across different point cloud serialization orders, mitigating the instability that the choice of serialization introduces (see the second sketch below).
- Experiments on multiple benchmarks show Mantis achieves competitive results while training only about 5% of the parameters, and the authors provide open-source code.
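
To make the state-level adaptation idea concrete, here is a minimal PyTorch sketch in the spirit of the SAA. It assumes a diagonal SSM recurrence h_t = Ā·h_{t-1} + B̄·x_t and adds a low-rank, task-conditioned control signal u_t to each state update; the module names, shapes, and zero-init choice are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class StateAwareAdapter(nn.Module):
    """Hypothetical sketch of a state-level adapter for a frozen Mamba block.

    A low-rank bottleneck maps the current token features to a control
    signal added to the SSM hidden-state update, so adaptation happens at
    the state level rather than the token level.
    """

    def __init__(self, d_model: int, d_state: int, rank: int = 8):
        super().__init__()
        self.down = nn.Linear(d_model, rank, bias=False)
        self.up = nn.Linear(rank, d_state, bias=False)
        # zero-init the up-projection so training starts from the frozen
        # backbone's unmodified behavior
        nn.init.zeros_(self.up.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> control u: (batch, seq_len, d_state)
        return self.up(torch.tanh(self.down(x)))


def ssm_scan_with_adapter(x, A_bar, B_bar, C, adapter):
    """Sequential SSM recurrence with the adapter's control signal injected
    into each state update:  h_t = A_bar * h_{t-1} + B_bar @ x_t + u_t."""
    batch, seq_len, _ = x.shape
    d_state = A_bar.shape[0]
    h = x.new_zeros(batch, d_state)
    u = adapter(x)  # task-conditioned control signals
    ys = []
    for t in range(seq_len):
        # state-level adaptation: the control signal perturbs the state itself
        h = h * A_bar + x[:, t] @ B_bar.T + u[:, t]
        ys.append(h @ C.T)  # read out: (batch, d_model)
    return torch.stack(ys, dim=1)
```

Only the adapter's parameters would be trained; `A_bar`, `B_bar`, and `C` stand in for the frozen backbone's SSM parameters, which is how a small trainable-parameter budget like the reported ~5% arises.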
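
And a hedged sketch of what a dual-serialization consistency term could look like: the same point cloud is fed through the model under two serialization orders, and a symmetric KL loss pulls the two predictions together. The function name, the symmetric-KL choice, and the temperature are assumptions; the paper's exact distillation direction and loss may differ.

```python
import torch
import torch.nn.functional as F

def dual_serialization_consistency_loss(model, points, order_a, order_b,
                                        tau: float = 1.0):
    """Consistency term between two serializations of one point cloud.

    points:  (batch, num_points, 3) tensor of xyz coordinates.
    order_a, order_b: LongTensors of point indices giving two different
    serialization orders (e.g., from two space-filling curves).
    """
    logits_a = model(points[:, order_a])  # (batch, num_classes)
    logits_b = model(points[:, order_b])
    log_p_a = F.log_softmax(logits_a / tau, dim=-1)
    log_p_b = F.log_softmax(logits_b / tau, dim=-1)
    # symmetric KL between the two serializations' predictions
    kl_ab = F.kl_div(log_p_a, log_p_b, reduction="batchmean", log_target=True)
    kl_ba = F.kl_div(log_p_b, log_p_a, reduction="batchmean", log_target=True)
    return 0.5 * (kl_ab + kl_ba) * tau * tau
```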