
Pointy - A Lightweight Transformer for Point Cloud Foundation Models

arXiv cs.CV / 3/12/2026

📰 News · Models & Research

Key Points

  • Pointy introduces a lightweight transformer-based architecture for point cloud foundation models that reduces reliance on cross-modal supervision.
  • The model is trained on just 39k point clouds yet outperforms several larger foundation models trained on 200k+ samples, challenging data-volume assumptions.
  • The authors perform a comprehensive replication study with standardized training regimes to isolate architectural contributions and compare tokenizer-free backbones.
  • Results show simple backbones can approach state-of-the-art results achieved by data- and modality-rich models, highlighting the value of careful design.
  • The work provides open-source code, pre-trained models, and training protocols on GitHub for broader replication and use.

Abstract

Foundation models for point cloud data have recently grown in capability, often leveraging extensive representation learning from language or vision. In this work, we take a more controlled approach by introducing a lightweight transformer-based point cloud architecture. In contrast to the heavy reliance on cross-modal supervision, our model is trained on only 39k point clouds, yet it outperforms several larger foundation models trained on over 200k samples. Interestingly, our method approaches state-of-the-art results from models that have seen over a million point clouds, images, and text samples, demonstrating the value of a carefully curated training setup and architecture. To ensure rigorous evaluation, we conduct a comprehensive replication study that standardizes the training regime and benchmarks across multiple point cloud architectures. This unified experimental framework isolates the impact of architectural choices, allowing for transparent comparisons and highlighting the benefits of our design and other tokenizer-free architectures. Our results show that simple backbones can deliver results competitive with more complex or data-rich strategies. The implementation, including code, pre-trained models, and training protocols, is available at https://github.com/KonradSzafer/Pointy.
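To make the idea of a "tokenizer-free" point cloud backbone concrete, here is a minimal NumPy sketch of what such an architecture might look like: each raw xyz point is embedded directly with a linear layer (no patch grouping or voxel tokenizer stage), and the resulting per-point tokens pass through a single self-attention block before global pooling. All layer names, sizes, and the single-block depth are illustrative assumptions, not Pointy's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TinyPointTransformer:
    """Hypothetical sketch of a tokenizer-free point cloud backbone:
    points are embedded per-point, then mixed with self-attention.
    Dimensions and depth are illustrative, not from the paper."""

    def __init__(self, d=32, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (3, d))  # per-point linear embedding
        self.W_q = rng.normal(0, 0.1, (d, d))   # query projection
        self.W_k = rng.normal(0, 0.1, (d, d))   # key projection
        self.W_v = rng.normal(0, 0.1, (d, d))   # value projection
        self.W_o = rng.normal(0, 0.1, (d, d))   # output projection
        self.d = d

    def forward(self, points):
        # points: (N, 3) raw coordinates -> tokens: (N, d), no tokenizer stage
        x = points @ self.W_in
        q, k, v = x @ self.W_q, x @ self.W_k, x @ self.W_v
        attn = softmax(q @ k.T / np.sqrt(self.d))  # (N, N) attention weights
        x = x + (attn @ v) @ self.W_o              # residual self-attention
        return x.mean(axis=0)                      # pooled global feature (d,)

# Usage: encode a random cloud of 64 points into a 32-d global feature.
model = TinyPointTransformer()
feat = model.forward(np.random.default_rng(1).normal(size=(64, 3)))
print(feat.shape)
```

The point of the sketch is the contrast the paper draws: a conventional point cloud transformer would insert a tokenizer (e.g., farthest-point sampling plus local grouping) between the raw coordinates and the attention layers, whereas a tokenizer-free design feeds per-point embeddings straight into attention.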