ATLAS: An Annotation Tool for Long-horizon Robotic Action Segmentation
arXiv cs.AI / 4/30/2026
💬 Opinion · Developer Stack & Infrastructure · Tools & Practical Usage · Models & Research
Key Points
- The paper introduces ATLAS, an annotation tool designed specifically for long-horizon robotic action segmentation with accurate temporal boundaries.
- ATLAS enables time-synchronized visualization of multi-modal robot data, including multi-view video plus proprioceptive signals like gripper state and force/torque.
- It supports widely used robotics dataset formats (e.g., ROS bags and RLDS) and provides direct support for datasets such as REASSEMBLE, with an extensible modular layer for new formats.
- In experiments on a contact-rich assembly task, ATLAS cut average per-action annotation time by at least 6% versus ELAN, improved expert temporal alignment by over 2.8%, and reduced boundary error roughly fivefold compared with vision-only tools.
- The tool's keyboard-centric interface reduces annotation effort, speeding the production of labeled data for training and evaluating manipulation policy learning methods.
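The extensible format layer described above can be pictured as a small adapter interface: each backend (ROS bags, RLDS, and so on) normalizes its data into time-stamped, multi-modal frames that the viewer consumes uniformly. The sketch below is hypothetical — `SyncedFrame`, `DatasetAdapter`, and `ListAdapter` are illustrative names, not ATLAS's actual API.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Iterator, List, Protocol


@dataclass
class SyncedFrame:
    """One time-synchronized slice of multi-modal robot data (hypothetical)."""
    timestamp: float  # seconds since episode start
    video: Dict[str, bytes] = field(default_factory=dict)    # camera name -> encoded image
    proprio: Dict[str, Any] = field(default_factory=dict)    # e.g. gripper state, force/torque


class DatasetAdapter(Protocol):
    """Minimal interface a new-format backend would implement."""
    def frames(self) -> Iterator[SyncedFrame]: ...


class ListAdapter:
    """Toy backend serving pre-loaded frames in timestamp order."""
    def __init__(self, records: List[SyncedFrame]) -> None:
        self._records = sorted(records, key=lambda r: r.timestamp)

    def frames(self) -> Iterator[SyncedFrame]:
        yield from self._records


def nearest_frame(adapter: DatasetAdapter, t: float) -> SyncedFrame:
    """Return the frame whose timestamp is closest to t (e.g. for timeline scrubbing)."""
    return min(adapter.frames(), key=lambda f: abs(f.timestamp - t))


frames = [
    SyncedFrame(0.2, proprio={"gripper": "open"}),
    SyncedFrame(0.0, proprio={"gripper": "open"}),
    SyncedFrame(0.4, proprio={"gripper": "closed"}),
]
adapter = ListAdapter(frames)
print(nearest_frame(adapter, 0.35).timestamp)  # -> 0.4
```

A real backend would stream frames lazily from disk rather than hold them in memory, but the key design point survives: once every format maps into the same frame type, synchronized multi-view video and proprioceptive plots fall out of a single rendering path.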