SAMe: A Semantic Anatomy Mapping Engine for Robotic Ultrasound

arXiv cs.CV / 4/29/2026


Key Points

  • The paper introduces SAMe, a semantic anatomy mapping engine designed to give robotic ultrasound an explicit anatomical prior layer for better scan initiation.
  • SAMe converts under-specified clinical complaints into structured target organs, builds a patient-specific anatomical representation from a single external body image, and outputs 6-DoF probe initialization states.
  • The approach requires no additional registration with preoperative CT/MRI, simplifying the workflow while improving autonomy.
  • SAMe maintains an explicit, lightweight anatomical representation (single-organ inference in about 0.08 seconds) that is intended to be directly compatible with downstream robotic control.
  • In real-robot experiments, SAMe achieved high organ-hit rates—97.3% for liver and 81.7% for kidney—and outperformed a surface-heuristic baseline even under restricted conditions.
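The target-to-anatomy-to-action pipeline described above can be sketched in a few lines. This is an illustrative toy only, under assumed names (`ground_complaint`, `initialize_probe`, `ProbePose`, the keyword table, and the per-organ offsets are all hypothetical, not the paper's actual models or API):

```python
# Hypothetical sketch of SAMe's target-to-anatomy-to-action pipeline.
# All names and lookup tables here are illustrative placeholders.
from dataclasses import dataclass

# Toy complaint-to-organ grounding table (assumption, not from the paper).
COMPLAINT_KEYWORDS = {
    "right upper quadrant pain": ["liver"],
    "flank pain": ["kidney"],
}

@dataclass
class ProbePose:
    """Control-facing 6-DoF probe initialization state (position + orientation)."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

def ground_complaint(complaint: str) -> list[str]:
    """Step 1: ground an under-specified complaint into structured target organs."""
    organs: list[str] = []
    for phrase, targets in COMPLAINT_KEYWORDS.items():
        if phrase in complaint.lower():
            organs.extend(targets)
    return organs

def initialize_probe(organ: str, body_landmarks: dict) -> ProbePose:
    """Steps 2-3: instantiate a patient-specific organ estimate from external
    body-image landmarks, then emit a 6-DoF pose. The 'anatomical model' here
    is just a fixed offset per organ -- a stand-in for the real representation."""
    offsets = {"liver": (0.05, 0.10, 0.0), "kidney": (-0.08, 0.05, 0.0)}
    ox, oy, oz = offsets[organ]
    nx, ny, nz = body_landmarks["navel"]
    return ProbePose(nx + ox, ny + oy, nz + oz, 0.0, 0.0, 0.0)

targets = ground_complaint("Patient reports right upper quadrant pain")
pose = initialize_probe(targets[0], {"navel": (0.0, 0.0, 0.1)})
print(targets, (round(pose.x, 3), round(pose.y, 3)))
```

The point of the sketch is the interface shape: each stage produces an explicit, lightweight structure that the next stage (and ultimately the robot controller) can consume directly, which is what lets SAMe skip a separate CT/MRI registration step.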

Abstract

Robotic ultrasound has advanced local image-driven control, contact regulation, and view optimization, yet current systems lack the anatomical understanding needed to determine what to scan, where to begin, and how to adapt to individual patient anatomy. These gaps leave current systems reliant on expert intervention to initiate scanning. Here we present SAMe, a semantic anatomy mapping engine that provides robotic ultrasound with an explicit anatomical prior layer. SAMe addresses scan initiation as a target-to-anatomy-to-action process: it grounds under-specified clinical complaints into structured target organs, instantiates a patient-specific anatomical representation for the grounded targets from a single external body image, and translates this representation into control-facing 6-DoF probe initialization states without any additional registration to preoperative CT or MRI. The anatomical representation maintained by SAMe is explicit, lightweight (single-organ inference in 0.08 s), and compatible with downstream control by design. Across semantic grounding, anatomical instantiation, and real-robot evaluation, SAMe shows strong performance across the full initialization pipeline. In real-robot experiments, SAMe achieved overall organ-hit rates of 97.3% for liver initialization and 81.7% for kidney initialization across the evaluated target sets. Even when restricted to the centroid target, SAMe outperformed the surface-heuristic baseline for both liver and kidney initialization. These results establish an explicit anatomical prior layer that addresses scan initialization and is designed to support broader downstream autonomous scanning pipelines, providing the anatomical foundation for complaint-driven, anatomically informed robotic ultrasonography.