SAMe: A Semantic Anatomy Mapping Engine for Robotic Ultrasound
arXiv cs.CV · April 29, 2026
Key Points
- The paper introduces SAMe, a semantic anatomy mapping engine designed to give robotic ultrasound an explicit anatomical prior layer for better scan initiation.
- SAMe converts under-specified clinical complaints into structured target organs, builds a patient-specific anatomical representation from a single external body image, and outputs 6-DoF probe initialization states.
- The approach avoids relying on preoperative CT/MRI and the additional registration step it would require, aiming to simplify the workflow while improving autonomy.
- SAMe maintains an explicit, lightweight anatomical representation (single-organ inference in about 0.08 seconds) that is intended to be directly compatible with downstream robotic control.
- In real-robot experiments, SAMe achieved high organ-hit rates (97.3% for liver, 81.7% for kidney) and outperformed a surface-heuristic baseline even under restricted conditions.
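The three-stage pipeline described above (complaint → target organs → 6-DoF probe initialization) can be sketched as a minimal interface. Everything here is an illustrative assumption: the function names, the `ProbePose` type, and the keyword lookup table are hypothetical stand-ins, not the paper's actual semantic mapping engine.

```python
from dataclasses import dataclass

@dataclass
class ProbePose:
    """Hypothetical 6-DoF probe state: position (m) and orientation (rad)."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

# Illustrative complaint-to-organ lookup; the paper uses semantic mapping,
# not a fixed table like this.
COMPLAINT_TO_ORGANS = {
    "right upper quadrant pain": ["liver"],
    "flank pain": ["kidney"],
}

def resolve_target_organs(complaint: str) -> list[str]:
    """Stage 1: map an under-specified clinical complaint to target organs."""
    return COMPLAINT_TO_ORGANS.get(complaint.lower().strip(), [])

def initialize_probe(organ: str, body_image: object) -> ProbePose:
    """Stages 2-3 (stubbed): build a patient-specific anatomical representation
    from one external body image and emit a 6-DoF probe initialization state.
    A real system would infer the pose from the anatomical map; this stub
    returns a placeholder."""
    return ProbePose(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)

if __name__ == "__main__":
    organs = resolve_target_organs("Flank pain")
    poses = {organ: initialize_probe(organ, body_image=None) for organ in organs}
    print(organs)  # ['kidney']
```

The point of the sketch is the interface boundary: an explicit, lightweight anatomical layer whose output (a plain 6-DoF pose) is directly consumable by downstream robotic control, as the key points above emphasize.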