ParkingScenes: A Structured Dataset for End-to-End Autonomous Parking in Simulation Scenes
arXiv cs.CV / 4/28/2026
Key Points
- ParkingScenes is a new multimodal, structured dataset for end-to-end autonomous parking in simulation, created to address the lack of high-quality parking-specific data.
- The dataset is built on CARLA and pairs a Hybrid A* planner with a model predictive controller (MPC) to generate structured parking trajectories that provide accurate, reproducible supervision.
- It contains 16 reverse-in and 6 parallel parking scenarios, each run under two pedestrian conditions (present/absent); with each scenario–condition pair repeated a consistent number of times, the dataset totals 704 structured episodes and about 105,000 frames.
- Each frame includes synchronized inputs from four RGB cameras, four depth sensors, vehicle motion states, and BEV representations, supporting rich multimodal fusion.
- Experiments show that models trained on ParkingScenes significantly outperform those trained on unstructured, manually collected simulation data; both the dataset and the collection framework are planned for public release.
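
The per-frame structure described above (four RGB cameras, four depth sensors, vehicle motion state, and a BEV representation) can be sketched as a simple record type. This is a hypothetical illustration of how such a multimodal frame might be organized; the field names and camera placements are assumptions, not the dataset's actual schema.

```python
from dataclasses import dataclass
from typing import Dict, List

# Assumed camera/depth-sensor placements; the paper does not specify them here.
SENSOR_KEYS = ("front", "rear", "left", "right")

@dataclass
class ParkingFrame:
    """One synchronized frame, loosely modeled on the summary above."""
    timestamp: float
    rgb: Dict[str, list]          # 4 RGB images, keyed by sensor placement
    depth: Dict[str, list]        # 4 depth maps, same keys as rgb
    ego_state: Dict[str, float]   # vehicle motion state (speed, steering, ...)
    bev: List[list]               # bird's-eye-view representation

# Build a placeholder frame with empty sensor payloads.
frame = ParkingFrame(
    timestamp=0.05,
    rgb={k: [] for k in SENSOR_KEYS},
    depth={k: [] for k in SENSOR_KEYS},
    ego_state={"speed_mps": 0.8, "steer_rad": -0.3},
    bev=[],
)

# Synchronization invariant: every RGB view has a matching depth view.
assert set(frame.rgb) == set(frame.depth)
print(len(frame.rgb))
```

A loader built on a record like this would make the multimodal fusion the summary mentions straightforward: all four camera views, depth maps, and the motion state arrive under one timestamp.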