MotiMem: Motion-Aware Approximate Memory for Energy-Efficient Neural Perception in Autonomous Vehicles
arXiv cs.CV · March 31, 2026
Key Points
- The paper introduces MotiMem, a motion-aware approximate memory interface designed to reduce data-movement energy in neural perception systems for battery-constrained autonomous vehicles.
- It exploits temporal coherence via lightweight 2D motion propagation to dynamically identify regions of interest (RoIs), avoiding unnecessary movement of sensor data.
- A hybrid sparsity-aware coding approach uses adaptive inversion and truncation to create bit-level sparsity, further lowering memory-interface dynamic energy.
- Across nuScenes, Waymo, and KITTI using 16 detection models, MotiMem cuts memory-interface dynamic energy by about 43% while preserving roughly 93% of object detection accuracy.
- The authors report a new, improved energy–accuracy Pareto frontier compared with standard, semantically blind codecs such as JPEG and WebP.
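
The summary does not spell out MotiMem's exact coding scheme, but the "adaptive inversion and truncation" point can be illustrated with a classic combination: truncating low-order bits and bus-invert coding (inverting any byte with more set bits than clear bits, with a per-byte flag, so that transmitted words are mostly zeros). The sketch below is a hypothetical illustration of that general technique, not the paper's implementation; all function names are assumptions.

```python
import numpy as np

def encode_invert_truncate(pixels: np.ndarray, drop_bits: int = 2):
    """Sketch of sparsity-aware coding on a flat uint8 array:
    (1) truncate the low-order bits, (2) invert any byte whose
    popcount exceeds half its width, recording a per-byte flag.
    After inversion, every encoded byte has at most 4 set bits,
    which reduces switching activity on a memory interface."""
    truncated = (pixels.astype(np.uint8) >> drop_bits) << drop_bits
    # Popcount per byte via unpackbits -> (N, 8) bit matrix.
    ones = np.unpackbits(truncated[:, None], axis=1).sum(axis=1)
    invert = ones > 4                     # more than half of 8 bits set
    encoded = np.where(invert, np.bitwise_not(truncated), truncated)
    return encoded.astype(np.uint8), invert

def decode_invert_truncate(encoded: np.ndarray, invert: np.ndarray):
    """Undo the inversion; the dropped low-order bits are lossy."""
    return np.where(invert, np.bitwise_not(encoded), encoded).astype(np.uint8)
```

Decoding recovers the truncated values exactly (the scheme is lossy only in the dropped bits), and the guarantee that no encoded byte carries more than four set bits is what yields the bit-level sparsity the bullet describes.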



