RACF: A Resilient Autonomous Car Framework with Object Distance Correction

arXiv cs.RO / 4/15/2026


Key Points

  • RACF (Resilient Autonomous Car Framework) strengthens the robustness of the perception layer in autonomous driving by combining a depth camera, LiDAR, and physics-based kinematics in a redundant and diverse way.
  • When the obstacle distance estimated by the depth camera conflicts with the other sensors, a cross-sensor gate triggers ODCA (Object Distance Correction Algorithm) to correct the inconsistency.
  • Experiments on a Quanser QCar 2 testbed report up to a 35% RMSE reduction under strong corruption, along with improved stop compliance and braking latency, while maintaining real-time operation.
  • The key point is a lightweight, deployment-oriented approach to perception resilience, in contrast to conventional defenses against environmental degradation and adversarial perturbations, which tend to react too slowly.

Abstract

Autonomous vehicles are increasingly deployed in safety-critical applications, where sensing failures or cyber-physical attacks can lead to unsafe operation resulting in loss of life and/or severe physical damage. Reliable real-time perception is therefore critically important for their safe operation and acceptability. For example, vision-based distance estimation is vulnerable to environmental degradation and adversarial perturbations, and existing defenses are often reactive and too slow to promptly mitigate their impact on safe operation. We present a Resilient Autonomous Car Framework (RACF) that incorporates an Object Distance Correction Algorithm (ODCA) to improve perception-layer robustness through redundancy and diversity across a depth camera, LiDAR, and physics-based kinematics. Within this framework, when the obstacle distance estimate produced by the depth camera is inconsistent, a cross-sensor gate activates the correction algorithm to fix the detected inconsistency. We experiment with the proposed resilient car framework and evaluate its performance on a testbed implemented using the Quanser QCar 2 platform. The presented framework achieves up to a 35% RMSE reduction under strong corruption and improves stop compliance and braking latency while operating in real time. These results demonstrate a practical and lightweight approach to resilient perception for safety-critical autonomous driving.
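To make the gate-then-correct idea concrete, here is a minimal sketch of a cross-sensor consistency check in the spirit of the abstract: the camera's distance estimate is trusted only if it agrees with at least one redundant source (LiDAR or a physics-based kinematic prediction), and is otherwise replaced by a value derived from the trusted sources. All function names, the 1 m gate threshold, and the averaging rule are illustrative assumptions, not the paper's actual ODCA.

```python
# Hypothetical sketch of a cross-sensor gate with distance correction.
# Names, thresholds, and the fusion rule are assumptions for illustration,
# not the algorithm from the RACF paper.

def kinematic_prediction(prev_distance: float, ego_speed: float, dt: float) -> float:
    """Physics-based prediction: assuming a static obstacle, the gap
    closes at the ego vehicle's speed over one time step."""
    return prev_distance - ego_speed * dt

def corrected_distance(cam_d: float, lidar_d: float, prev_d: float,
                       ego_speed: float, dt: float, gate_m: float = 1.0) -> float:
    """Return the camera estimate if it is consistent with LiDAR or the
    kinematic prediction; otherwise fall back to a corrected value."""
    pred_d = kinematic_prediction(prev_d, ego_speed, dt)
    # Cross-sensor gate: the camera must agree (within gate_m meters)
    # with at least one redundant source to be trusted.
    if abs(cam_d - lidar_d) <= gate_m or abs(cam_d - pred_d) <= gate_m:
        return cam_d
    # Camera deemed corrupted: average the trusted redundant sources.
    return 0.5 * (lidar_d + pred_d)

# Example: the camera reads 12.0 m, but LiDAR sees 6.1 m and kinematics
# predicts 6.0 m, so the camera is gated out and the estimate corrected.
d = corrected_distance(cam_d=12.0, lidar_d=6.1, prev_d=6.5,
                       ego_speed=5.0, dt=0.1)
```

The design choice this sketch highlights is that the gate is cheap (a few comparisons per frame), which is consistent with the article's emphasis on lightweight, real-time operation.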