BarbieGait: An Identity-Consistent Synthetic Human Dataset with Versatile Cloth-Changing for Gait Recognition
arXiv cs.CV, April 15, 2026
Key Points
- The paper introduces BarbieGait, a synthetic human gait dataset that maps real individuals into a virtual engine to generate extensive clothing variations while maintaining identity-consistent gait information for cross-clothing evaluation.
- By generating large volumes of synthetic training and benchmark data under controlled conditions, BarbieGait addresses a key real-world limitation: clothing-induced variation is hard to verify and measure with real captures alone.
- The authors identify cross-clothing gait recognition as a cloth-invariance challenge and propose GaitCLIF, a baseline model designed to learn robust cloth-invariant features.
- Experiments show that GaitCLIF substantially improves cross-clothing performance on BarbieGait as well as on existing popular gait recognition benchmarks.
- The work positions BarbieGait and the GaitCLIF baseline as a foundation for further progress in gait recognition under clothing changes, potentially advancing research in cloth-robust biometrics.
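
The cross-clothing evaluation the bullets describe typically works by enrolling each subject in one clothing condition (gallery) and matching probe sequences recorded in different clothing. The sketch below illustrates that protocol with a rank-1 accuracy computation over embeddings; it is a minimal illustration, not the paper's actual pipeline, and the function name `rank1_cross_clothing` and the toy data are assumptions.

```python
import numpy as np

def rank1_cross_clothing(gallery_feats, gallery_ids, probe_feats, probe_ids):
    """Rank-1 accuracy: each probe embedding (a subject in unseen clothing)
    is matched to the nearest gallery embedding by cosine similarity."""
    # L2-normalize so the dot product equals cosine similarity.
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    p = probe_feats / np.linalg.norm(probe_feats, axis=1, keepdims=True)
    sims = p @ g.T                     # shape: (num_probe, num_gallery)
    nearest = np.argmax(sims, axis=1)  # best gallery match per probe
    return float(np.mean(gallery_ids[nearest] == probe_ids))

# Toy example: 3 subjects with 4-D embeddings; probes are slightly
# perturbed copies of the gallery embeddings, standing in for
# clothing-induced appearance change around a stable gait signature.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 4))
probes = gallery + 0.05 * rng.normal(size=(3, 4))
ids = np.array([0, 1, 2])
acc = rank1_cross_clothing(gallery, ids, probes, ids)
```

A cloth-invariant model in the spirit of GaitCLIF would aim to keep `acc` high even when the clothing perturbation is large, which is what a synthetic dataset with controlled clothing swaps lets one measure directly.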
