BarbieGait: An Identity-Consistent Synthetic Human Dataset with Versatile Cloth-Changing for Gait Recognition

arXiv cs.CV / 4/15/2026


Key Points

  • The paper introduces BarbieGait, a synthetic human gait dataset that maps real individuals into a virtual engine to generate extensive clothing variations while maintaining identity-consistent gait information for cross-clothing evaluation.
  • By controllably generating large amounts of synthetic training/benchmark data, BarbieGait aims to address a key real-world limitation: verifying and measuring clothing-induced variations is difficult with real captures alone.
  • The authors identify cross-clothing gait recognition as a cloth-invariance challenge and propose GaitCLIF, a baseline model designed to learn robust cloth-invariant features.
  • Experiments show that GaitCLIF substantially improves cross-clothing performance on BarbieGait and also on existing popular gait recognition benchmarks.
  • The work positions BarbieGait and the associated modeling baseline as enabling further progress in gait recognition under clothing changes, potentially advancing research in cloth-robust biometrics.
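To make the cross-clothing evaluation concept above concrete, here is a minimal sketch of the standard protocol for this setting: gallery sequences are enrolled under one clothing condition, probes come from a different condition, and rank-1 accuracy counts how often a probe's nearest gallery embedding shares its identity. This is a generic illustration under assumed toy data, not the paper's actual code; `rank1_cross_clothing` and the synthetic "gait signature" features are hypothetical.

```python
import numpy as np

def rank1_cross_clothing(gallery_feats, gallery_ids, probe_feats, probe_ids):
    """Rank-1 accuracy: each probe is assigned the identity of its
    nearest gallery embedding under cosine similarity."""
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    p = probe_feats / np.linalg.norm(probe_feats, axis=1, keepdims=True)
    sims = p @ g.T                     # (num_probes, num_gallery) similarities
    nearest = np.argmax(sims, axis=1)  # best gallery match per probe
    return float(np.mean(gallery_ids[nearest] == probe_ids))

# Toy setup: 3 identities, gallery captured in clothing A, probes in clothing B.
rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 8))                   # one "gait signature" per identity
gallery = centers + 0.05 * rng.normal(size=(3, 8))  # clothing-A captures
probes  = centers + 0.05 * rng.normal(size=(3, 8))  # clothing-B captures
ids = np.array([0, 1, 2])

acc = rank1_cross_clothing(gallery, ids, probes, ids)
```

In a real cloth-changing benchmark the gap between same-clothing and cross-clothing accuracy under this protocol is exactly the variation a cloth-invariant feature learner such as the proposed baseline is meant to close.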

Abstract

Gait recognition, as a reliable biometric technology, has seen rapid development in recent years, yet it faces significant challenges caused by the diverse clothing styles found in the real world. This paper introduces BarbieGait, a synthetic gait dataset in which real-world subjects are uniquely mapped into a virtual engine to simulate extensive clothing changes while preserving their gait identity information. As a pioneering work, BarbieGait provides a controllable gait data generation method, enabling the production of large datasets to validate cross-clothing issues that are difficult to verify with real-world data. However, the diversity of clothing increases intra-class variance, making it one of the biggest challenges to learn cloth-invariant features under varying clothing conditions. Therefore, we propose GaitCLIF (Gait-oriented CLoth-Invariant Feature) as a robust baseline model for cross-clothing gait recognition. Through extensive experiments, we validate that our method significantly improves cross-clothing performance on BarbieGait and on existing popular gait benchmarks. We believe that BarbieGait, with its extensive cross-clothing gait data, will further advance the capabilities of gait recognition in cross-clothing scenarios and promote progress in related research.