Cloth-HUGS: Cloth-Aware Human Gaussian Splatting

arXiv cs.CV / 4/20/2026


Key Points

  • Cloth-HUGS is a Gaussian Splatting–based neural rendering framework aimed at photorealistic reconstruction of clothed humans by explicitly separating body and clothing rather than merging them into a single representation.
  • The method represents body and cloth with separate Gaussian layers in a shared canonical space and deforms them using SMPL-driven articulation with learned linear blend skinning weights.
  • To make loose garments and their deformations more realistic, it initializes cloth Gaussians from mesh topology and applies physics-inspired constraints: simulation consistency, as-rigid-as-possible (ARAP) regularization, and mask supervision.
  • It uses a depth-aware multi-pass rendering approach to robustly composite body, cloth, and scene, achieving real-time performance at over 60 FPS.
  • Experiments report improved perceptual quality and geometric fidelity over state-of-the-art methods, including up to a 28% reduction in LPIPS and temporally coherent cloth dynamics.
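The SMPL-driven articulation in the second bullet can be sketched as standard linear blend skinning applied to Gaussian centers: each canonical-space Gaussian blends the rigid transforms of the skeleton joints using its learned skinning weights. A minimal illustration, assuming the paper's per-Gaussian weights and per-joint transforms are available as arrays (all names here are illustrative, not from the paper's code):

```python
import numpy as np

def deform_gaussians(mu_canonical, skinning_weights, joint_transforms):
    """Warp canonical Gaussian centers into posed space via linear
    blend skinning (LBS), as in SMPL-style articulation.

    mu_canonical     : (N, 3) canonical-space Gaussian centers
    skinning_weights : (N, J) learned per-Gaussian weights, rows sum to 1
    joint_transforms : (J, 4, 4) rigid transform of each joint for the pose
    """
    # Blend per-joint transforms into one 4x4 matrix per Gaussian: (N, 4, 4)
    blended = np.einsum('nj,jab->nab', skinning_weights, joint_transforms)
    # Apply the blended transforms in homogeneous coordinates
    ones = np.ones((mu_canonical.shape[0], 1))
    mu_h = np.concatenate([mu_canonical, ones], axis=1)          # (N, 4)
    mu_posed = np.einsum('nab,nb->na', blended, mu_h)[:, :3]     # (N, 3)
    return mu_posed
```

Because the weights are learned rather than copied from the SMPL template, cloth Gaussians far from the body surface can acquire skinning weights better suited to loose garments.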

Abstract

We present Cloth-HUGS, a Gaussian Splatting–based neural rendering framework for photorealistic clothed human reconstruction that explicitly disentangles body and clothing. Unlike prior methods that absorb clothing into a single body representation and struggle with loose garments and complex deformations, Cloth-HUGS represents the performer using separate Gaussian layers for body and cloth within a shared canonical space. The canonical volume jointly encodes body, cloth, and scene primitives and is deformed through SMPL-driven articulation with learned linear blend skinning weights. To improve cloth realism, we initialize cloth Gaussians from mesh topology and apply physics-inspired constraints, including simulation consistency, ARAP regularization, and mask supervision. We further introduce a depth-aware multi-pass rendering strategy for robust body-cloth-scene compositing, enabling real-time rendering at over 60 FPS. Experiments on multiple benchmarks show that Cloth-HUGS improves perceptual quality and geometric fidelity over state-of-the-art baselines, reducing LPIPS by up to 28% while producing temporally coherent cloth dynamics.
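The depth-aware multi-pass compositing described in the abstract amounts to rendering body, cloth, and scene in separate passes and merging them per pixel in depth order with front-to-back alpha blending. A minimal sketch under that interpretation (the paper composites Gaussian splats; here the passes are simplified to per-layer color, alpha, and depth buffers, and all names are illustrative):

```python
import numpy as np

def composite_layers(colors, alphas, depths):
    """Per-pixel, front-to-back alpha compositing of rendered layers
    (e.g. body, cloth, scene), ordered by their depth buffers.

    colors : (L, H, W, 3) per-layer RGB
    alphas : (L, H, W)    per-layer opacity in [0, 1]
    depths : (L, H, W)    per-layer depth (smaller = nearer)
    """
    # Sort the layers independently at every pixel, nearest first
    order = np.argsort(depths, axis=0)
    c = np.take_along_axis(colors, order[..., None], axis=0)
    a = np.take_along_axis(alphas, order, axis=0)

    # Accumulate front-to-back: each layer contributes its color
    # weighted by its opacity times the remaining transmittance
    out = np.zeros(colors.shape[1:])
    transmittance = np.ones(alphas.shape[1:])
    for layer in range(colors.shape[0]):
        out += (transmittance * a[layer])[..., None] * c[layer]
        transmittance *= 1.0 - a[layer]
    return out
```

Sorting per pixel rather than per layer is what makes the compositing robust when cloth and body interleave in depth, e.g. a skirt hem passing in front of one leg and behind the other.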
