CLOTH-HUGS: Cloth Aware Human Gaussian Splatting
arXiv cs.CV / 4/20/2026
Key Points
- Cloth-HUGS is a Gaussian Splatting–based neural rendering framework aimed at photorealistic reconstruction of clothed humans by explicitly separating body and clothing rather than merging them into a single representation.
- The method represents body and cloth with separate Gaussian layers in a shared canonical space and deforms them using SMPL-driven articulation with learned linear blend skinning weights.
- To make loose garments and their deformations more realistic, it initializes cloth Gaussians from mesh topology and applies physics-inspired constraints such as simulation consistency, as-rigid-as-possible (ARAP) regularization, and mask supervision.
- It uses a depth-aware multi-pass rendering approach to robustly composite body, cloth, and scene, achieving real-time performance at over 60 FPS.
- Experiments report improved perceptual quality and geometric fidelity over state-of-the-art methods, including up to a 28% reduction in LPIPS and temporally coherent cloth dynamics.
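The SMPL-driven articulation described above blends per-joint rigid transforms by learned skinning weights to move each canonical-space Gaussian. A minimal sketch of that linear blend skinning step is below; the function name, array shapes, and toy values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lbs_deform(points, weights, joint_transforms):
    """Deform canonical-space points with linear blend skinning (illustrative).

    points:           (N, 3) Gaussian centers in canonical space
    weights:          (N, J) per-point skinning weights (rows sum to 1)
    joint_transforms: (J, 4, 4) homogeneous rigid transform per joint
    """
    # Blend the per-joint transforms by the skinning weights: (N, 4, 4)
    blended = np.einsum('nj,jab->nab', weights, joint_transforms)
    # Apply each point's blended transform in homogeneous coordinates
    homo = np.concatenate([points, np.ones((len(points), 1))], axis=1)  # (N, 4)
    deformed = np.einsum('nab,nb->na', blended, homo)
    return deformed[:, :3]

# Toy example: two points, two joints; joint 1 translates +0.5 in x
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
w = np.array([[1.0, 0.0], [0.0, 1.0]])  # each point bound to one joint
T = np.stack([np.eye(4), np.eye(4)])
T[1, 0, 3] = 0.5
out = lbs_deform(pts, w, T)  # point 1 moves to x = 1.5
```

In the paper's setting the weights would be learned rather than fixed, and the same blended transforms would also rotate each Gaussian's covariance, which this sketch omits.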