InterPhys: Physics-aware Human Motion Synthesis in a Dynamic Scene

arXiv cs.CV / 5/5/2026


Key Points

  • The paper presents a physics-aware framework for human motion synthesis in dynamic scenes, aiming to reduce physically implausible outputs caused by limited contact modeling in prior work.
  • It explicitly models forces across multiple interaction types, including human–object, human–scene, and internal body dynamics, and enforces force/torque balance via soft physical constraints.
  • A new continuous, distance-based force model is introduced to generalize contact modeling across arbitrary surfaces, enabling interactions with both static and moving objects.
  • Experiments indicate the method improves physical plausibility and generalization in complex scenes, and the authors claim it establishes a new benchmark for physically consistent human motion generation.
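To make the "soft physical constraints" idea concrete: instead of hard-enforcing Newton–Euler balance, one can penalize the squared residual of the net force and net torque acting on the body, and add that penalty to the training or optimization loss. The paper's exact formulation is not reproduced here; the function below is a minimal illustrative sketch (all names and parameters are hypothetical), assuming contact forces are known at a set of world-space contact points.

```python
import numpy as np

def balance_penalty(contact_points, contact_forces, com, mass, g=9.81):
    """Illustrative soft-constraint penalty for force/torque balance.

    contact_points: (N, 3) world positions where external forces act
    contact_forces: (N, 3) forces applied to the body at those points
    com:            (3,)   center of mass of the body
    mass:           body mass in kg

    Returns 0 only when all forces and torques about the CoM balance;
    otherwise a quadratic penalty that can be added to a loss.
    """
    gravity = np.array([0.0, 0.0, -mass * g])          # weight at the CoM
    net_force = contact_forces.sum(axis=0) + gravity   # Newton residual
    # Euler residual: torques of contact forces about the CoM
    # (gravity acts at the CoM, so it contributes no torque)
    net_torque = np.cross(contact_points - com, contact_forces).sum(axis=0)
    return float(net_force @ net_force + net_torque @ net_torque)
```

For a 70 kg body standing symmetrically on two feet, each supporting half the weight, the penalty is essentially zero; removing one support makes it large, pushing the synthesized motion toward physically balanced poses.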

Abstract

This paper tackles the problem of physics-aware human motion synthesis in a dynamic scene. Unlike existing works, which tend to generate physically unrealistic motions due to limited contact modeling (typically restricted to the hands), we introduce a physics-aware human motion generation framework that explicitly models the full spectrum of human-related forces, including human–object, human–scene, and internal body dynamics. Our method imposes soft physical constraints to maintain force and torque balance, ensuring physically grounded motion synthesis. We further propose a novel continuous, distance-based force model that generalizes contact modeling to arbitrary surfaces, capturing interactions not only with static environments but also with dynamic, moving objects. Extensive experiments show that our approach significantly improves physical plausibility and generalizes well to complex scenes, setting a new benchmark for physically consistent human motion generation.
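The "continuous distance-based force model" can be understood as replacing a binary on/off contact indicator with a force whose magnitude varies smoothly with the signed distance to a surface, so the same formula applies to any geometry (static or moving) that exposes a signed distance and a surface normal, and gradients exist everywhere. The paper's actual model is not reproduced here; the sketch below uses a softplus-style activation as one plausible choice (the function name, stiffness, and smoothing width are assumptions for illustration).

```python
import numpy as np

def contact_force(signed_dist, normal, stiffness=1000.0, eps=0.01):
    """Illustrative continuous, distance-based contact force.

    signed_dist: signed distance from a body point to the surface
                 (positive outside, negative when penetrating)
    normal:      (3,) outward surface normal at the closest point
    Returns a force vector along the normal whose magnitude decays
    smoothly to zero as the point moves away from the surface.
    """
    # Softplus activation: ~0 far from the surface, growing smoothly
    # as the point approaches and penetrates it (continuous in d).
    magnitude = stiffness * eps * np.log1p(np.exp(-signed_dist / eps))
    return magnitude * np.asarray(normal, dtype=float)
```

Because the force is defined for every distance rather than only at detected contacts, the same expression covers feet on the ground, a hand on a moving object, or full-body contact with arbitrary scene surfaces, which is what lets the model generalize beyond hand-only contact handling.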