Near-Exhaustive Precomputation of Secondary Cloth Effects
Visualization of a 99,352-frame cloth motion graph generated using over 4,554 CPU hours of precomputation. Each path from the root to a leaf of the tree corresponds to a simulated trajectory. Cloth trajectories are colored by their corresponding character motion. Light gray "back-links" complete the motion graph structure.
Four cloth poses from a cartwheel animation. The entire compressed cloth database needed at runtime has only a 66 MB memory footprint. Cloth animation can easily be played back in real time on low-end devices.

Doyub Kim, CMU
Woojong Koh, UC Berkeley
Rahul Narain, UC Berkeley
Kayvon Fatahalian, CMU
Adrien Treuille, CMU
James O'Brien, UC Berkeley

ACM Transactions on Graphics, 32(4):87:1-7, July 2013.
Proceedings of ACM SIGGRAPH 2013, Anaheim.

Abstract

The central argument against data-driven methods in computer graphics rests on the curse of dimensionality: it is intractable to precompute "everything" about a complex space. In this paper, we challenge that assumption by using several thousand CPU-hours to perform a massive exploration of the space of secondary clothing effects on a character animated through a large motion graph. Our system continually explores the phase space of cloth dynamics, incrementally constructing a secondary cloth motion graph that captures the dynamics of the system. We find that it is possible to sample the dynamical space to a low visual error tolerance and that secondary motion graphs containing tens of gigabytes of raw mesh data can be compressed down to only tens of megabytes. These results allow us to capture the effect of high-resolution, off-line cloth simulation for a rich space of character motion and deliver it efficiently as part of an interactive application.
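The abstract's core idea, playing back precomputed cloth states by walking a motion graph at runtime, can be illustrated with a minimal sketch. Everything below is hypothetical: the node names, the adjacency structure, and the `play` function are illustrative assumptions, not the authors' actual data layout or code.

```python
import random

# Hypothetical sketch (not the paper's implementation): each node names a
# precomputed cloth frame; edges are transitions discovered during
# precomputation, including "back-links" that return to earlier states
# and close the graph.
graph = {
    "idle_0": ["idle_1"],
    "idle_1": ["idle_0", "walk_0"],  # back-link to idle_0
    "walk_0": ["walk_1"],
    "walk_1": ["idle_0"],            # back-link closes the cycle
}

def play(start, steps, rng=random.Random(0)):
    """Walk the graph, returning the sequence of cloth frames to display.

    At runtime the application only looks up successors and streams the
    corresponding precomputed cloth meshes; no simulation is performed.
    """
    frames, node = [start], start
    for _ in range(steps):
        node = rng.choice(graph[node])  # any valid successor is playable
        frames.append(node)
    return frames
```

In the real system the successor would be chosen to match the character's current motion clip rather than at random, and each node would index into the compressed cloth database rather than store a mesh directly.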

Video

View MP4 (132 MB)

Paper

Download Paper (pdf, 5 MB)

Press

TechCrunch: Researchers Create Near-Exhaustive, Ultra-Realistic Cloth Simulation

CNET: Computers sweat for 4,554 hours to simulate cloth movement