View-Dependent Adaptive Cloth Simulation with Buckling Compensation

Woojong Koh, Rahul Narain, James F. O'Brien

Research output: Contribution to journal › Article › peer-review


Abstract

This paper describes a method for view-dependent cloth simulation using dynamically adaptive mesh refinement and coarsening. Given a prescribed camera motion, the method adjusts the criteria controlling refinement to account for visibility and apparent size in the camera's view. Objectionable dynamic artifacts are avoided by anticipative refinement and smoothed coarsening, while locking in extremely coarsened regions is inhibited by modifying the material model to compensate for unresolved sub-element buckling. This approach preserves the appearance of detailed cloth throughout the animation while avoiding the wasted effort of simulating details that would not be discernible to the viewer. The computational savings realized by this method increase as scene complexity grows. The approach produces a 2× speed-up for a single character and more than 4× for a small group as compared to view-independent adaptive simulations, and respectively 5× and 9× speed-ups as compared to non-adaptive simulations.
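The view-dependent refinement criterion summarized in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example assuming a pinhole camera with focal length given in pixels; the function name, back-facing test, and constants are illustrative and are not the exact criteria used in the paper.

```python
import numpy as np

def view_dependent_edge_target(face_center, face_normal,
                               cam_pos, cam_dir, focal_px,
                               target_px=4.0, invisible_scale=8.0,
                               base_target=0.05):
    """Pick a target edge length (world units) for a cloth face based on
    its apparent size and visibility in the camera view.

    Hypothetical sketch: parameter names and constants are illustrative,
    not the criteria from the paper itself.
    """
    to_face = face_center - cam_pos
    depth = float(np.dot(to_face, cam_dir))            # distance along the view axis
    facing = float(np.dot(face_normal, -to_face)) > 0.0

    # Behind the camera or facing away: relax the target so the remesher
    # coarsens these regions aggressively.
    if depth <= 0.0 or not facing:
        return invisible_scale * base_target

    # World-space edge length whose projection spans roughly target_px
    # pixels under a pinhole model (projected size ~ focal_px * length / depth).
    view_target = target_px * depth / focal_px

    # Never refine beyond the view-independent base criterion.
    return max(view_target, base_target)
```

In such a scheme, faces that are far from the camera or outside its view receive coarser targets while visible, nearby faces keep the full resolution, which is the source of the speed-ups reported in the abstract.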

Original language: English (US)
Article number: 7127098
Pages (from-to): 1138-1145
Number of pages: 8
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 21
Issue number: 10
DOIs
State: Published - Oct 1 2015

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

Keywords

  • Physically based modeling
  • adaptive remeshing
  • animation
  • cloth simulation

