HairControl: A Tracking Solution for Directable Hair Simulation
We present a method for adding artistic control to physics-based hair simulation.
July 11, 2018
ACM SIGGRAPH / Eurographics Symposium on Computer Animation (SCA) 2018
Authors
Antoine Milliez (Disney Research/ETH Joint PhD)
Robert W. Sumner (Disney Research/ETH Zurich)
Markus Gross (Disney Research/ETH Zurich)
Bernhard Thomaszewski (Disney Research/Université de Montréal)
Abstract
Taking as input an animation of a coarse set of guide hairs, we constrain a subsequent higher-resolution simulation of detail hairs to follow the input motion in a spatially averaged sense. The resulting high-resolution motion adheres to the artistic intent but is enriched with detailed deformations and dynamics generated by physics-based simulation. The technical core of our approach is a set of tracking constraints that require the center of mass of a given subset of detail hairs to maintain its position relative to a reference point on the corresponding guide hair. As a crucial element of our formulation, we introduce the concept of dynamically changing constraint targets, which allow reference points to slide along the guide hairs and thus provide sufficient flexibility for natural deformations. We furthermore propose to regularize the null space of the tracking constraints through variance minimization, effectively controlling the amount of spread in the hair. We demonstrate the ability of our tracking solver to generate directable yet natural hair motion on a set of targeted experiments and show its application to production-level animations.
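To make the two central quantities concrete, the sketch below shows one plausible way to evaluate a center-of-mass tracking residual and a mass-weighted spread (variance) measure for a single group of detail-hair vertices. It is an illustrative sketch only, not the authors' solver: the function names, the NumPy data layout, and the assumption that the guide-hair reference point r has already been obtained (e.g. by evaluating the guide curve at a sliding arc-length parameter, the paper's dynamically changing constraint target) are ours.

```python
"""Illustrative sketch of the quantities described in the abstract.

Assumes one tracked group of detail-hair vertices with positions x (n x 3)
and per-vertex masses m (n,), plus a reference point r on the corresponding
guide hair computed elsewhere. Not the authors' implementation.
"""
import numpy as np


def center_of_mass(x: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Mass-weighted mean of the detail-hair vertices in the group."""
    return (m[:, None] * x).sum(axis=0) / m.sum()


def tracking_residual(x: np.ndarray, m: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Residual of one tracking constraint: the group's center of mass
    should coincide with the reference point r on the guide hair."""
    return center_of_mass(x, m) - r


def spread_variance(x: np.ndarray, m: np.ndarray) -> float:
    """Mass-weighted variance of the group about its center of mass.
    Minimizing a quantity of this kind is one way to regularize the
    constraint null space and control how much the hair spreads."""
    c = center_of_mass(x, m)
    d = x - c
    return float((m * (d * d).sum(axis=1)).sum() / m.sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 3))       # detail-hair vertices in one group (hypothetical data)
    m = np.full(8, 0.1)               # per-vertex masses
    r = np.array([0.0, 1.0, 0.0])     # reference point on the guide hair
    print("tracking residual:", tracking_residual(x, m, r))
    print("spread variance:  ", spread_variance(x, m))
```

In a constrained simulation, a residual of this form would be driven to zero for every tracked group at every frame, while the variance term acts only in the directions left free by the constraints.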