Shaping Strands with Neural Style Transfer

In this paper, we propose the first stylization pipeline to support hair and fur. Through a carefully tailored fur/hair representation, our approach enables complex, 3D-consistent, and temporally coherent grooms that are stylized using style images.

December 14, 2025
ACM SIGGRAPH Asia (2025)

 

Authors

Beyzanur Coban (ETH Zurich)

Pascal Chang (ETH Zurich / DisneyResearch|Studios)

Guilherme G. Haetinger (ETH Zurich / DisneyResearch|Studios)

Jingwei Tang (DisneyResearch|Studios)

Vinicius Azevedo (DisneyResearch|Studios)

Shaping Strands with Neural Style Transfer

Abstract

The intricate geometric complexity of knots, tangles, dreads, and clumps requires sophisticated grooming systems that allow artists to both realistically model and artistically control fur and hair. Recent volumetric and 3D neural style transfer techniques have introduced a new paradigm of art directability, allowing artists to drastically modify assets using single style images. However, these previous 3D neural stylization approaches have been limited to volumes and meshes. In this paper, we propose the first stylization pipeline to support hair and fur. Through a carefully tailored fur/hair representation, our approach enables complex, 3D-consistent, and temporally coherent grooms that are stylized using style images.
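As general background only, and not a reproduction of the paper's strand-based pipeline, image-space neural style transfer is typically driven by matching Gram-matrix statistics of pretrained VGG features between an output and a style image. A minimal PyTorch sketch of such a style loss, assuming torch and torchvision are available, might look like this:

```python
# Generic Gram-matrix style loss (Gatys-style), shown as background only.
# This is NOT the paper's fur/hair stylization method; layer choices and
# normalization are illustrative assumptions.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

def gram_matrix(feat):
    # feat: (B, C, H, W) feature map -> (B, C, C) channel correlations
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_loss(output_img, style_img, layers=(1, 6, 11, 20)):
    # Compare Gram statistics at relu1_1, relu2_1, relu3_1, relu4_1 of VGG-19.
    # Inputs are expected to be (B, 3, H, W) tensors, ImageNet-normalized.
    vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.eval()
    losses = []
    x, s = output_img, style_img
    for i, layer in enumerate(vgg):
        x, s = layer(x), layer(s)
        if i in layers:
            losses.append(F.mse_loss(gram_matrix(x), gram_matrix(s)))
        if i >= max(layers):
            break
    return sum(losses)
```

In image-based pipelines, a loss of this form is minimized with respect to the output pixels (or, in 3D settings, with respect to scene or asset parameters) via gradient descent.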

Copyright Notice