Controllable Neural Style Transfer for Dynamic Meshes
July 28th, 2024
SIGGRAPH 2024
Authors
Guilherme G. Haetinger (DisneyResearch|Studios)
Jingwei Tang (DisneyResearch|Studios)
Raphael Ortiz (DisneyResearch|Studios)
Paul Kanyuk (Pixar Animation Studios)
Vinicius Azevedo (DisneyResearch|Studios)
In recent years, animated films have been shifting from realistic representations to more stylized depictions that support unique design languages. To support this trend, recent works introduced Neural Style Transfer (NST) pipelines that stylize 3D assets from 2D exemplar images. In this paper, we propose a novel mesh stylization technique that improves on previous NST works in several ways. First, we replace the standard Gram-matrix style loss with a Neural Neighbor formulation that yields sharper, artifact-free results. To support large mesh deformations, we reparametrize the optimized mesh positions through an implicit formulation based on the Laplace-Beltrami operator, which better captures the silhouette gradients common in inverse differentiable rendering setups. This reparametrization is coupled with a coarse-to-fine stylization scheme, enabling deformations that alter large structures of the mesh. We provide artistic control through a novel method that offers directional and temporal control over the synthesized style via a guiding vector field. Lastly, we improve on previous time-coherency schemes and develop an efficient regularization that controls volume changes during the stylization process. These improvements enable high-quality mesh stylizations that can create unique looks for both simulations and 3D assets.
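To make the first contribution more concrete, the snippet below is a minimal PyTorch sketch of a nearest-neighbor feature-matching style loss in the spirit of a Neural Neighbor formulation: each feature of the rendered views is matched to its closest style feature instead of matching global Gram statistics. This is not the paper's implementation; the tensor shapes, the cosine-similarity matching rule, and the function name are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def neural_neighbor_loss(feat_out, feat_style):
    """Hypothetical nearest-neighbor style loss.

    feat_out:   (N, C) features extracted from rendered, stylized views
    feat_style: (M, C) features extracted from the 2D style exemplar
    """
    # Cosine similarity between every output feature and every style feature.
    out_n = F.normalize(feat_out, dim=1)      # (N, C)
    style_n = F.normalize(feat_style, dim=1)  # (M, C)
    sim = out_n @ style_n.t()                 # (N, M)

    # Hard nearest-neighbor assignment; no gradient flows through the argmax.
    with torch.no_grad():
        idx = sim.argmax(dim=1)               # (N,)

    # Pull each output feature towards its matched style feature.
    target = feat_style[idx]                  # (N, C)
    return (1.0 - F.cosine_similarity(feat_out, target, dim=1)).mean()
```

The implicit Laplace-Beltrami reparametrization can likewise be read as optimizing a latent variable u and recovering vertex positions by a differentiable linear solve, so that noisy per-vertex silhouette gradients are diffused over the surface. The sketch below assumes x = (I + λL)⁻¹ u with a dense Laplacian L and a hypothetical smoothing weight lam; this is only one plausible reading of the abstract, not the authors' exact formulation.

```python
import torch

def positions_from_latent(u, L, lam=10.0):
    """Hypothetical implicit reparametrization of vertex positions.

    u:   (V, 3) latent coordinates being optimized
    L:   (V, V) Laplace-Beltrami (cotangent) matrix, dense here for brevity
    lam: assumed smoothing weight controlling how far gradients are diffused
    """
    V = L.shape[0]
    A = torch.eye(V, dtype=L.dtype, device=L.device) + lam * L
    # Differentiable solve: gradients w.r.t. the recovered positions are
    # propagated back to u through (I + lam * L)^{-1}.
    return torch.linalg.solve(A, u)  # (V, 3) vertex positions
```

In such a setup, u would typically be initialized as (I + lam * L) @ x0 so the starting shape is unchanged, and the optimizer then updates u while the style loss is evaluated on the solved positions.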