Walk the Talk: Coordinating Gesture with Locomotion for Conversational Characters


In this paper, we consider the problem of creating rich and varied conversational behaviors for data-driven animation of walking and jogging characters.

August 31, 2016
Computer Animation and Virtual Worlds 2016


Authors

Yingying Wang (University of California, Davis)

Kerstin Ruhland (Trinity College Dublin)

Rachel McDonnell (Trinity College Dublin)

Michael Neff (University of California, Davis)

Carol O’Sullivan (Disney Research/Trinity College Dublin)


Abstract

Communicative behaviors are an essential aspect of human interaction and deserve special attention when simulating groups and crowds of virtual pedestrians. Previous approaches have tended to focus on generating believable gestures for individual characters and on talker-listener behaviors for static groups. In this paper, we consider the problem of creating rich and varied conversational behaviors for data-driven animation of walking and jogging characters. We captured ground-truth data of participants conversing in pairs while walking and jogging. Our stylized splicing method takes as input a motion-captured standing gesture performance and a set of looped full-body locomotion clips. Guided by metrics derived from the ground-truth data, we perform stylized splicing and synchronization of gesture with locomotion to produce natural conversations between characters in motion.
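To make the splicing step concrete, below is a minimal sketch in Python, assuming per-joint quaternion clips and a hypothetical 20-joint rig. It illustrates only the generic upper-body/lower-body splice with a soft seam blend; the joint indices, array shapes, and function names are assumptions for illustration, not the paper's stylized splicing method.

import numpy as np

# Hypothetical joint split for a 20-joint rig (illustrative indices,
# not the paper's skeleton): lower-body joints follow the locomotion
# clip while upper-body joints carry the standing gesture performance.
LOWER_BODY = list(range(0, 8))    # pelvis, hips, legs, feet
UPPER_BODY = list(range(8, 20))   # spine, arms, hands, head

def splice_gesture_onto_locomotion(gesture, locomotion, blend=0.3):
    """Splice a standing gesture clip onto a looped locomotion clip.

    gesture, locomotion: (T, J, 4) arrays of per-joint unit quaternions.
    Lower-body joints come from the locomotion, upper-body joints from
    the gesture, and the lowest spine joint is soft-blended (normalized
    lerp; quaternion double-cover sign handling omitted for brevity) so
    the torso inherits some locomotion sway and the seam is less visible.
    """
    T = min(len(gesture), len(locomotion))
    out = locomotion[:T].copy()
    out[:, UPPER_BODY] = gesture[:T, UPPER_BODY]

    seam = UPPER_BODY[0]  # lowest spine joint bridging the two sources
    q = (1.0 - blend) * gesture[:T, seam] + blend * locomotion[:T, seam]
    out[:, seam] = q / np.linalg.norm(q, axis=-1, keepdims=True)
    return out

A full system would additionally retime gesture strokes against the stride phase and adapt gesture posture to the gait, which is where the ground-truth metrics described in the abstract come in.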
