NSF Grant Funds Human Gesturing Research at UC Merced

October 8, 2009

MERCED — Human beings are complex creatures. That simple fact makes recreating human voice and action difficult in the virtual world. As realistic as computer graphics and animation can be, there is always an unnatural nuance or two preventing viewers or users from fully believing what they see.

Computer scientist Marcelo Kallmann and cognitive scientist Teenie Matlock at the University of California, Merced, have received about $500,000 from the National Science Foundation’s Human-Centered Computing Program to help close that gap.

The research project will develop new techniques for producing realistic, human-like gestures based on data collected from people in real life, with the particular goal of correctly parameterizing gestures with respect to objects and arbitrary locations in a virtual environment. The results could benefit anyone who works with animation or computer graphics, from game developers and filmmakers to creators of online tutorials.

“The first step for us is to understand how people use gestures to instruct and demonstrate objects and actions in real situations, which will then enable us to create a model to reproduce that interaction for generic situations in the virtual world,” Kallmann said.

Matlock said that part of the complexity in duplicating human action is that no two people move the same way.

“Gestures vary across situations and individuals,” she said. “Some may make a sweeping motion as they point to the door; others will look at the door as they point; a few will point at eye level; others will point above or below. Even the same individual may point in a variety of ways in response to repeated commands.”

What most excites Kallmann and Matlock, both founding faculty at UC Merced, about this grant is that it allows two graduate students from different disciplines to work together. David Huang (computer science) and Stephanie Huette (social and cognitive sciences) are scheduled to work full-time on the project for three years.

Matlock and Huette will design the human-subject experiments that analyze how people gesture. Based on their data, Kallmann and Huang will build computer models that parameterize and generalize those gestures to new situations in the virtual world.

“Having cognitive scientists take part in this research will help with the validation of our results,” Kallmann said. “They are the experts in how and why humans do what they do.”

By the conclusion of the three-year grant, Kallmann and Matlock hope to have created a database of parameterized gestures that graphics programmers and animators can use to simplify their work in a variety of applications.

“It is very costly for programmers working on design and animation to hand-code gestures, which is standard practice,” Matlock said.

“Also, gestures cannot be easily parameterized to different locations, for instance, so that a pointing motion can target an object that can be placed anywhere in an interactive application,” Kallmann added. “Our database will include computational tools to achieve such parameterizations, opening the way to achieve high-quality results in interactive training and education based on virtual environments.”
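The article does not describe the project's actual algorithms, but as a rough illustration of what it could mean to parameterize a pointing gesture to a target placed anywhere, consider the minimal Python sketch below. It retargets a recorded hand trajectory by rotating it about the shoulder so the gesture aims at a new location; the function names and the rotate-about-the-shoulder approach are illustrative assumptions, not the team's published method.

import numpy as np

def rotation_between(u, v):
    # Rotation matrix taking direction u to direction v (Rodrigues' formula).
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    axis = np.cross(u, v)
    s = np.linalg.norm(axis)   # sine of the angle between u and v
    c = np.dot(u, v)           # cosine of the angle
    if s < 1e-9:
        if c > 0:
            return np.eye(3)   # already aligned
        # Opposite directions: rotate 180 degrees about any perpendicular axis.
        k = np.cross(u, np.eye(3)[np.argmin(np.abs(u))])
        k /= np.linalg.norm(k)
        return 2.0 * np.outer(k, k) - np.eye(3)
    k = axis / s
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + s * K + (1 - c) * (K @ K)

def retarget_pointing(trajectory, shoulder, old_target, new_target):
    # Hypothetical helper: rotate a recorded hand path about the shoulder so a
    # gesture captured while pointing at old_target now points at new_target.
    R = rotation_between(old_target - shoulder, new_target - shoulder)
    return (trajectory - shoulder) @ R.T + shoulder

# Example: a reach recorded toward a door at (1, 0, 0) is re-parameterized to
# point at a window at (0, 1, 0.5). All coordinates are made up for the demo.
shoulder = np.array([0.0, 0.0, 1.5])
door = np.array([1.0, 0.0, 0.0])
window = np.array([0.0, 1.0, 0.5])
recorded = np.linspace(shoulder, door, 5)  # straight-line stand-in for mocap data
print(retarget_pointing(recorded, shoulder, door, window)[-1])
# The hand now ends on the shoulder-to-window ray.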

Together with colleagues from the School of Engineering, Kallmann and Matlock have previously received other grants for collaborative research, including two Major Research Instrumentation grants from the National Science Foundation in 2007 and 2008. One funded the visualization and motion-capture facility at UC Merced; the other paid for humanoid robots. In total, they have been awarded nearly $1.25 million, funding that is contributing to the school’s establishment of the Center on Autonomous and Interactive Systems.