When Stephanie Huette was working on her undergraduate degree at the University of Iowa, she couldn't have found Merced, Calif., on a map if she tried. She was, however, quite familiar with Michael Spivey through reading his book, "The Continuity of Mind."
Spivey, a professor of cognitive science, has a long history of studying language and visual perception. According to Huette, he also has a knack for communicating his research in a way that anybody can understand. That's no small feat when the subject is as complex as why humans do what they do.
"I knew I wanted to research language perception in grad school," Huette said. "When I found out what UC Merced offered in cognitive science research - and that Michael Spivey and Teenie Matlock were a part of it - I knew this is where I wanted to be."
Both Spivey and Matlock use eye-tracking and mouse-tracking equipment to study how humans perceive and respond to what they hear and see. Historically, one method of evaluating perception has been based on how subjects respond to survey questions. There is just one problem with the accuracy of such responses: humans lie.
Motion-tracking software and hardware document not only the subjects' final answers but also the answers they considered along the way. The result is a more accurate representation of how the human brain processes information.
"People can lie on a survey, but you can't deny where your hand or eye moves when your motion is tracked by sophisticated equipment," she said.
Now in her second year, Huette has found that UC Merced has delivered on its promise of a unique learning experience. Though she came here to research language perception specifically, she has had the opportunity to broaden the scope of her work.
A National Science Foundation grant awarded to Matlock and computer scientist Marcelo Kallmann will pay for Huette to study human gesturing for the next three years. The grant also funds a graduate researcher in computer science, David Huang of China.
"There is no other university that would provide the opportunity for a cognitive scientist to work with engineers," Huette said. "That's important for me as a graduate researcher. Engineers ask different questions and have access to different resources. This is a rare opportunity for me that will make me highly marketable when I finish my graduate work."
The team's goal is to develop new techniques for producing realistic, human-like gestures based on data collected from people in real life. In particular, the researchers want to correctly parameterize gestures with respect to objects and arbitrary locations in a virtual environment.
For their role in the project, Huette and Matlock will design and run experiments on how humans gesture. Kallmann and Huang will use that data to duplicate human movement in the virtual world.
It's definitely a group effort. The cognitive scientists have the burden of determining the right commands and tasks to elicit the most natural gestures from the study subjects, while the computer scientists must take that data and run it through an equation to duplicate human action in the virtual world.
"The idea is that we could make virtual agents point anywhere in a way that looks as fluid and natural as human movement," she said.