North Carolina State University researchers have won a $1.2 million grant from the National Science Foundation to improve educational software by enabling it to assess facial expression, body language, speech and other cues to better respond to a student's emotional state during the learning process.
"Educational software can be a valuable tool, but so far these tools don't account for student emotion or affect," says Dr. Kristy Boyer, an assistant professor of computer science at NC State and co-primary investigator (PI) of the grant. "We're planning to develop and test techniques and technologies for incorporating affect and dialogue into educational games and other software."
The ultimate goal is to develop a software tool that supports the learning process by assessing a student's verbal and nonverbal cues and using that information to tailor its responses to each student.
The first step for the researchers will be to modify an existing game, Crystal Island, to incorporate spoken dialogue and affect sensors that track eye movement, facial expressions and posture. The researchers will then use the program in middle schools to collect preliminary data on how students interact with the program, both in terms of natural language (what the students say) and nonverbal cues (what the students do).
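To illustrate the general idea of affect-adaptive software described above, the sketch below shows how cues like gaze, facial expression and posture could feed a simple rule-based affect estimate that drives a tutorial response. This is a minimal, hypothetical illustration; all names, labels and rules are assumptions for clarity, not the researchers' actual models or the Crystal Island implementation.

```python
# Hypothetical sketch: mapping multimodal cues to an affect estimate
# and a tutorial response. Illustrative only -- not the actual system.

from dataclasses import dataclass


@dataclass
class CueObservation:
    gaze_on_task: bool   # eye tracker: is the student looking at the task?
    expression: str      # facial expression label, e.g. "frown", "neutral", "smile"
    posture: str         # posture label, e.g. "leaning_forward", "slumped"


def infer_affect(obs: CueObservation) -> str:
    """Toy rule-based affect classifier (assumed rules, for illustration)."""
    if obs.expression == "frown" and not obs.gaze_on_task:
        return "frustrated"
    if obs.posture == "slumped" and obs.expression == "neutral":
        return "bored"
    if obs.gaze_on_task and obs.posture == "leaning_forward":
        return "engaged"
    return "neutral"


def choose_response(affect: str) -> str:
    """Map the inferred affect to a dialogue move by the game."""
    responses = {
        "frustrated": "Offer a hint and encouragement.",
        "bored": "Introduce a more challenging sub-goal.",
        "engaged": "Stay out of the way; give brief positive feedback.",
        "neutral": "Continue the current dialogue plan.",
    }
    return responses[affect]


obs = CueObservation(gaze_on_task=False, expression="frown", posture="slumped")
affect = infer_affect(obs)
print(affect, "->", choose_response(affect))
```

In practice, the preliminary classroom data the researchers plan to collect would replace hand-written rules like these with models learned from how students actually speak and behave.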
"This preliminary data will serve as the basis for all of the subsequent modeling we do, as well as our development of techniques for how the game should respond to the student," says Dr. James Lester, a professor of computer science at NC State and PI of the grant.
The grant is for three years and focuses specifically on middle school science education, though the findings are expected to be broadly applicable to other subjects and age groups. Co-PIs on the project include Dr. Brad Mott, a senior research scientist in NC State's Department of Computer Science, and Dr. Eric Wiebe, a professor of science, technology, engineering and mathematics education at NC State.