Researchers at the University of Illinois at Urbana-Champaign have tested the hands-free approach and found that drivers -- young and old -- struggled to see dangerous scenarios appearing in front of them.
The experiments, reported in the Fall 2004 issue of the journal Human Factors, were conducted in a virtual reality suite at the Beckman Institute for Advanced Science and Technology. Eye-tracking techniques allowed researchers to see the effects of distractions.
"With younger adults, everything got worse," said Arthur F. Kramer, a professor of psychology. "What we found was that both young adults and older adults tended to show deficits in performance. They made more errors in detecting important changes and they took longer to react to the changes." The impaired reactions, he said, were "in terms of seconds, not just milliseconds, which means many yards in terms of stopping distances."
For the experiment, 14 young licensed drivers (mean age 21.4) with at least one year behind the wheel and 14 older, experienced drivers (mean age 68.4) actively engaged in a casual hands-free phone conversation. As they talked, they faced a flickering 6-foot-by-3.5-foot screen on which digitally manipulated images of Chicago traffic and architecture continually changed. Each flicker, which simulated eye movements, resulted in a change of scenery that might or might not be important to a driver: a child running into a driver's path, a simple change in a theater sign, or a bright or subtle change in color.
The older adults were able to detect changes related to salience, such as colors becoming brighter. However, their ability to detect changes that should be important to a driver dipped significantly.
"For the older adults, it was quite scary in that contextual constraints no longer drove their eye-scanning strategies," Kramer said. "When they were in a conversation on a cell phone, they were no longer any faster or any more accurate at detecting meaningful changes, such as a little girl running between cars in traffic, than they were at detecting changes that were not meaningful to driving safely."
Younger subjects did detect relevant changes more readily and with fewer errors than older adults, but their reaction times were slowed. "When you are driving, you often don't have extra seconds to react," Kramer said.
In another experiment, the researchers found no significant impairment among participants who simply listened on hands-free phones as others carried on a conversation. The subjects were 13 young adults (mean age 20.64) and 13 older adults (mean age 67.33).
Kramer theorized that the demands of comprehending and generating speech during a conversation interfere with the scanning of driving scenes. Comprehension alone, absent the need to generate coherent responses, requires fewer mental resources and therefore does not interfere with change detection in driving scenes.
Kramer's team is now conducting similar experiments in a driving simulator.
The six co-authors of the research were Kramer, Jason S. McCarley and David E. Irwin, all of the U. of I. psychology department; graduate students Margaret J. Vais and Heather Pringle (now on the faculty at the U.S. Air Force Academy in Colorado Springs, Colo.); and David L. Strayer, a professor of psychology at the University of Utah.
General Motors and the National Institute on Aging funded the research through a grant to Kramer. McCarley was supported by a Beckman Institute Postdoctoral Fellowship.