News Release

Can eyes on self-driving cars reduce accidents?

Cues from moving eyes could help pedestrians anticipate a vehicle’s intentions

Reports and Proceedings

University of Tokyo

The Gazing Car

image: The cart was fitted with robotic eyes that could be moved in any direction, controlled by one of the research team. The windshield was covered to give the impression that there was no driver inside.

Credit: Chang et al. 2022

Robotic eyes on autonomous vehicles could improve pedestrian safety, according to a new study at the University of Tokyo. Participants played out scenarios in virtual reality (VR) and had to decide whether to cross a road in front of a moving vehicle or not. When that vehicle was fitted with robotic eyes, which either looked at the pedestrian (registering their presence) or away (not registering them), the participants were able to make safer or more efficient choices.


Self-driving vehicles seem to be just around the corner. Whether they’ll be delivering packages, plowing fields or busing kids to school, a lot of research is underway to turn a once futuristic idea into reality.


While the main concern for many is the practical side of creating vehicles that can autonomously navigate the world, researchers at the University of Tokyo have turned their attention to a more “human” concern of self-driving technology. “There is not enough investigation into the interaction between self-driving cars and the people around them, such as pedestrians. So, we need more investigation and effort into such interaction to bring safety and assurance to society regarding self-driving cars,” said Professor Takeo Igarashi from the Graduate School of Information Science and Technology. 


One key difference with self-driving vehicles is that the driver becomes more of a passenger, so they may not be paying full attention to the road, or there may be nobody at the wheel at all. This makes it difficult for pedestrians to gauge whether a vehicle has registered their presence or not, as there might be no eye contact or indications from the people inside it.


So, how could pedestrians be made aware of when an autonomous vehicle has noticed them and is intending to stop? Like a character from the Pixar movie Cars, a self-driving golf cart was fitted with two large, remote-controlled robotic eyes. The researchers called it the “gazing car.” They wanted to test whether putting moving eyes on the cart would affect people’s risk-taking: in this case, whether people would still cross the road in front of a moving vehicle when in a hurry.


The team set up four scenarios, two where the cart had eyes and two without. The cart had either noticed the pedestrian and was intending to stop or had not noticed them and was going to keep driving. When the cart had eyes, the eyes would either be looking towards the pedestrian (going to stop) or looking away (not going to stop).


As it would obviously be dangerous to ask volunteers to decide whether or not to walk in front of a moving vehicle in real life (during the experiment, a hidden driver actually controlled the cart), the team recorded the scenarios with 360-degree video cameras, and the 18 participants (nine women and nine men, aged 18-49 years, all Japanese) played through the experiment in VR. They experienced the scenarios multiple times in random order and had three seconds each time to decide whether or not they would cross the road in front of the cart. The researchers recorded their choices and measured the error rates of their decisions, that is, how often they chose to stop when they could have crossed and how often they crossed when they should have waited.


“The results suggested a clear difference between genders, which was very surprising and unexpected,” said Project Lecturer Chia-Ming Chang, a member of the research team. “While other factors like age and background might have also influenced the participants’ reactions, we believe this is an important point, as it shows that different road users may have different behaviors and needs that require different ways of communicating in our future self-driving world.


“In this study, the male participants made many dangerous road-crossing decisions (i.e., choosing to cross when the car was not stopping), but these errors were reduced by the cart’s eye gaze. However, there was not much difference in safe situations for them (i.e., choosing to cross when the car was going to stop),” explained Chang. “On the other hand, the female participants made more inefficient decisions (i.e., choosing not to cross when the car was intending to stop) and these errors were reduced by the cart’s eye gaze. However, there was not much difference in unsafe situations for them.” Ultimately the experiment showed that the eyes resulted in a smoother or safer crossing for everyone.


But how did the eyes make the participants feel? Some thought they were cute, while others found them creepy or scary. Many male participants reported that the situation felt more dangerous when the eyes were looking away. Many female participants said they felt safer when the eyes looked at them. “We focused on the movement of the eyes but did not pay too much attention to their visual design in this particular study. We just built the simplest version to minimize the cost of design and construction because of budget constraints,” explained Igarashi. “In the future, it would be better to have a professional product designer find the best design, but it would probably still be difficult to satisfy everybody. I personally like it. It is kind of cute.”


The team recognizes that this study is limited by the small number of participants playing out just one scenario. It is also possible that people might make different choices in VR compared to real life. However, “Moving from manual driving to auto driving is a huge change. If eyes can actually contribute to safety and reduce traffic accidents, we should seriously consider adding them. In the future, we would like to develop automatic control of the robotic eyes connected to the self-driving AI (instead of being manually controlled), which could accommodate different situations,” said Igarashi. “I hope this research encourages other groups to try similar ideas, anything that facilitates better interaction between self-driving cars and pedestrians, which ultimately saves people’s lives.”



Paper Title: 

Chia-Ming Chang, Koki Toda, Xinyue Gui, Stela H. Seo, and Takeo Igarashi. 2022. Can Eyes on a Car Reduce Traffic Accidents? In 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’22), September 17–20, 2022, Seoul, South Korea. ACM, New York, NY, USA, 16 pages. 



This work was supported by JST CREST Grant Number JPMJCR17A1, Japan.

The self-driving golf cart was provided by Tier IV, Inc.


Useful Links:

Graduate School of Information Science and Technology:

Project page:

Research group: 

Project Video:


Research Contact:

Professor Takeo Igarashi

Department of Creative Informatics

Graduate School of Information Science and Technology, The University of Tokyo, 

7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan


Project Lecturer Chia-Ming Chang

Department of Creative Informatics

Graduate School of Information Science and Technology, The University of Tokyo, 

7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan


Press contact:
Mrs. Nicola Burghall
Public Relations Group, The University of Tokyo,
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654, Japan


About the University of Tokyo
The University of Tokyo is Japan's leading university and one of the world's top research universities. The vast research output of some 6,000 researchers is published in the world's top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at or follow us on Twitter at @UTokyo_News_en.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.