News Release

Aeroplanes would be safer if cockpits were more human-friendly, says new study

Peer-Reviewed Publication

Newcastle University

Aircraft could achieve an even higher level of safety if cockpit designers took the psychological characteristics of pilots more fully into account, according to researchers.

Although the air accident rate has been falling steadily over recent decades, many modern aircraft have computerised control systems so complex that they can overtax the mental capabilities of even fully trained pilots, say the researchers.

The team, from the University of Newcastle upon Tyne and the University of York, UK, report their findings in the January edition of the International Journal of Human-Computer Studies*.

They say that, during emergencies, pilots are overloaded with technical information which they are unable to process quickly enough. As a result, wrong decisions may be made at crucial moments, and these cannot be reversed.

Dr Denis Besnard, Dr Gordon Baxter and David Greathead analysed the disaster in which a British Midland aeroplane bound for Belfast crashed onto the M1 near the village of Kegworth in Leicestershire on January 8, 1989, killing 47.

The researchers, who carefully studied the Air Accidents Investigation Branch report, say the British Midland flight crew made crucial decisions based on an 'over-simplified picture of reality'. During the incident the flight crew reacted to a mechanical fault in the left engine by mistakenly shutting down the right engine. They believed they had made the correct decision because the shutdown coincided with a cessation of the vibrations and fumes from the left engine, and they therefore sought to make an emergency landing at East Midlands Airport.

During the approach, however, the crew lost power in the left engine and a fire started. The crew attempted to restart the right engine, but it was too late and the plane crash-landed half a mile from the runway.

Researcher Dr Denis Besnard, from Newcastle University's School of Computing Science, said: "Because there are so many parameters to supervise, operators controlling highly demanding processes, such as aircraft piloting, work on a simplified picture of reality. This picture, called a mental model, is very fragile and fallible.

"The pilots of the B737 were caught in what is known as a confirmation bias where, instead of looking for contradictory evidence, humans tend to overestimate consistent data.

"A potential consequence of the confirmation bias is that people overlook and sometimes unconsciously disregard data they cannot explain."

Historically, aircraft had cables and mechanical devices linking the cockpit with parts such as the flaps, slats, elevators and rudder, meaning the pilot was physically connected to the machine.

From the late 1980s, however, aircraft became 'fly-by-wire', meaning that pilots modify the configuration of the aircraft via electrical and hydraulic devices.

Dr Besnard said: "At the same time, onboard computers started to manage a number of functions and modes aiming at increasing safety and reducing workload. Unfortunately, the workload is still very high and the complexity of the cockpit has dramatically increased."

A Federal Aviation Administration report has noted that pilots have to train extensively partly because they must adapt themselves to cockpits, and states that the reverse should be the case: cockpits should be adapted to pilots.

Dr Besnard added: "Any situation, including time critical emergencies, should be processed within human information processing capacities. This is far from being the case at the moment."

He said onboard computers should be designed to anticipate problems and offer pilots both solutions and error recovery mechanisms, instead of the current situation where computers sometimes make unexpected decisions and provide raw information which the flight crew must then analyse.

The Newcastle and York research team is part of DIRC, a six-year, £8m project sponsored by the Engineering and Physical Sciences Research Council (EPSRC). The project is led by Newcastle University and is investigating the dependability of large computer-based systems.

###

* Besnard, D., Greathead, D. & Baxter, G. (2004). When mental models go wrong: co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128. Copies are available on request from Denis Besnard (see below).

CONTACT FOR FURTHER INFORMATION: Dr Denis Besnard 44-191-222-8058 denis.besnard@newcastle.ac.uk NB AVAILABILITY: 10am-2pm Tuesday January 6 and 10am-12pm Wednesday January 7.

Notes to editors:

1. Fifty researchers are taking part in the DIRC project, which is led by Newcastle University. They come from five universities: Newcastle, York, Edinburgh, Lancaster and City. Other projects have included designing and testing a computerised parcel tracking system, examining ways of profiling the skills of IT specialists, and providing advice for doctors caring for premature babies. http://www.dirc.org.uk/

ISSUED BY NEWCASTLE UNIVERSITY PRESS OFFICE. Tel.: 44-191-222-7850 Email: press.office@ncl.ac.uk Website: http://www.ncl.ac.uk/press.office
