News Release

Experts call for ethics rules to protect privacy, free will, as brain implants advance

Peer-Reviewed Publication

Columbia University

The convergence of artificial intelligence and brain-computer interfaces may soon restore sight to the blind, allow the paralyzed to move robotic limbs, and treat any number of brain and nervous system disorders.

But without regulation, this flurry of innovation spells trouble for humanity, warns a team of researchers led by Columbia University neuroscientist Rafael Yuste and University of Washington bioethicist Sara Goering. In a new essay in Nature, Yuste and Goering join more than two dozen physicians, ethicists, neuroscientists, and computer scientists in calling for ethical guidelines to cover the evolving use of computer hardware and software to enhance or restore human capabilities.

"We just want to ensure that this new technology, which is so exciting and which could revolutionize our lives, is used for the good of mankind," said Yuste, director of Columbia's Neurotechnology Center and a member of the Data Science Institute.

Long the stuff of science fiction, the melding of computers with the human mind to augment or restore brain function is moving closer to reality. The authors estimate that the for-profit brain implant industry is now worth $100 million, led by Bryan Johnson's startup Kernel and Elon Musk's Neuralink. Under President Obama's BRAIN Initiative alone, the U.S. government has spent another $500 million since 2013, they write.

As these investments bear fruit, the authors see four main threats: the loss of individual privacy, identity, and autonomy, and the potential for social inequalities to widen as corporations, governments, and hackers gain added power to exploit and manipulate people.

To protect privacy, the authors recommend that individuals be required to opt in, as organ donors do, before the brain data collected by their devices can be shared, and that the sale and commercial use of personal data be strictly regulated.

To protect autonomy and identity, the authors recommend that an international convention be created to define what actions would be prohibited, and to educate people about the possible effects on mood, personality and sense of self.

Finally, to address the potential for a brain-enhancement arms race pitting people with super-human intelligence and endurance against everyone else, they suggest creating culture-specific commissions to establish norms and regulations. They also recommend that military use of brain technologies be controlled, much as chemical and biological weapons are already under the Geneva Protocol.

In an earlier essay in the journal Cell, Yuste and Goering laid out similar arguments for integrating ethics into brain technologies, citing the 1970s Belmont Report which set ethical principles and guidelines for research involving human subjects.

###

Read the Nature essay: Four ethical priorities for neurotechnologies and AI

Scientist Contact: Rafael Yuste rmy5@columbia.edu

Media Contact: Kim Martineau klm32@columbia.edu, 646-717-0134

About Columbia University

Among the world's leading research universities, Columbia University in the City of New York continuously seeks to advance the frontiers of scholarship and foster a campus community deeply engaged in the complex issues of our time through teaching, research, patient care and public service. The University comprises 16 undergraduate, graduate and professional schools, four affiliated colleges and seminaries in Manhattan, and a wide array of research institutes and global centers around the world. More than 40,000 students, award-winning faculty and professional staff define the University's underlying values and commitment to pursuing new knowledge and educating informed, engaged citizens. Founded in 1754 as King's College, Columbia is the fifth-oldest institution of higher learning in the United States.
