News Release

Columbia University creates guidelines to ensure ethical development of neurotechnology

Columbia professor creates NeuroRights and a technocratic oath

Business Announcement

Data Science Institute at Columbia

Brain-computer interfaces may soon have the power to decode people's thoughts and interfere with their mental activity. Even now the interfaces, or BCIs, which link brains directly to digital networks, are helping brain-impaired patients and amputees perform simple motor tasks such as moving a cursor, controlling a motorized wheelchair, or directing a robotic arm. And noninvasive BCIs that can recognize the words we want to type and place them on screens are being developed.

But in the wrong hands, BCIs could be used to decode private thoughts, interfere with free will, and profoundly alter human nature.

To counter that possibility, Columbia University professor of biological sciences and Data Science Institute member Rafael Yuste founded the NeuroRights Initiative, which advocates for the responsible and ethical development of neurotechnology. The initiative puts forth ethical codes and human-rights directives designed to protect people from harm by ensuring the benign development of brain-computer interfaces and related neurotechnologies.

"This revolution in neurotechnology has enormous benefits for science, medicine, and society, but could also potentially enable us to decipher and manipulate mental activity," says Yuste, who is also co-director of Columbia's NeuroTechnology Center. "As has happened before, novel technologies offer us powerful tools that can be used for good or for bad. It is up to us as a society to act to ensure that neurotechnologies are used for the benefit of humanity, not to its detriment."

Governments, militaries, and tech companies are already investing billions of dollars in neurotechnologies that could decipher or augment the activity of human brains, Yuste says. These technologies can be implanted in a person's brain as circuits and electrodes, or can link the brain noninvasively to the internet through wearable devices.

To safeguard the development of such devices, Yuste and members of the NeuroTechnology Center proposed five NeuroRights to help policymakers, technologists, and scientists regulate neurotechnologies. More recently, the team drafted a technocratic oath, inspired by the Hippocratic oath, that it hopes will be adopted by engineers, scientists, and entrepreneurs working on neurotechnologies.

The five NeuroRights are:

The Right to Personal Identity: Boundaries must be developed to prohibit technology from disrupting the sense of self or blurring the line between a person's internal processing and external technological inputs;

The Right to Free Will: People should control their own decision making, without manipulation from external neurotechnologies;

The Right to Mental Privacy: Data obtained from measuring neural activity should be kept private, and the sale, commercial transfer, and use of neural data should be strictly regulated;

The Right to Equal Access to Mental Augmentation: Global guidelines should be established to ensure fair and just access to mental-enhancement neurotechnologies, which could otherwise deepen societal and economic disparities; and

The Right to Protection from Algorithmic Bias: Countermeasures to combat bias should be the norm for machine learning when used within neurotechnology devices, and algorithm design should include input from user groups to counter bias.

In the 1940s, Columbia scientists helped split the atom and develop the atomic bomb, Yuste says, but it took more than a decade before United Nations officials created a commission to regulate nuclear armaments.

"Similarly, recent technologies based on artificial intelligence and algorithms were developed before we established guidelines to ensure their ethical use," he says. "But we can't afford to wait and be behind when it comes to creating ethics for neurotechnology, because this time we risk altering precisely those qualities, our personal identity, mental privacy, intellectual capacity and free will, that make us human."

###

