In response to advances in neurotechnology that can read or alter brain activity, new human rights would protect people from having their brain data stolen, abused or hacked
New human rights that would protect people from having their thoughts and other brain information stolen, abused or hacked have been proposed by researchers.
The move is a response to the rapid advances being made with technologies that read or alter brain activity and which many expect to bring enormous benefits to people’s lives in the coming years.
Much of the technology has been developed for hospitals to diagnose or treat medical conditions, but some of the tools – such as brainwave monitoring devices that allow people to play video games with their minds, or brain stimulators that claim to boost mental performance – are finding their way into shops.
But these and other advances in neurotechnology raise fresh threats to privacy and personal freedom, according to Marcello Ienca, a neuroethicist at the University of Basel, and Roberto Andorno, a human rights lawyer at the University of Zurich. Writing in the journal Life Sciences, Society and Policy, the pair put forward four new human rights that are intended to preserve the brain as the last refuge for human privacy.
“The question we asked was whether our current human rights framework was well equipped to face this new trend in neurotechnology,” Ienca told the Guardian. Having reviewed the rights in place today, the pair concluded that more must be done to protect people.
“The information in our brains should be entitled to special protections in this era of ever-evolving technology,” Ienca said. “When that goes, everything goes.”
The suggested new rights assert what the researchers call cognitive liberty, mental privacy, mental integrity and psychological continuity. The first of these concerns a person’s freedom to use, or refuse to use, brain stimulation and other techniques to alter their mental state. If adopted, it could defend people against employers who decide their staff would be more effective if their brains were zapped with weak electrical currents. In November last year, US military scientists reported that a procedure called transcranial direct current stimulation (tDCS) boosted the mental skills of personnel. The devices are available on the open market, but there are concerns over their safety.
The right to mental privacy is intended to plug a gap in existing legal and technical safeguards, which do nothing to prevent someone from having their mind read without consent. While modern brain scanners cannot pluck thoughts from a person’s head at will, improvements in the technology are expected to reveal ever more precise information about people’s brain activity. In 2011, scientists led by Jack Gallant at the University of California, Berkeley, used brain scans to reconstruct clips of films people had watched beforehand.
Today, there are no firm rules on what brain information can be gathered from people and with whom it can be shared. What Ienca and Andorno fear is “the indiscriminate leakage of brain data across the infosphere”, as happens now with the personal information people share on social media such as Facebook and Twitter.
The third right, to “mental integrity”, aims to defend against hackers who seek to interfere with brain implants, either to take control of the devices people are connected to, or to feed spurious signals into victims’ brains. The fourth right, covering “psychological continuity”, would protect people from actions that could harm their sense of identity, or disrupt the sense of being the same person throughout their life.
The use of deep brain stimulation, in which people have electrodes implanted deep into their brains to control Parkinson’s symptoms and other conditions, has already raised concerns about its impact on patients’ personal identity, with some stating that they no longer feel like themselves after the surgery.
Ienca admits that it may seem a little early to worry about brain hackers stealing our thoughts, but he said it was usually more effective to introduce protections for people sooner rather than later. “We cannot afford to have a lag before security measures are implemented,” he said. “It’s always too early to assess a technology until it’s suddenly too late.”