As the technology becomes more viable, we must assess the ethics of mind-reading
Mind-reading technology is rapidly moving from the realm of science fiction to reality. But as companies race to cash in on the market, concerns about the ethical implications of such developments remain
For decades, scientists have worked to decode the secrets of the brain. Whether through tracking the peaks and troughs of electrical activity via electroencephalography or examining the structure of the brain with computerised tomography and MRI scans, neuroimaging has helped us glean a greater understanding of the inner workings of the mind. Now, thanks to the rapid development of AI and machine learning, we are closer than ever to unlocking new methods of communication with so-called ‘mind-reading’ technology.
Through brain-computer interfaces (BCIs), scientists can ‘read’ an individual’s mind by opening a pathway for communication between the brain and a computer. According to Ana Matran-Fernandez, an AI industry fellow at the University of Essex who studies BCI technology, this could be a significant catalyst for change in the lives of people who are currently unable to communicate.
By integrating BCI technology with smart home devices, for example, we could one day give those who are fully paralysed with locked-in syndrome a renewed sense of independence and freedom. But even in the face of these potentially life-changing benefits, the ethical implications of mind-reading technology are immense.
Getting in your head
Both public and private groups have thrown money at research projects in a bid to get the nascent mind-reading industry off the ground. In 2018 alone, the US Government invested $400m in the development of neurotechnology through its BRAIN Initiative, while in 2017 the US Defense Advanced Research Projects Agency (DARPA) funnelled $65m into the development of neural implants. Elsewhere, the European Commission has put €1bn ($1.1bn) towards a 10-year programme that aims to model the entire human brain, known as the Human Brain Project.
Companies are also working to push the limits of BCI technology. In a 2017 article published in the science journal Nature, a team of neuroscientists, ethicists and machine-intelligence engineers called the Morningside Group estimated that for-profit entities were already spending $100m per year on neurotechnology development. Allied Market Research, meanwhile, has predicted the global market for BCIs will reach as much as $1.46bn by 2020.
Facebook is one of the most prominent companies currently experimenting with mind-reading technology. At an annual developer conference in 2017, Regina Dugan, then head of Facebook’s division for new hardware projects, Building 8, said users would one day be able to type on the platform using only their minds.
“It sounds impossible, but it’s closer than you may realise,” Dugan said.
Mind-reading technology industry
$1.46bn: estimated global market value by 2020
$400m: invested by the US Government in 2018
$100m: invested by for-profit companies each year
Although Dugan left Facebook after just 18 months and Building 8 was dismantled at the end of 2018, Facebook founder Mark Zuckerberg recently reiterated his interest in BCI technology during an interview with Jonathan Zittrain, a professor of international law at Harvard Law School. Zuckerberg had previously called telepathy the “ultimate communication technology”. Other prominent names exploring BCIs include billionaire entrepreneur Elon Musk and Japanese carmaker Nissan. The former is working on a secretive project called Neuralink, which seeks to make humans “symbiotic” with AI through an implant in the brain, while the latter revealed plans for its ‘brain-to-vehicle’ technology in 2018.
Despite these thought-provoking plans, Matran-Fernandez pointed out a number of hurdles to the commercial development of BCIs, principally the cost. In fact, most of the research being conducted into BCIs is currently confined to laboratories because the necessary equipment is so expensive. But even if this technology were cheap enough for mass-market production, Matran-Fernandez believes the proprietary algorithms underlying such devices are still not up to scratch: “In practice, the headsets I’ve seen that will be cheap enough for anyone – I’m talking $100, $200 – that would be cheap enough to buy for personal use, they are not… good enough.” This is due, in part, to the fact that every brain is different, and the minds of those with conditions like locked-in syndrome have an added layer of complexity.
Matran-Fernandez told The New Economy: “I think the technology is getting there, and the algorithms are getting there. It’s just a matter of putting it all together and finding someone who’s willing to work on that and make it open-access.”
Brain trust
While the industry must overcome these practical hurdles to boost the widespread adoption of mind-reading technology, serious ethical concerns about the practice continue to threaten its viability. For now, BCI technology remains relatively rudimentary, and it could be decades before it becomes advanced enough to be used in everyday life. But the Morningside Group has argued that action should be taken now to ensure powerful corporations, hackers and governments cannot influence or exploit people through BCIs when the technology becomes more sophisticated.
The group warned: “[We] are on a path to a world in which it will be possible to decode people’s mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions and decisions, where individuals could communicate with others simply by thinking, and where… mental and physical abilities are greatly enhanced.” In such a scenario, the group believes the existing ethical guidelines on research and human testing – namely the Declaration of Helsinki and the Belmont Report – would be “insufficient”.
The matter of privacy and security is further complicated by the unique fragility of the human mind. In some cases, for example, people who received deep brain stimulation through electrode implants reported a loss of identity and agency afterwards. As researchers Katherine Pratt and Eran Klein pointed out in an article penned for The Conversation, neural data is unlike other forms of personal data because it is “intimately connected to the mind and who we take ourselves to be”.
Matran-Fernandez, however, is not particularly concerned about issues of privacy and identity just yet. As she explained, no mind-reading actually takes place; instead, the process is like standing outside the door of an auditorium while an orchestra is playing. In Matran-Fernandez’s analogy, the orchestra is the brain, with different instruments representing different neural activities. “From outside you can hear, in general, the music,” she said. “[But] it’s going to be distorted. You don’t hear it as if you’re sitting inside… You cannot distinguish the different instruments.”
It is similar for researchers using BCI technology. If two categories are created – ‘yes’ and ‘no’, for example – researchers can study the brain’s response to each option and distinguish between the two, but they cannot pinpoint what a participant is actually thinking about.
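To make that distinction concrete, the sketch below shows how such a two-class decoder might be trained. It is a minimal illustration only: the synthetic ‘EEG’ data, the mean-amplitude features and the scikit-learn pipeline are all assumptions made for the sake of the example, not tools named by the researchers. The point is that the classifier can only tell the two predefined categories apart; it reveals nothing else about what the participant is thinking.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for recorded EEG epochs: 200 trials, 8 channels,
# 128 samples per trial. A real system would band-pass filter and epoch
# the raw signal around each stimulus before this step.
n_trials, n_channels, n_samples = 200, 8, 128
labels = rng.integers(0, 2, n_trials)            # 0 = 'no', 1 = 'yes'
epochs = rng.normal(size=(n_trials, n_channels, n_samples))

# Inject a small average amplitude shift into the 'yes' trials so the
# two classes are separable, mimicking an event-related response.
epochs[labels == 1] += 0.15

# Feature extraction: mean amplitude per channel over each epoch.
features = epochs.mean(axis=2)

# Linear discriminant analysis is a common choice for two-class BCI decoding.
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

A real system would replace the synthetic arrays with filtered, epoched recordings, but the decoding step would still amount to separating a small set of predefined classes rather than reading free-form thoughts.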
Mind over matter
Although mind-reading technology prompts numerous ethical debates, there are promising practical uses for BCIs, especially in the healthcare sector, where the technology has already had a degree of success. A consortium of leading scientists called BrainGate, for instance, is using BCI technology to help people with neurological diseases, injuries or lost limbs regain movement and the ability to communicate.
In one of the group’s case studies, paralysed individuals with sensors implanted in their brains were able to type up to eight words per minute on smartphones and tablets. In another proof-of-concept demonstration, BrainGate showed how people with chronic tetraplegia resulting from spinal cord injuries could regain limb movements with the help of BCI technology.
A separate group of researchers at Columbia University, meanwhile, was recently able to construct synthetic speech using a device called a vocoder, which deciphered what people wanted to say from brain activity alone. With more research, it is hoped that BCI devices will allow people to control wheelchairs, robotic prosthetics and paralysed limbs, as well as enabling those with severe locked-in syndrome to communicate.
Today, the very first steps are being taken in these areas of research. As we develop a better understanding of the brain, there is great potential for further improvements in healthcare and medicine. While researchers must tread carefully through the ethical dilemmas these technologies pose, the adoption of mind-reading technology throughout society appears to be a question of when, not if.