AAAS panel explores the ethical implications of brain-machine interfaces.
The next generation of brain-machine interfaces (B-MIs) may rapidly enhance health and improve the quality of life for people with reduced function due to disease or disability. They may also allow people to control drones with their thoughts or even add new human senses, possibilities that raise important ethical considerations.
At the recent meeting of the American Association for the Advancement of Science in Boston, neuroscientists outlined several lines of promising B-MI research. Advances in microprocessors, computing, and materials science, for example, have facilitated the development of “epidermal electronics,” which combine wireless communications, neural sensors, and other medical sensors into patches small and flexible enough to serve as temporary tattoos. These electronics have obvious clinical uses, such as unobtrusive monitoring of vital signs or symptoms of brain disease, said principal investigator Todd P. Coleman of the University of California, San Diego. In fact, he has created a company, Neuroverse, to commercialize this type of application. But Coleman also sees more wide-ranging deployment in the near future. His work was partially inspired by previous experiments in which people controlled virtual or model airplanes via a cap of electrodes; flexible B-MIs might provide similar abilities without tying people down to bulky electronics. Applying the tattoos near the vocal cords might also allow for subvocal wireless communication with electronics such as smartphones. “The things you can pick up non-invasively are much richer than you might imagine at first glance,” he said. “Things we thought were hoaxes and science fiction are fast approaching fruition.”
And that may only be the tip of the iceberg: Miguel Nicolelis and his colleagues at Duke University have developed a means to create entirely new sense modalities. They connected infrared light sensors to dense three-dimensional arrays of electrodes implanted into the somatosensory cortex of rats. This allowed the rats to track food by “feeling” light that they physiologically have no way to detect. Think of it as an artificially induced form of synesthesia, Nicolelis said. “The rats learned to 'touch' a source of invisible light — they acquired a new modality of touch.” The researchers have already extended the research to monkeys, raising the possibility that people might eventually be able to “augment” themselves with new abilities using this technology. “When you deliver signals from devices directly to the brain,” Nicolelis said, “you can create a new sensation, a new feeling.”
The ethical implications of these B-MI projects and similar technology were not lost on session participants. All medical innovations raise legal and moral questions, said neuroethicist Martha Farah of the University of Pennsylvania. However, B-MI and other fields such as neuropsychiatry that directly affect people's abilities raise particularly difficult questions about what it means to be human and what kind of relationship people have with technology. It's difficult not to draw on iconic images of cyborgs from science fiction when discussing the long-term possibilities of B-MIs, which might include providing people new ways to sense the world, methods of augmenting cognition and memory, and even the ability to communicate brain-to-brain or merge identities.
“Ethical considerations such as those raised in the B-MI panel are important for all scientists,” said Michael Zigmond, professor of neurology at the University of Pittsburgh and secretary of the AAAS Neuroscience Section. “As scientific development and technological advances increasingly change the ways we deal with the human condition, we must continue to have conversations about how those changes might affect society. Such discussions are well-informed by the wide array of scientists who attend the AAAS annual meeting and can provide valuable insights and guidance.”
Additionally, Farah noted that focusing only on “sexy sci-fi long-term issues” ignores many serious short-term challenges more relevant to the day-to-day life of brain researchers and policymakers. “I'm not dismissing concerns about radically altered human brains that push us beyond what a human being is,” she said. “Before we get there, there are some other pretty serious ethical challenges — mundane, yet very important issues,” such as funding sources, conflicts of interest, and intellectual property protection. For example, clinical trial rules and practice might need rethinking: in the United States, medical devices are regulated differently than pharmaceuticals, even though B-MIs are increasingly serving as substitutes for drug-based testing and treatment. The sources funding current B-MI research may also have a disproportionate influence on the field, as aggressive pursuit of patents might constrain many promising avenues of research.
Between 10 and 30 years from now, people will need to make difficult decisions about access to B-MI technology, its appropriate uses, and its risks, Farah added. Cochlear implants, retinal implants, and similar devices are already used regularly, but deciding what level of impairment is appropriate for treatment is not easy — especially as temptations grow to use this technology for frank enhancement, or “making a person better than normal.” B-MIs that communicate wirelessly also expose people to hackers, computer viruses, and similar cybersecurity risks. “What if they hack into your brain?” Farah asked. With B-MIs, such interference could affect eyesight, memory, or even vital functions such as heart rate. People will also have to decide how to manage the costs of B-MI technology to ensure fair access. “Undoubtedly these technologies will be available to the rich before anyone else,” Farah said. “How would we like our society to manage these? How much do we guide the scientists and the health system to enforce as much equity as we can?”