Tuesday, March 28, 2023

Neurotech’s Battles Impact Our Brains’ Future

Neurotechnologies today—devices that can measure and influence our brains and nervous systems—are growing in power and popularity. The neurotech marketplace, according to Precedence Research, is worth $14.3 billion this year and is projected to exceed $20 billion within four years. Noninvasive brain-computer interfaces, brain stimulation devices, and brain-monitoring hardware (measuring alertness and attention at work, for example) are no longer just laboratory experiments and technological curios. The societal and legal implications of widespread neurotech adoption may be substantial.

Nita Farahany, professor of law and philosophy at Duke University, has written a new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, which explores how our lives may be impacted by the use of brain-computer interfaces and neural monitoring devices.

Farahany argues that the development and use of neurotech presents a challenge to our current understanding of human rights. Devices designed to measure, record and influence our mental processes, used by us or on us, may infringe on our rights to mental privacy, freedom of thought, and mental self-determination. She calls this collection of freedoms the right to cognitive liberty. Spectrum spoke with Farahany recently about the future and present of neurotech and how to weigh its promises—enhanced capabilities, for instance, including bionics and prosthetics and even a third arm—against its potential to interfere with people’s mental sovereignty.

Author Nita Farahany. Photo: Merritt Chesson

IEEE Spectrum: Your book The Battle for Your Brain defines cognitive liberty as the rights to mental privacy, freedom of thought, and self-determination. Please tell us more about that.

Nita Farahany: The umbrella right, the right to cognitive liberty, is the right to self-determination over our brains and mental experiences. The ways I see that right intersecting with our existing human rights are those three that you listed. The right to mental privacy, which covers all of our mental and affective functions; the right to freedom of thought, which I think relates to complex thoughts and visual imagery, the things we think of as “thinking”; and self-determination, which is really the positive side of cognitive liberty. Mental privacy and freedom of thought are the rights against interference with our brains and mental experiences, while self-determination is the right to access information about our own brains, the right to make changes, and to be able to define for ourselves what we want our brains and mental experiences to be like.

Much of your book is forward-looking, considering what current brain-computer interface technologies are capable of today and how people, businesses, and governments are using them. What current BCI capabilities, in your opinion, run counter to the right to cognitive liberty?

Farahany: I think there are two ways to think about it: there’s what a technology can actually do, and there’s the technology’s chilling effect no matter what it can actually do. If you are some authoritarian regime and you are requiring people to wear brain sensors, even if the technology did nothing, using it at scale on people has a profoundly chilling effect.

But it does do something, and the something that it does is enough to also cause real harm and danger by digging into the mental privacy of individuals, particularly when it’s used to probe information, and not just brain states. I think it’s dangerous enough when you’re trying to track attention, engagement, boredom, disgust, or simple emotional reactions. It’s even more dangerous when what you’re trying to do is use evoked potentials to understand biases and preferences.

What are some ways that people are currently using evoked brain potentials (a.k.a. event-related potentials, or ERPs)? What are the possible issues with those applications?

Farahany: This technique is being used pretty widely in neuromarketing already, and has been for a while. For marketers, it’s another technique: people’s self-reported preferences have long been understood to be inaccurate and don’t reflect their buying behaviors, so they use ERPs to try to decode emotional brain states like interest or attention when products are shown—this video elicited a weak response, while another elicited a stronger one, for example.
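
As a rough illustration of how such evoked responses are quantified, here is a minimal sketch of ERP averaging in Python with NumPy. The epoched data, sampling rate, and the 300 to 500 ms measurement window are all hypothetical stand-ins, not a description of any vendor's actual pipeline.

```python
import numpy as np

# Hypothetical epoched EEG: one channel, trials x samples, each epoch
# time-locked to stimulus onset. All values here are stand-in noise.
fs = 250                               # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1.0 / fs)    # epoch from -200 ms to +800 ms
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 5.0, size=(60, t.size))  # 60 trials, in microvolts

# Baseline correction: subtract each trial's mean pre-stimulus voltage.
epochs -= epochs[:, t < 0].mean(axis=1, keepdims=True)

# The ERP is the across-trial average: activity that is not time-locked
# to the stimulus cancels out, leaving the evoked response.
erp = epochs.mean(axis=0)

# Mean amplitude in a late window (~300-500 ms), a common crude index
# of attention or engagement; the window choice is illustrative.
window = (t >= 0.3) & (t <= 0.5)
print(f"Mean amplitude, 300-500 ms: {erp[window].mean():.2f} uV")
```

Comparing that windowed amplitude between two stimulus conditions, say two versions of an ad, is the basic move behind the neuromarketing uses described here.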

ERP techniques have also been used to try to infer people’s affinity with particular political viewpoints. Recording ERP signals from a person while presenting them with a series of statements and images about societal issues or political parties, researchers have tried to detect positive or negative responses and then predict the person’s political preferences or persuasions, or their likelihood of voting for a particular party or candidate, based on that information. That’s one of the potential uses and misuses, particularly when it’s done without consent, awareness, or transparency, or when the brain data is commodified.
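
To make the mechanics concrete, here is a minimal sketch, in Python with scikit-learn, of the pipeline shape such a prediction implies: classifying a binary preference label from ERP window features. The features, labels, and classifier choice are all illustrative assumptions, not a validated method from the research described.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical per-stimulus features: mean ERP amplitude in two time
# windows for 200 presentations, plus made-up 0/1 preference labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))      # [early window, late window] amplitudes
y = rng.integers(0, 2, size=200)   # 1 = favorable response (stand-in labels)

# A linear classifier mapping evoked-response features to a preference
# label; on random data like this, accuracy hovers around chance (~0.5).
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

The privacy concern Farahany raises is exactly this mapping: once features of an involuntary brain response can be tied to labels like “favors this candidate,” consent and transparency become the only safeguards.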

The same kind of signals are used in the criminal justice system through so-called brain fingerprinting technology. Scientifically, we should worry about the analytical validity of that quite a bit, but on top of concerns about validity we should also be deeply concerned about using interrogation techniques on a criminal defendant’s brain, as if that is a normalized or legitimate function of government, as if that is a permissible intrusion into their privacy. We should worry about whether people get it right, the pseudoscience of it, and then we should worry about the very fact that it is a technology that governments think is fine to use on human minds.

Your book describes different companies developing “lie detector” devices based on functional magnetic resonance imaging (fMRI) signals. That sounds a lot like a shinier version of a polygraph, which is pretty widely understood to be inaccurate.

Farahany: And yet they drive a lot of confessions! They drive a lot of fear. Polygraphs already have a chilling effect on people. They already lead to false confessions and increased anxiety, but much less so, I think, than putting sensors on a person’s head and saying “it doesn’t matter what you say, because your brain is going to reveal the truth anyway.” That’s the future that has arrived in the countries already using this technology.

You discuss companies like SmartCap, which makes an EEG wakefulness monitor and markets it to shipping companies as a means of avoiding accidents caused by sleep deprivation. At the corporate level, how else might employers or employees use neurotechnology?

Farahany: Fatigue management has become something used at a relatively wide scale across a number of companies internationally. When I presented this material at the World Economic Forum at Davos, a company representative came up to me after my talk to say, “We’re already using this technology. We plan on rolling it out at scale as one of the products we’re using.” I think in some ways, for things like fatigue monitoring and management, that’s not a bad use of it. If it improves safety, and the only data that’s used and extracted is quite limited, then I don’t find that to be a particularly troubling application. I worry when instead it’s used for productivity scoring or attention management, or it’s integrated into wellness programs where the data being collected is not being disclosed to employees or is being used to track people over time. We already talked about industries like neuromarketing, but other industries are already integrating this technology into their workplaces at scale to gather brain metrics, and those uses are growing.
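
To show how limited the data behind such a fatigue score can be, here is a minimal sketch in Python of one common heuristic, a theta/alpha band-power ratio computed with SciPy. This is an illustration only, not SmartCap's actual algorithm; the band edges and the alert threshold are assumptions.

```python
import numpy as np
from scipy.signal import welch

def fatigue_index(eeg, fs):
    """Crude drowsiness score: ratio of theta (4-8 Hz) to alpha (8-13 Hz)
    band power. Rising theta relative to alpha is a classic, if
    simplistic, EEG correlate of drowsiness."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2-second segments
    theta = psd[(freqs >= 4) & (freqs < 8)].sum()
    alpha = psd[(freqs >= 8) & (freqs < 13)].sum()
    return theta / alpha

# Hypothetical 30 s of single-channel EEG at 250 Hz (stand-in noise).
fs = 250.0
rng = np.random.default_rng(2)
signal = rng.normal(size=int(30 * fs))

score = fatigue_index(signal, fs)
# The alert/drowsy threshold below is an assumption, not a published value.
status = "may be drowsy" if score > 1.5 else "within normal range"
print(f"Fatigue index {score:.2f}: operator {status}")
```

The point relevant to Farahany's argument is what such a pipeline does not need: it reduces raw EEG to a single number about wakefulness, rather than probing thoughts or preferences, which is why she finds narrow fatigue monitoring less troubling than productivity or emotion tracking.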

Do you think an interest in preserving cognitive liberty conflicts with the better interests of society?

Farahany: There are some aspects of cognitive liberty which are based on absolute rights, like freedom of thought, which protects a narrow category of our cognitive and affective functioning. And there are some aspects of cognitive liberty, like mental privacy, which is a relative right, where societal interests can in some instances be strong enough to justify intervention by the state and to limit the amount of liberty that a person can exercise.

It’s not that I think they’re in conflict; I think it’s important to understand that individual liberties are always balanced against societal needs and interests. What I’m trying to do in the book is to show that cognitive liberty … isn’t always going to trump every interest that society has. There are going to be some instances in which we have to really find the right balance between the individual and the society at large.

Are current national and international rights frameworks and laws sufficient to protect cognitive liberty?

Farahany: I believe that the existing set of rights—privacy, freedom of thought and the collective right to self-determination—can be updated and expanded and interpreted. Human rights law is meant to evolve over time. …

A right, at the end of the day, has power on its own, but it’s really only as good as the enforcement of that right. What’s necessary is to enforce that right by looking at it in context-specific ways—in employment, in government use, in biometric use—and to understand what rules and regulations should be, how cognitive liberty translates into concrete rules and regulations worldwide.

