- Brain-scanning devices catch state lawmakers’ attention
- Colorado, Minnesota proposals would limit data’s use
State legislatures appear to be the first US testing grounds for limits protecting individuals' most private neural activity from misuse or manipulation as neurotechnology rapidly advances into consumer products.
Lawmakers in at least two states—Colorado and Minnesota—are advancing the consumer data privacy debate into largely uncharted territory, introducing bills that would establish rights and protections for information collected from electrical neural signals.
Consumer neurotechnology ranges from wearable devices for sleep monitoring to brain-computer interfaces that enable paralyzed individuals to move robotic tools using their minds. But the tech isn't subject to federal regulations outside of the medical context, and privacy advocates warn that the regulatory gap could have severe consequences if companies are allowed to deploy products without guidelines.
The state bills offer an early glimpse at which novel uses most concern lawmakers. If either proposal is enacted, other states may follow suit.
“The big deal here is that states are getting involved now and that gives our companies a small window to start developing their industry standards before the regulators and legislators get involved,” said Sara Pullen Guercio, a health-technology privacy associate at Alston & Bird LLP.
States “watch each other, and I would not be surprised if there were additional laws that come online,” Guercio said while cautioning that both proposals are still far from being enshrined as law and could change in substance before they’re enacted, or end up scuttled.
Legislation the Colorado House passed on Feb. 9 would amend that state’s comprehensive privacy law to address neurotechnology. A measure introduced in Minnesota’s legislature—which doesn’t yet have a broad consumer privacy law—would create a standalone statute focused solely on the brain-scanning technology.
Legislators in California—which, like Colorado and 11 other states, has a comprehensive privacy law—are in early conversations about neurotechnology regulations with the Neurorights Foundation, the nonprofit that informed Colorado’s bill, according to founder Rafael Yuste, also a neuroscience professor at Columbia University.
The ultimate goal, though, is federal action, said Jared Genser, Neurorights Foundation outside general counsel and managing director at Perseus Strategies. The foundation helped to push Chile in 2021 to become the first country in the world to enshrine mental privacy and free will in its national constitution, Yuste said.
Data collected by medical neural devices used for research and diagnosis, such as electroencephalograms, or EEGs, is protected by the federal Health Insurance Portability and Accountability Act. The US Food and Drug Administration regulates surgical implants like those offered by Elon Musk's Neuralink Corp., which recently claimed it implanted its first brain-computer device in a human subject.
“Consumer protections under US law are limited, so one of the things that we think is going to be very, very important as states start to take action on this is, frankly, for the US Congress to start a look at these issues as well,” Genser said. Despite differences between the state privacy laws, success in Colorado could serve as a model for others, he said.
States’ Neural Approaches
Colorado last week became the first state to advance a neuroprivacy bill out of a legislative chamber with its proposal to amend the Colorado Privacy Act. Though it boasted bipartisan sponsors in the state Senate, complications might slow its momentum in the chamber, said Jameson Spivack, a senior policy analyst studying immersive technologies at the Future of Privacy Forum.
“Lawmakers are going to need to figure out how the biological and neural data bill interacts with both the underlying CPA and its regulations, as well as any other potential biometric bills that might pass this session,” Spivack said.
The bill Colorado House members passed covers devices that record or alter electrical signals from “an individual’s central or peripheral nervous system” and warns that even if neural data collection is consented to, consumers are “unlikely to be fully aware of the content or quantity of information they are sharing.”
Minnesota's standalone bill, introduced in both legislative chambers, includes comparatively more prescriptive provisions, establishing a right to "cognitive liberty" and creating civil and criminal penalties for accessing brain data without an individual's consent or to influence someone's decision-making.
“It is a lot easier to get controls and limitations in place before a technology is widely adopted, rather than trying to do so after a technology is widely adopted and in use, so my attempt with this legislation was again to try and get ahead of that,” said state Sen. Eric Lucero (R), who is championing the Minnesota bill and works as an information security consultant at Advantage Professional Networks.
Lucero first introduced the proposal in 2020, and it’s since made little progress. But he foresees brighter prospects once the state legislature’s session resumes this week, with a planned hearing on the bill and fellow lawmakers “waking up” to issues surrounding emerging technology driven by concerns with deepfakes and artificial intelligence.
“It also serves as a deterrent—there are a few things that corporate American institutions generally try to evade more than being sued and having to pay out damages unnecessarily,” said state Rep. Walter Hudson (R), who sponsored the bill in the Minnesota House.
Outside of the US, the implications of neurotechnology have also caught lawmakers’ attention. Since Chile enacted Neurorights Foundation-backed mental protections, the organization has worked on similar proposals with countries including Brazil, Mexico, and Spain, as well as the United Nations.
Cybersecurity breaches affecting neural data or “malicious hacking” of neural devices are a concern, according to a 2021 report produced by the UN’s International Bioethics Committee, which called for member states to adopt neurotechnology regulations, particularly for uses outside the medical context.
Genser of the Neurorights Foundation said his next step is to introduce the organization’s mission of protecting brain data to more US lawmakers this summer at the National Conference of State Legislatures.
“What industry or any science developer and practitioner collecting brain data can learn is that these conversations are becoming more frequent and complicated,” said Karen Rommelfanger, founder of the think tank Institute of Neuroethics and a professor at Emory University.
“The global trend we’ll likely see is that brain-related data will be considered a special category or sensitive data,” she said.