Brain-computer interface (BCI) companies are moving forward with devices and services designed to read and interact with human neural pathways. Some medical-focused neurotechnology companies, like Synchron, use surgically implanted devices that interface with paralyzed patients’ brains to help them regain lost function. Other consumer-focused firms are using bulky helmets and relatively ordinary-looking smart headphones to detect their users’ brain signals.
Although technology like this is still relatively new, neural rights activists and concerned lawmakers want to be prepared for when it becomes more widespread. Critics caution that companies may already have the capability to “decode” the data captured in consumers’ brain scans and translate it into written text.
That decoded data can reveal highly sensitive details about an individual’s mental and physical well-being and cognitive state. Researchers have already demonstrated that AI models can read the brain data of patients watching videos and approximately recreate the scenes those patients saw. This decoding process could become much simpler, and far more precise, as even more powerful generative AI models are deployed.
There’s also little preventing current neurotechnology companies from misusing that data or selling it to the highest bidder. Nearly all (96%) of the neurotechnology companies analyzed in a recent report by the Neurorights Foundation appear to have access to consumers’ neural data, which can include signals from an individual’s brain or spine. The Foundation argues that few of those companies impose meaningful restrictions on access to that data. Two-thirds (66.7%) of the companies explicitly mention sharing consumers’ data with third parties.
A first-of-its-kind US law passed in Colorado this week could change that by offering stricter, consumer-focused protections for all neural data gathered by companies. The law, which gives consumers much greater control over how neurotechnology companies collect and share neural data, could add momentum to similar bills making their way through other state legislatures. Lawmakers, both in the US and abroad, are racing to establish meaningful standards around neural data before these technologies become widespread.
Keeping personal neural data private
The Colorado law, officially designated HB 24-1058, will broaden the term “sensitive data” in the Colorado Privacy Act to include neural data: information generated by the brain, the spine, or the network of nerves running through the body. Neurotechnology companies typically obtain this data through a wearable or implantable device, which can range from relatively typical-looking headphones to wires plugged directly into a patient’s central nervous system. The expanded definition will give neural data the same protections currently provided to fingerprints, facial scans, and other biometric data. As with biometric data, businesses will now need to obtain consent before collecting neural data and take steps to limit the amount of unnecessary information they gather.
Under the law, Coloradans will have the right to access, correct, and manage their neural data, and to opt out of its sale. The bill’s authors argue these rules matter because much neural data is likely gathered unintentionally or unnecessarily through neurotechnology services. According to the Neurorights Foundation report, only 16 of the 30 companies surveyed said consumers can opt out of their data being used under certain conditions.
The Colorado bill notes that collecting neural data almost inevitably involves disclosing information unintentionally: even when individuals consent to the collection and use of their data for a specific purpose, they likely do not fully understand what information they are sharing.
Supporters of stricter protections for neural data, like Neurorights Foundation Medical Director Sean Pauzauskie, praised Colorado’s action in a recent interview with The New York Times.
“We’ve never seen anything with this power before,” Pauzauskie said, “to identify, categorize, and discriminate against people based on their brain waves and other neural information.”
Who else protects neural data?
Colorado’s law could become a model for other states. The US currently has no federal law limiting how companies can access or use neural data, though similar bills are under consideration in Minnesota and California. The California legislation is especially relevant because many major brain-computer interface companies, including Neuralink and Meta, are based there. Other countries have already moved ahead of the US in regulating neural data. In 2021, Chile became the first country to legally protect neural rights by enshrining them in its national constitution. Brazil, Spain, Mexico, and Uruguay have since passed laws of their own.
All of this simmering regulatory interest may seem unusual for an industry that still appears relatively nascent. BCI users likely won’t be telepathically messaging their thoughts to friends any time soon, and medical applications for paralyzed or otherwise injured people remain limited to a select few. But supporters of these early neural regulations hope the preemptive efforts can help set standards and steer the growing neurotechnology industry toward a more privacy-conscious future. And if recent debates over social media regulation are any guide, it’s often easier said than done to retroactively apply new rules to products and services once they’ve become staples of modern life. When it comes to dystopian-tinged mind-reading tech, Pandora’s Box is still mostly closed, but it’s beginning to crack open.