
We Still Have Time to Protect the Last Frontier of Privacy


This time last year, I was all set to buy the Apple Vision Pro. I’ve been a long-time Apple devotee—their design, UX, and product polish never fail to pull me in. I became slightly obsessed with the Vision Pro and was desperate to pre-order for the initial launch.

But the same week I was about to hit “buy,” I started reading The Battle for Your Brain by Nita A. Farahany. That changed everything.

Farahany’s book digs deep into neurotechnology—brain-sensing wearables, cognitive-enhancement implants, and tools that peer inside your mind—and what it means for our most intimate freedom: mental privacy. Within a few pages, my enthusiasm for the Apple Vision Pro had cooled. I paused my purchase. I realized I knew very little about neurotechnology—and I needed to know more.

What Is Neurotechnology?

Neurotechnology refers to a range of tools and systems that interact directly with the brain or nervous system to monitor, decode, influence, or enhance neural activity. It includes both invasive and non-invasive methods and combines neuroscience, engineering, AI, and medicine.

Key examples include:

  • Brain-Computer Interfaces (BCIs): Allow direct communication between the brain and external devices, e.g., to control prosthetics or computers via brain signals
  • Neuroimaging: Visualizes brain activity using techniques like fMRI or PET scans
  • Neural Recording Devices: Track electrical activity from neurons or brain regions in real time
  • Neural Decoding: Uses machine learning to interpret neural signals and predict cognitive states or behaviors (a rough sketch of the idea appears just below this list)
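
To make that last point concrete, here is a minimal, purely illustrative sketch (in Python, with NumPy and scikit-learn) of neural decoding as a machine-learning problem: classify short windows of synthetic, EEG-like signals into two invented cognitive states. The signals, features, and state labels are all made up for the example and do not describe any real device or dataset.

    # Toy neural decoding: everything here (signals, "states", features) is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    FS, N_CHANNELS, N_SAMPLES = 256, 8, 256           # 1-second windows at 256 Hz

    def simulate_trials(n_trials, alpha_amplitude):
        """Fake multi-channel recordings: white noise plus a 10 Hz rhythm
        whose amplitude differs between the two simulated mental states."""
        t = np.arange(N_SAMPLES) / FS
        rhythm = alpha_amplitude * np.sin(2 * np.pi * 10 * t)
        noise = rng.normal(size=(n_trials, N_CHANNELS, N_SAMPLES))
        return noise + rhythm                          # rhythm broadcasts over trials and channels

    def features(trials):
        """Crude per-channel signal power, one feature vector per trial."""
        return (trials ** 2).mean(axis=2)

    # Simulated state 0 ("relaxed") has a stronger 10 Hz rhythm than state 1 ("focused").
    X = np.vstack([features(simulate_trials(200, alpha_amplitude=1.5)),
                   features(simulate_trials(200, alpha_amplitude=0.5))])
    y = np.array([0] * 200 + [1] * 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    decoder = LogisticRegression().fit(X_tr, y_tr)     # the "decoder" is just a classifier
    print("held-out decoding accuracy:", decoder.score(X_te, y_te))

The point is simply that decoding reduces to pattern recognition over recorded brain signals; real systems differ mainly in how rich the signals, features, and models are.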

Whilst this technology is immensely beneficial for so many purposes, its potential for ethical compromise seems clear to me: privacy, consent, cognitive liberty, inequality, mental autonomy, and the risk of exploitation by bad actors are all deeply implicated.

So, Is the Apple Vision Pro a Threat?

It would seem not. The Apple Vision Pro relies on external physiological signals—eye tracking, hand gestures, and voice—to let you navigate virtual environments. It interprets observable behaviour, not internal mental states. Okay. So far, so good.

The Real Red Flag: Decoding the Mind Itself

Now contrast that with Meta’s experimental Brain2Qwerty BCI. It records brain activity non-invasively, using magnetoencephalography (MEG) and EEG, and then uses machine learning to decode the sentences a person is typing, character by character, from neural signals alone: not the words you speak aloud, but the language you intend to produce.

That’s a giant leap from tracking your eye movements.

Yes, it’s life-changing for individuals who are non-verbal or paralyzed. But it also raises alarm bells: where is the line between the language I intend to produce and my thoughts themselves? Could this technology eventually read my inner thoughts, ideas, and feelings without my consent?

The Limits—For Now

Fortunately, we aren’t quite there yet, at least with Meta’s Brain2Qwerty and similar BCIs.

These applications are limited by our current understanding of the brain. They’re good at interpreting motor intentions—like planning to speak or move—because those signals are relatively well-understood. But they struggle with abstract thinking, silent reading, emotional processing, or spontaneous ideas.

Also, these systems must be trained on each individual to recognize that person’s unique neural patterns, a calibration process that requires the person’s active cooperation. Decoding someone’s brain activity without their consent is therefore not currently feasible.
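
For what it’s worth, a toy illustration of why that per-person calibration matters is sketched below, again in Python with entirely synthetic data and invented “subjects” (nothing here models any real system): a classifier fitted to one simulated subject’s idiosyncratic response pattern scores at roughly chance on a second simulated subject whose pattern is different.

    # Toy illustration of subject-specific calibration; all data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    N_CHANNELS = 16

    def subject_data(pattern, n_trials=300):
        """Per-trial channel features: noise plus (state label * that subject's pattern)."""
        states = rng.integers(0, 2, size=n_trials)            # the hidden "thought" label
        X = rng.normal(size=(n_trials, N_CHANNELS)) + np.outer(states, pattern)
        return X, states

    pattern_1 = rng.normal(size=N_CHANNELS)                   # subject 1's neural signature
    raw = rng.normal(size=N_CHANNELS)
    # Subject 2's signature is made orthogonal to subject 1's, so the two share nothing.
    pattern_2 = raw - (raw @ pattern_1) / (pattern_1 @ pattern_1) * pattern_1

    X1_train, y1_train = subject_data(pattern_1)
    X1_new, y1_new = subject_data(pattern_1)                  # a fresh session, same person
    X2, y2 = subject_data(pattern_2)                          # a different person, no calibration

    decoder = LogisticRegression().fit(X1_train, y1_train)    # "calibrated" on subject 1 only
    print("subject 1, new session   :", decoder.score(X1_new, y1_new))  # high: patterns match
    print("subject 2, no calibration:", decoder.score(X2, y2))          # near 0.5, i.e. chance

Real decoders face the same problem in far messier form, which is why today’s systems are calibrated on recordings from the very person they are meant to decode.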

Still, the technology is developing fast—and we’re getting close enough to raise serious concerns about mental privacy, consent, and autonomy.

From Healing to Exploitation

To be clear, neurotech has the potential to transform healthcare and quality of life for many, but just as quickly as these innovations offer healing, they open the door to exploitation.

As Farahany warns, even purpose-driven or mission-led innovations can quickly become commodified.

Take Multimer’s MindRider—an EEG helmet initially created to keep cyclists safe by monitoring mental states like stress and focus. But that same brain data was later repurposed to create attention maps for advertisers, pinpointing where people were most alert and where to place ads for maximum impact.

What began as safety turned into monetization. Sound familiar?

We saw the same arc with social media. Platforms that promised connection became data-harvesting engines for targeted advertising—often harming users’ mental health in the process. So why should neurotechnology be any different?

The Rise of Neuromarketing

Anyone who’s worked in advertising knows the goal is to connect emotionally with the audience. That requires understanding how people think, feel, and decide.

In the 1990s, neuroscientist Antonio Damasio challenged the traditional Cartesian mind-body split in his work Descartes’ Error, arguing that emotion is integral to reason—a theory that strongly influenced the field of neuromarketing.

Since then, marketers have played with fMRI, EEG, eye tracking, and biometric sensors to measure emotional response to advertising and brands. With global e-commerce valued at $18.77 trillion in 2024 and projected to hit $75 trillion within a decade, interest in surveillance capitalism (a term coined by Shoshana Zuboff) is only growing.

Neurotech is shaping up to be the next gold mine. 

Who’s Regulating This? (Spoiler: Almost No One)

The neurotechnology sector is advancing outside the bounds of traditional medical regulation.

  • Consumer-focused neurotech isn’t subject to HIPAA or medical-grade oversight.
  • Companies can collect and use neural data—some of our most intimate biometric information—with few legal constraints.
  • Platforms like OpenBCI allow virtually anyone to access brain data, with minimal accountability.

Meanwhile, tech giants like Meta, Apple, and Neuralink are developing neurotech faster than policy can keep up.

Why We Need to Defend Cognitive Liberty

In The Battle for Your Brain, Farahany introduces the idea of cognitive liberty: the right to think freely, without intrusion, surveillance, or manipulation. She describes our brains as the final frontier of privacy.

We can no longer rely solely on “the experts.” Government bodies are struggling. Industry self-regulation has proven ineffective: Meta’s track record with consumer data and harm prevention speaks volumes, and the recent privacy gaffe with its stand-alone AI app is further evidence of carelessness with user data.

Some countries, like Chile, are taking cognitive rights seriously. They’ve pioneered legal frameworks to enshrine neurorights in their constitution. For most of us, however, proper regulation is still a distant prospect.

What Can We Do?

To protect cognitive liberty, the most powerful thing we can do as citizens, consumers, and individuals is to educate ourselves.

Yes, we must advocate for global standards that treat brain data as seriously as genetic or biometric data. But we also have ever-improving tools, like generative AI, that can help us stay informed and empowered.

An added bonus is that understanding neurotechnology isn’t just important; it’s fascinating.

We’re not yet living in a dystopia where machines read our thoughts, but we’re closer than we think. The future of neurotechnology holds incredible promise, but it must be built on a foundation of rights, responsibility, and respect for cognitive freedom. Each of us needs to play a part in making that happen.

Alexandra Frye
The Digital Ethos Group

Alexandra Frye edits the Technology & Society blog, where she brings philosophy into conversations about tech and AI. With a background in advertising and a master’s in philosophy focused on tech ethics, she now works as a responsible AI consultant and advocate.
