‘Our notion of privacy will be useless’: what happens if technology learns to read our minds?

The skull acts as a shield of privacy, says Tom Oxley, a neurosurgeon from Australia; the brain is the last private part of ourselves.
Oxley is the CEO of Synchron, a Melbourne-based neurotech company that has successfully trialled hi-tech brain implants that allow people to send texts and emails purely by thought.

Synchron became the first company, ahead of rivals such as Elon Musk's Neuralink, to receive approval from the US Food and Drug Administration (FDA) to conduct clinical trials of brain-computer interfaces (BCIs) in the US.

Synchron has successfully implanted electrodes into the brains of paralysed patients via their blood vessels. The electrodes record brain activity and transmit the data wirelessly to a computer, which interprets it and allows the patients to compose texts and emails by thought alone.

BCIs allow a person with disability to control a device via a connection between their brain and a computer.

No one can see inside your brain, Oxley says; only our bodies and mouths can reveal what is in there, which is a terrible situation for people unable to use theirs. His company, he says, is trying to help people get out what is inside their skulls, and is focused solely on medical problems.

BCIs are just one of many technologies being developed to treat brain disorders. Brain stimulation, which delivers targeted electrical pulses to the brain, is used to treat cognitive disorders; imaging techniques such as EEG and fMRI allow brain activity to be monitored in real time.

David Grant, a senior researcher at the University of Melbourne, believes neurotechnology has the potential to make our lives better, but warns that the level of intrusion required to realise those benefits is profound.

Grant's concerns about neurotech do not relate to the work of companies such as Synchron; in his view, using the technology to correct medical conditions for people with cognitive and sensory impairments is uncontroversial.

But what happens, he asks, when these capabilities move out of medicine and into the commercial world? Grant sees a grim scenario: a gradual and unstoppable decline in our ability to control our own brains.

It is a hypothetical progression, but a plausible one, and some countries are already taking steps to protect people from the possibility.

A new type of right

It was a young European bioethicist, Marcello Ienca, who anticipated these dangers. In 2017 he proposed a new category of legal rights: neuro rights, the freedom to decide who is allowed to monitor, read or alter your brain.

Ienca, a professor of bioethics at ETH Zurich in Switzerland, advises governments, the UN, the OECD and the European Council on the impact technology may have on our understanding of what it means to be human.

Even before he proposed neuro rights, Ienca was convinced that advances in neurotechnology were putting our brains at risk.

Back in 2015, Ienca says, discussion of neurotechnology law was dominated by criminal law.

While much of that discussion was theoretical, BCIs were already being medically trialled. Six years ago, the questions Ienca was being asked ran along the lines of: what happens if the device malfunctions? Who is responsible? Can neurotechnology be used as evidence in court?

Ienca, then in his mid-20s, believed more fundamental issues were at stake: technology capable of decoding and altering brain activity had the potential to change what it means to be a person.

Protecting humanity from the misuse of neurotech is a serious concern, Ienca says, but neuro rights are not only defensive: they are also about empowering people through advances in neuroscience and neurotechnology.

'I recognise that the brain is a very private place ... This technology will change that' – Tom Oxley

Neuro rights, Ienca says, are a positive as well as a protective force.

It is a view Tom Oxley shares. He believes that halting the development and use of BCIs would unfairly infringe the rights of the people his company is trying to help.

"Is the ability to send a text message an expression of a right?" he asks. If the answer is yes, he suggests, the right to use a BCI might be considered a digital right.

Oxley agrees that the future privacy of our brains is worthy of the world's attention, and he believes neuro rights are crucial.

"I recognise that the brain is a very private place," says Oxley. "We used to have our skulls protecting it. This technology will change that."

Grant, however, believes neuro rights won't be enough to protect privacy from neurotech applied outside medicine.

Our current concept of privacy, he says, will be useless in the face of such profound intrusion.

Commercial neurotech is already on the market: headsets that claim to improve concentration are used in Chinese classrooms, and caps that monitor fatigue in truck drivers have been used on Australian mine sites. These devices harvest data from their users' brain activity. Where and how that data is stored, Grant says, is difficult to track, and even more difficult to control.

Grant considers the sheer volume of information people share, which will come to include neurodata, an insurmountable obstacle for neuro rights.

To think we can solve this problem by passing legislation, he says, is foolish.

Grant's own solutions to the intrusive potential of neurotech are, he admits, radical. He envisions personal algorithms that act as highly specialised firewalls, communicating with the digital world on a person's behalf and protecting their brain from intrusion or alteration.

Many ethicists are concerned about the consequences of sharing neurodata.

"Brains are at the heart of everything we do, think and say," says Stephen Rainey, of Oxford's Uehiro Centre for Practical Ethics.

"It's not that you end up in a dystopia where someone controls your brain and makes you do things. There are boring dystopias. Look at the companies that are interested in personal data, especially Google and Facebook. They're trying to make a model of what a person is so that it can be exploited."

Regulating

Chile is not taking any chances with neurotechnology.

In September 2021, Chilean lawmakers approved a constitutional amendment, the first of its kind, enshrining the right to mental integrity for all citizens. Bills regulating neurotechnology, digital platforms and the use of AI are also being drawn up in the Chilean senate. They will draw on the neuro rights principles of cognitive liberty, mental privacy and psychological continuity.

Europe is making progress towards neuro rights.

France approved a bioethics law this year that protects the right to mental integrity. Spain is working on a digital rights bill that includes a section on neuro rights, and the Italian Data Protection Authority is examining whether mental privacy falls within the country's privacy rights.

Australia is a signatory to the OECD's 2019 recommendation on responsible innovation in neurotechnology.

Promises, panic, and possible risks

Australian neuroscientist and ethicist Assoc Prof Adrian Carter, of Monash University in Melbourne, is described by peers as having a good "BS detector" for the real and imagined dangers posed by neurotech. A self-described "speculative ethicist", he examines the possible consequences of technological advances.

'It would be foolish to assume that authoritarian governments wouldn't be interested' – Stephen Rainey

Hyping neuro-treatments, he explains, can reduce their effectiveness; hype can also trigger unwarranted panic.

Carter says much of what is being discussed is still a long way off.

Mind-reading? It's not going to happen, he says, at least not in the way many people imagine: the brain is too complex. Take brain-computer interfaces: people can control devices with their thoughts, but they must first train the technology to recognise specific patterns of their brain activity. They don't simply think "open the door" and it happens.

Carter points out, however, that many of the threats attributed to future neurotechnology already exist in the way tech companies use data every day.

Artificial intelligence and algorithms that track eye movements and detect temperature changes are already reading the outputs of brain activity outside controlled studies. Commercial interests have used this data for years to analyse, predict and nudge behaviour.

Companies such as Amazon, Google and Facebook have made billions from [personal data], Carter points out.

The dystopian scenarios triggered by data collected without consent aren't always as boring and predictable as Facebook ads.

Stephen Rainey, of Oxford's Uehiro Centre for Practical Ethics, points to the Cambridge Analytica scandal, in which data from 87 million Facebook users was harvested without their consent. The company built psychological profiles of voters, based on people's likes, to inform the political campaigns of Donald Trump and Ted Cruz.

The point at which data becomes commercially valuable and people want to do other things with it, Rainey says, is where the real risk comes in.

It would bring the entire data economy, already in trouble, into the neuro space, he says. The potential for misuse is there. It would be foolish to assume that authoritarian governments wouldn't be interested.

Tom Oxley does not dismiss the possibility that bad actors could misuse BCI research like that being done by him and his colleagues.

He points out that Synchron's initial funding came from the US military, which was seeking to develop robotic arms and legs for wounded soldiers, controlled via chips implanted in their brains.

While there is no indication the US intends to weaponise the technology, Oxley says its military background is impossible to ignore: if BCIs were ever weaponised, they would create a direct brain connection to a weapon.

The US government appears alive to this potential. Last month the US Bureau of Industry and Security published a memo on the possibility of restricting exports of BCI technology from the US. While recognising its medical and entertainment uses, the bureau was concerned the technology could be used by militaries to enhance the capabilities of human soldiers and in unmanned military operations.

It could be life-changing

The potential for rogue actors to misuse neurotech does not diminish what the technology has already achieved in the medical field.

Prof Kate Hoy, of the Epworth Centre for Innovation in Mental Health at Monash University, oversees trials of neuro-treatments for brain disorders including treatment-resistant depression, obsessive-compulsive disorder, schizophrenia and Alzheimer's.

One treatment being evaluated is transcranial magnetic stimulation (TMS), which is already widely used to treat depression and was added to the Medicare benefits schedule last year.

One of TMS's great assets is that it is non-invasive; people can have their treatment in their lunch hour and go back to work, Hoy says.

"We place a figure-of-eight coil, which you can hold in your hand, over the area we want to stimulate, and then we send pulses into the brain, which induces an electrical current and causes neurons to fire."

"When we move [the pulse] to the areas of the brain known to be involved in depression, the aim is to improve brain function."

TMS does not cause the side effects, such as fatigue and memory loss, that are common with some other brain stimulation methods, and Hoy says it has even been shown to improve cognition in some patients.

Zia Liddell, 26, had low expectations when she began TMS treatment at the Epworth Centre. Liddell has trauma-induced schizophrenia and has experienced hallucinations since she was about 14.

"My journey has taken me from being in psych wards, to taking all kinds of antipsychotics, and finally to this path of neurodiverse tech," she says.

Liddell says she wasn't particularly invested in TMS, until it worked.

'TMS saved my life and gave me a chance to make a living. The future of TMS is my future' – Zia Liddell

She describes the sensation of TMS as a gentle flick on the back of your head, slow and repetitive.

Liddell is admitted to hospital for treatment roughly twice a year, usually staying for two weeks. The TMS sessions last 20 minutes and are delivered while she sits in a chair, watching TV or listening to music.

She vividly recalls the moment she realised it was working. "I woke up and the world was silent. I ran outside in my pajamas, into the courtyard, to call my mum. All I could say through tears was: 'I can hear the birds, Mum.'"

Liddell describes it as a quietening of the mind that arrives around the three-to-five-day mark of a two-week treatment.

"One day I'll wake up and everything will be still. I won't be distracted; I'll be able to concentrate. TMS saved my life and gave me a chance to make a living. The future of TMS is my future."

The changes TMS has brought to her life have not left her ignorant of the dangers neurotech could unleash on the wider world.

There is an important conversation to be had, she says, about where the lines of consent are drawn.

Altering the chemistry of someone else's brain, she says, could have a life-changing effect: you are altering the fabric of their personality.