Unesco adopts global standards on ‘wild west’ field of neurotechnology


It is the latest move in a growing international effort to put guardrails around a burgeoning frontier – technologies that harness data from the brain and nervous system.

Unesco has adopted a set of global standards on the ethics of neurotechnology, a field that has been described as “a bit of a wild west”.

“There is no control,” said Unesco’s chief of bioethics, Dafna Feinholz. “We have to inform the people about the risks, the potential benefits, the alternatives, so that people have the possibility to say ‘I accept, or I don’t accept’.”

She said the new standards were driven by two recent developments in neurotechnology: artificial intelligence (AI), which offers vast possibilities in decoding brain data, and the proliferation of consumer-grade neurotech devices such as earbuds that claim to read brain activity and glasses that track eye movements.

The standards define a new category of data, “neural data”, and suggest guidelines governing its protection. A list of more than 100 recommendations ranges from rights-based concerns to addressing scenarios that are – at least for now – science fiction, such as companies using neurotechnology to subliminally market to people during their dreams.

“Neurotechnology has the potential to define the next frontier of human progress, but it is not without risks,” said Unesco’s director general, Audrey Azoulay. The new standards would “enshrine the inviolability of the human mind”, she said.

Billions of dollars have poured into neurotech ventures in the past few years, from Sam Altman’s August investment in Merge Labs, a competitor to Elon Musk’s Neuralink, to Meta’s recent unveiling of a wristband that lets users control their phone or AI Ray-Ban glasses by reading muscle movements in the wrist.

The wave of investment has brought with it a growing push for regulation. The World Economic Forum released a paper last month calling for a privacy-oriented framework, and the US senator Chuck Schumer introduced the Mind Act in September – following the lead of four states that have introduced laws to protect “neural data” since 2024.

Advocates for neurotech regulation emphasise the importance of safeguarding personal data. Unesco’s standards highlight the need for “mental privacy” and “freedom of thought”.

Sceptics, however, say legislative efforts are often driven by dystopian anxieties and risk hampering vital medical advances.

“What’s happening with all this legislation is fear. People are afraid of what this technology is capable of. The idea of neurotech reading people’s minds is scary,” said Kristen Mathews, a lawyer who works on mental privacy issues at the US law firm Cooley.

From a technical perspective, neurotechnology has been around for more than 100 years. The electroencephalogram (EEG) was invented in 1924, and the first brain-computer interfaces were developed in the 1970s. The latest wave of investment, however, is driven by advances in AI that make it possible to decode large amounts of data – including, possibly, brainwaves.

“The thing that has enabled this technology to present perceived privacy issues is the introduction of AI,” said Mathews.

Some AI-enabled neurotech advances could be medically transformative, helping treat conditions from Parkinson’s disease to amyotrophic lateral sclerosis (ALS).

A paper published in Nature this summer described an AI-powered brain-computer interface decoding the speech of a paralysed patient. Other work suggests AI may one day be able to “read” your thoughts – or at least reconstruct an image you are concentrating on.

The hype around some of these advances has generated fears that Mathews said were often far removed from the real dangers. The Mind Act, for example, says AI and the “vertical corporate integration” of neurotechnology could lead to “cognitive manipulation” and “erosion of personal autonomy”.

“I’m not aware of any company that’s doing any of this stuff. It’s not going to happen. Maybe two decades from now,” she said.

The current frontier of neurotechnology lies in improving brain-computer interfaces, which despite recent breakthroughs are in their infancy – and in the proliferation of consumer-oriented devices, which Mathews said could raise privacy concerns, a central preoccupation of the Unesco standards. She argues, however, that creating the concept of “neural data” is too broad an approach to this issue.

“That’s the type of thing that we would want to address. Monetising, behavioural advertising, using neural data. But the laws that are out there, they’re not getting at the stuff we’re worried about. They’re more amorphous.”
