The wearable neurotechnology market targets consumers with promises of cognitive benefit and personal wellness. Scientific evidence is essential to substantiate claims about utility, safety, and efficacy, and to support informed choice and public trust.
Enhance your memory. Boost your mood. Lose weight more easily. Learn golf, CrossFit, or the clarinet faster.
Or even: Regain more independence, despite a neurodegenerative disease. Or start recognizing family members, despite dementia.
“For example: 82 years old lady started last week, after using Bellabee’s [headset] for only 3 weeks, once again recognizing her family members,” the makers wrote in an update shared on their Indiegogo page.
The claims come from companies in the booming business of direct-to-consumer brain wearables. Some of the devices deliver zaps of electricity to the brain, while others monitor brain activity and relay that information in real time in an app. They often look like high-tech headphones or sleek, stick-on devices. The wearables are touted as ways to curb stress, sleep better, boost creativity or athleticism, or even address serious medical conditions. But a growing number of experts are urging device makers to be more careful about those claims.
“Consumers and the public deserve to have messages be as clear as possible when it comes to their brains and brain data,” said Judy Illes, a neurologist and neuroethics professor at the University of British Columbia. Illes and her colleagues published a paper this month in Neuron examining the claims made about brain stimulation and recording devices sold directly to consumers.
Of the 41 direct-to-consumer devices they examined, 22 were recording devices that capture electroencephalograms, commonly called EEGs, which measure the brain’s electrical activity. The remaining 19 delivered what’s known as transcranial electrical stimulation.
They found the claims fit into four big buckets: health, wellness, enhancement, and practical applications such as promoting safety by alerting drivers when they’re nodding off. Claims about wellness and enhancement — like that a device can ease stress or improve concentration — were the most common.
But nine of the devices made claims about health conditions, including PTSD, depression, chronic pain, and ALS.
“They really are reaching to a population of people that may be vulnerable, such as people [who have] depression, anxiety, or attention disorders,” Illes said.
Under federal law, companies can’t claim that a product can treat, prevent, or cure a disease unless they have the evidence to back that up. But experts said some companies are just pushing the envelope with their claims. They might mention a condition in their marketing materials without explicitly saying their product can impact it.
“What you’re tending to see is a cavalier approach [to] how far they can make particular claims before they jump into a pool of transgression,” said James Giordano, a neurologist and neuroethicist at Georgetown University Medical Center. Giordano has consulted for two device makers, Thync and Halo.
“It’s the wink-wink, nudge-nudge marketing approach,” he added.
In recent years, the Federal Trade Commission has cracked down on “brain training” programs that claimed to improve brain functioning. In 2016, the developer of the popular Lumosity “brain training” games agreed to pay $2 million in refunds to settle federal charges that it deceived customers about the health benefits of its apps and online programming. But experts said they haven’t seen that same kind of action from regulators in the brain wearables business.
“There are more and more companies interested in this space, because there’s been little regulatory action,” said Anna Wexler, a University of Pennsylvania medical ethicist who has studied direct-to-consumer neurotechnology.
The devices carry some safety risks. Some users have reported skin burns from stimulation devices, but “nobody’s electrocuting themselves,” Wexler said. Still, other issues have Wexler and other experts worried, such as whether consumers are using the devices to treat mental health conditions or other problems that might warrant medical care.
“There’s these other kinds of harm: the opportunity cost, the misleading claims. I don’t think those should be ignored,” Wexler said.
In many cases, there isn’t strong evidence to support the claims. Marketing materials for just eight of the 41 devices that Illes and her colleagues examined included a link to a relevant, peer-reviewed research paper. Of the rest, 29 had links to general research or explanations of scientific concepts that weren’t specific to their products. Other companies linked to user testimonials or in-house research. That lack of clear, independent analysis can make it difficult for potential users to know exactly what they’re getting with a given device.
“These devices look scientific. They seem like they’re based on science. Consumers don’t really have a way of assessing the claims these devices are making,” Wexler said.
And the device websites often don’t carry warnings or clearly outline the caveats that come with the products. That’s where experts see room for improvement. Device makers can tell customers not to use their products in place of traditional medical care and make sure they’re aware that there is no long-term safety data on brain stimulation with DTC devices. Illes said it’s also critical for companies to be transparent about which of a product’s benefits are proven and which claims are merely “aspirational.”
“We feel there really is a place, given that we’re talking about brains, for companies to step up and be especially vigilant about the claims they’re making,” Illes said.