It felt like magic: As I moved my head and eyes across the computer screen, the cursor moved with me. My goal was to click on pictures of targets on the display. Once the cursor reached a target, I would blink, causing it to click on the target—as if it were reading my mind.
Of course, that’s essentially what was happening. The headband I was wearing picked up my brain, eye, and facial signals. That data was fed through AI software that translated it into commands for the cursor. This allowed me to control what was on the screen, even though I didn’t have a mouse or a trackpad. I didn’t need them. My mind was doing all of the work.
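For readers curious about what that translation step looks like in practice, here is a minimal, purely illustrative sketch of a sensor-to-cursor pipeline in Python. It is not AAVAA’s software: the feature extraction, the threshold-based “classifier,” and the DummyCursor class are hypothetical stand-ins for a trained model and a real input API.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a window of raw sensor samples (channels x samples)
    as per-channel mean power and peak amplitude."""
    power = np.mean(window ** 2, axis=1)
    peak = np.max(np.abs(window), axis=1)
    return np.concatenate([power, peak])

def classify(features: np.ndarray) -> str:
    """Stand-in for a trained model that maps features to a command.
    A real system would be trained on labeled recordings; this toy
    threshold rule exists purely for illustration."""
    if features.max() > 3.0:        # a large spike, e.g. a blink artifact
        return "blink_click"
    if features[0] > features[1]:   # toy asymmetry between two channels
        return "move_left"
    return "move_right"

class DummyCursor:
    """Placeholder for whatever input API the operating system exposes."""
    def move(self, dx: int, dy: int) -> None:
        print(f"move cursor by ({dx}, {dy})")

    def click(self) -> None:
        print("click")

def control_loop(stream, cursor) -> None:
    """Consume fixed-size signal windows and drive the cursor."""
    for window in stream:
        command = classify(extract_features(window))
        if command == "blink_click":
            cursor.click()
        elif command == "move_left":
            cursor.move(-5, 0)
        else:
            cursor.move(5, 0)

# Simulated stream: three windows of 4-channel, 250-sample noise.
rng = np.random.default_rng(0)
fake_stream = (rng.normal(size=(4, 250)) for _ in range(3))
control_loop(fake_stream, DummyCursor())
```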
“The brain, eye, and face are great generators of electricity,” Naeem Kemeilipoor, the founder of brain-computer interface (BCI) startup AAVAA, told The Daily Beast at the 2024 Consumer Electronics Show. “Our sensors pick up the signals, and using AI we can interpret them.”
The headband is just one of AAVAA’s products that promises to bring non-invasive BCIs to the consumer market. Their other devices include AR glasses, headphones, and earbuds that all essentially accomplish the same function: reading your brain and facial signals to allow you to control your devices.
While BCI technology has largely remained in the research labs of universities and medical institutions, startups like AAVAA are looking for ways to put it in the hands—or, rather, on the heads—of everyday people. These products go beyond what we typically expect of our smart devices, seamlessly integrating our brains with the technology around us. They also offer a lot of hope and promise for people with disabilities or limited mobility—allowing them to interact with and control their computers, smartphones, and even wheelchairs.
However, BCIs also blur the lines between the tech around us and our very minds. Though they can be helpful for people with disabilities, their widespread adoption raises questions and concerns about privacy, security, and even a user’s very personhood. Allowing a device to read our brain signals throws open the door to these ethical concerns, and as the devices steadily become more popular, the risks could grow with them.
Work All Day, Sleep All Night
BCIs loomed large throughout CES 2024—and for good reason. Beyond letting users control their devices, wearables that can read brain signals also promised to provide greater insights into users’ health, wellness, and productivity habits.
There were also a number of devices targeted at improving sleep quality, such as the Frenz Brainband. The headband measures users’ brainwaves, heart rate, and breathing (among other metrics) to provide AI-curated sounds and music to help them fall asleep.
“Every day is different and so every day your brain will be different,” a Frenz spokesperson told The Daily Beast. “Today, your brain might feel like white noise or nature sounds. Tomorrow, you might want binaural beats. Based on your brain’s reactions to your audio content, we know what’s best for you.”
To produce the sounds, the headband uses bone conduction, which converts audio data into vibrations on the skull that travel to the inner ear and produce sound. Though it was difficult to hear clearly on the crowded show floor of CES, the headband managed to produce soothing beats as I wore it in a demo.
“When you fall asleep, the audio automatically fades out,” the spokesperson said. “The headband keeps tracking all night, and if you wake up, you can press a button on the side to start the sounds to put you back to sleep.”
However, not all BCIs are quite as helpful as they might appear. Take the MW75 Neuro, a pair of headphones from Master & Dynamic that purports to read your brain’s electroencephalogram (EEG) signals to provide insights into your level of focus. If you become distracted or your focus wanes for whatever reason, the headphones alert you so you can maintain productivity.
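Master & Dynamic hasn’t published how its focus score is computed, but a common proxy in the EEG literature is the ratio of beta-band to alpha-band power. The sketch below shows that kind of calculation on simulated data; the band choices, sampling rate, and focus_score function are assumptions for illustration, not the headphones’ actual algorithm.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal: np.ndarray, fs: float, low: float, high: float) -> float:
    """Power of the signal within a frequency band, estimated via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=len(signal))
    mask = (freqs >= low) & (freqs <= high)
    return float(np.trapz(psd[mask], freqs[mask]))

def focus_score(eeg_window: np.ndarray, fs: float = 256.0) -> float:
    """Toy 'focus' proxy: beta-band (13-30 Hz) power divided by alpha-band
    (8-12 Hz) power. Higher beta relative to alpha is often associated with
    alert, task-engaged states; this is not the headphones' actual metric."""
    beta = band_power(eeg_window, fs, 13.0, 30.0)
    alpha = band_power(eeg_window, fs, 8.0, 12.0)
    return beta / (alpha + 1e-12)

# Two seconds of simulated, noisy EEG-like data at 256 Hz.
rng = np.random.default_rng(1)
window = rng.normal(size=512)
print(f"focus score: {focus_score(window):.2f}")
```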
Sure, this might seem helpful if you’re a student looking to squeeze in some more quality study time or a writer trying to hit a deadline on a story, but it’s also a stark and grim example of late-stage capitalism and a culture obsessed with work and productivity. While this technology is relatively new, it’s not difficult to imagine a future where these headphones are more commonplace and—potentially—required by workplaces.
An Ethical Minefield
When most people think about BCIs, they typically think of brain-chip startups like Synchron and Neuralink, both of which require surgery to implant their devices: Neuralink’s calls for invasive brain surgery, while Synchron’s is a minimally invasive procedure similar to having a stent placed in the chest. Non-invasive BCIs from the likes of AAVAA, on the other hand, require just a headband or a pair of headphones.
That’s what makes them so promising, Kemeilipoor explained. The technology would no longer be limited to the users who need it most, such as people with disabilities. Anyone can pop on the headband and start scrolling on their computer or turning their lamps and appliances on and off.
“It’s out of the box,” he explained. “We’ve done the training [for the BCI] and now it works. That’s the beauty of what we do. It works right out of the box—and it works for everyone.”
However, the fact that it can work for everyone is a top concern for ethics experts. Technology like this creates a minefield of potential privacy issues. After all, these companies could have unfettered access to data from our literal brains—information that can be bought, sold, and used against consumers in an unprecedented way.
One comprehensive review published in 2017 in the journal BMC Medical Ethics pointed out that privacy is a major concern for potential users for this reason. “BCI devices could reveal a variety of information, ranging from truthfulness, to psychological traits and mental states, to attitudes toward other people, creating potential issues such as workplace discrimination based on neural signals,” the authors wrote.
To his credit, Kemeilipoor was adamant that AAVAA does not and would not have access to individual users’ brain signal data. But the concerns remain, especially since there are notable examples of tech companies misusing user data. Facebook, for example, has been sued multiple times for millions of dollars for storing users’ biometric data without their knowledge or consent. (They’re certainly not the only company doing this, either.)
These issues aren’t going to go away—and they’ll only be exacerbated by the fusion of technology and the human brain. That fusion also raises concerns about personhood. At what point, exactly, does the human end and the computer begin, once you can control devices as naturally as an extension of yourself, like your arms or legs?
“The question—is it a tool or is it myself?—takes on an ethical valence when researchers ask whether BCI users will become ‘cyborgs,’” the authors wrote. They later added that some ethical experts worry that “being more robotic makes one less human.”
Yet, the benefits are undeniable—especially for those for whom BCIs could provide more autonomy and mobility. You’re no longer limited by what you can do with your hands. Now you can control the things around you simply by looking in a certain direction or moving your face in a specific way. It doesn’t matter if you’re in a wheelchair or completely paralyzed. Your mind is the limit.
“This type of technology is like the internet of humans,” Kemeilipoor said. “This is the Fitbit of the future. Not only are you able to monitor all your biometrics, it also allows you to control your devices—and it’s coming to market very soon.”
It’s promising. It’s scary. And it’s also inevitable. The biggest challenge we all face is making sure that—as these devices become more popular and we gradually give over our minds and bodies to technology—we don’t lose what makes us human in the first place.